How to use Search Console bulk data export

Google Search Advocate Daniel Waisberg recently presented an in-depth video on Bulk Data Exports, a feature that allows you to export, store, and analyze data from Search Console.

This new solution exceeds the capabilities of the existing export methods and makes managing large volumes of data much easier.

Here’s how.

An overview of current data export solutions

Before introducing the bulk data export feature, Waisberg summarized the existing methods for exporting Search Console data.

The most accessible way is through the user interface. You can directly export up to 1000 rows of data with a simple click of the export button.

Looker Studio and the API provide solutions for people who need larger volumes of data. Both channels allow you to retrieve performance data, URL inspection data, sitemaps, and site data, with an export limit of up to 50,000 rows.

Introducing bulk data export

The final and most advanced method of exporting data from Search Console is bulk data export.

This unique feature allows you to extract large amounts of data using Google BigQuery without row limits. This is beneficial for large websites with numerous pages or extensive traffic.

Waisberg states, “A bulk data export is a scheduled daily export of Search Console performance data. It includes all the data used by Search Console to generate performance reports. The data is exported to Google BigQuery, where you can run SQL queries for advanced data analysis or even export to another system.”

Bulk data export settings

Given its complexity and power, bulk data export requires existing knowledge of Google Cloud Platform, BigQuery, and Search Console.

Note that using this tool may incur BigQuery charges, so it is important to review the possible costs before setting up a new export.

Setting up a bulk data export involves Google Cloud and Search Console.

First step: Google Cloud

First, switch to the relevant project in Google Cloud and ensure that the BigQuery API is enabled.

1. Open your Google Cloud console and switch to the project you’re exporting data to.

2. Go to APIs and Services > Enabled APIs and Services and enable the BigQuery API if it is not already enabled.

3. Go to IAM and Admin, click + GRANT ACCESS, and paste search-console-data-export@system.gserviceaccount.com into the New principals field.

4. Grant this account two roles: BigQuery Job User and BigQuery Data Editor, then click Save.
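If you prefer the command line, the same Google Cloud setup can be sketched with the gcloud CLI. This is a minimal sketch, not an official recipe: YOUR_PROJECT_ID is a placeholder, and the service account address comes from the steps above.

```shell
# Switch to the project you are exporting data to
# (YOUR_PROJECT_ID is a placeholder).
gcloud config set project YOUR_PROJECT_ID

# Enable the BigQuery API if it is not already enabled.
gcloud services enable bigquery.googleapis.com

# Grant the Search Console export service account the
# BigQuery Job User role...
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:search-console-data-export@system.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"

# ...and the BigQuery Data Editor role.
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:search-console-data-export@system.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"
```

Running these requires an authenticated gcloud session with permission to manage IAM on the project; the console UI steps above achieve the same result.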

Second step: Search Console

In Search Console, follow these steps:

1. Go to Settings > Bulk data export.

2. Enter your Google Cloud project ID in the Cloud project ID field.

3. Choose a dataset name. The default value is “searchconsole”.

4. Select a location for your dataset. This cannot be easily changed later.

5. Click Continue to start the exports.

The first export will occur up to 48 hours after successful configuration. After the tables are created, set a partition expiration if necessary, but avoid schema alterations. For historical data from before the initial setup, use the Search Console API or reports.
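Partition expiration can also be set from the command line with the bq tool. The sketch below assumes the default “searchconsole” dataset name and a table called searchdata_url_impression, one of the tables the export typically creates; adjust both to match your own setup.

```shell
# Set a partition expiration of about 480 days (given in seconds)
# on one of the export tables. The dataset and table names assume
# the default configuration -- verify yours before running.
bq update --time_partitioning_expiration 41472000 \
  YOUR_PROJECT_ID:searchconsole.searchdata_url_impression
```

Expired partitions are deleted automatically, which keeps storage costs from growing without bound.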

Supervision and management of data exports

The new data export system has a built-in feature that allows you to monitor data exports using BigQuery. For example, you can track exports using an export log table.
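As a sketch of what such monitoring might look like, the query below lists recent export runs with the bq command-line tool. The table name ExportLog and its column names are assumptions here; inspect the actual table in your dataset before relying on them.

```shell
# List the most recent export runs recorded in the export log.
# "ExportLog" and its columns are assumptions -- check your
# dataset's actual table name and schema first.
bq query --use_legacy_sql=false '
  SELECT *
  FROM `YOUR_PROJECT_ID.searchconsole.ExportLog`
  ORDER BY publish_time DESC
  LIMIT 10'
```

Note that, per Google’s guidance, the log records only successful exports, so a missing day is itself a signal worth investigating.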

Note that data will accumulate indefinitely unless you set an expiration time. The export process continues until you disable it manually or Search Console encounters a problem.

In the event of an error, Search Console will notify all owners.

To sum up

The bulk data export feature can improve the way you manage large amounts of Search Console data.

Stay tuned for upcoming content from Google that will delve into data processing after configuring the export and best practices for extracting data from BigQuery.

Source: YouTube

Featured image generated by the author via Midjourney.


About the Author: Ted Simmons

I follow and report on current news trends in Google News.
