Another deadline in the Google Analytics 4 migration project is fast approaching, and this one is a hard stop. On July 1, Google will delete all historical data from Universal Analytics properties. This deadline also applies to Analytics 360 customers.
With just over a month until the deadline, your organization needs to prioritize archiving its historical data if you haven't already done so. I recommend approaching this project in three main phases.
Phase 1: Make a plan
Before archiving data, it is important to decide:
What specific data is important to you?
Prioritize downloading data you refer to frequently, such as conversion and sales data. Make a complete list of the data you need to archive.
How many years of data do you want to keep?
Many of us have been using Google Analytics since the mid-2000s. Does your organization need to archive data from nearly 20 years ago? Decide how far back you want to archive. I recommend archiving back to at least 2018 or so to make sure you have pre-pandemic data, since the pandemic created real data anomalies for many companies.
How often do you review the data?
Consider how often you typically report on your data. Is it weekly? Monthly? Depending on the archiving method you choose in Phase 2, you may need to organize your data into specific time increments.
Phase 2: Choose an archive method
There are four main options available for archiving your Universal Analytics data. Each has its pros and cons, so choose a method based on your team's resources and abilities.
Option 1: Download files manually
Pros: Easy to do for almost all users, free
Cons: Time-consuming, cumbersome, hard to access the data for later reporting, limited to 5,000 rows per export
Although this is the easiest process to understand, it also takes time.
Following your plan for years, cadence, and data points, you'll need to go to each report in the Universal Analytics interface and set the date, dimension, and metric settings as needed.
Also, remember to change the number of rows from the default value of 10 to a maximum of 5,000 rows to ensure you capture as much data as possible.
Click the export button and export the data to Google Sheets, Excel, or CSV. Repeat this process until you have downloaded all the data identified in your archive plan.
Option 2: Download data to Google Sheets using the Google Analytics add-on (best option for tech newbies)
Pros: Fairly simple to implement for most users with spreadsheet experience, free and quick to download.
Cons: Must restrict each extract to a defined time period (e.g., monthly), each sheet has overall data limits, and sampling issues are common.
This option is fairly simple for most users. Create a new Google Sheet and add the Google Analytics Spreadsheet Add-on to it.
The add-on essentially uses the Google Analytics API to pull data into Google Sheets, but it doesn't require any API programming knowledge to work. Google has published a basic overview of this approach in a help document.
The first time you use the add-on, you'll create a report using the add-on's interface. But once the first report has run, you can also edit the Report Configuration tab and create additional reports directly in the columns of that sheet.
You can also conveniently use formulas in the Report Configuration sheet. Use the Dimensions & Metrics Explorer to find the appropriate API name to enter in each field.
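For example, a monthly sessions report might use configuration values along these lines. This is a hypothetical illustration: the field labels approximate the rows of the add-on's Report Configuration sheet, and the view ID is a placeholder.

```python
# Hypothetical Report Configuration values for the Google Analytics
# Spreadsheet Add-on, using API names from the Dimensions & Metrics Explorer.
report_config = {
    "Report Name": "Sessions by month",
    "View (Profile) ID / ids": "ga:12345678",  # placeholder view ID
    "Start Date": "2018-01-01",
    "End Date": "2023-06-30",
    "Metrics": "ga:sessions,ga:users",
    "Dimensions": "ga:yearMonth,ga:channelGrouping",
    "Limit": "",  # leave blank to avoid the 1,000-row default (see the tip below)
}
```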
A downside to the Google Sheets method is that you can encounter sampling if you request too much data at once (e.g., your entire 20-year data set for sessions) or your report is too detailed (too many dimensions combined at a high level of granularity).
When you run a report, you'll see the sampling level on the report's data tab in cell B6. If the report contains sampled data, consider reducing the amount of data in that particular extract; for example, you might split the extract into two time periods.
However, if you cannot avoid sampling, note the data sample percentage shown in the report. Then, on the Report Configuration tab, unhide rows 14-17 and set the sampling level in row 15 so that all of your data is sampled at a consistent level.
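If you end up splitting extracts by month to work around sampling, a small script can generate the date windows for you. This is a minimal sketch in Python, independent of any particular tool:

```python
from datetime import date, timedelta

def month_ranges(start: date, end: date):
    """Yield (first_day, last_day) date pairs for each month from start to end."""
    current = start.replace(day=1)
    while current <= end:
        # First day of the following month.
        nxt = (date(current.year + 1, 1, 1) if current.month == 12
               else date(current.year, current.month + 1, 1))
        yield max(current, start), min(nxt - timedelta(days=1), end)
        current = nxt

# Example: monthly extract windows for all of 2018.
for first, last in month_ranges(date(2018, 1, 1), date(2018, 12, 31)):
    print(first.isoformat(), last.isoformat())
```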
Tip: By default, the add-on limits each report to 1,000 rows of data. Simply delete the 1,000 next to the row labeled "Limit" (usually row 11).
Another downside to the Google Sheets option is that each file is limited to 10,000,000 cells. Typically, each sheet starts with 26 columns (A to Z) and 1,000 default rows (or 26,000 cells).
If your downloaded data exceeds the 10,000,000-cell limit (which is very likely to happen), you may need multiple Google Sheets files to download all the data.
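For example, at 10 columns per row, a single file tops out at about 1,000,000 rows (10,000,000 ÷ 10) before hitting that ceiling.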
Option 3: Download data using the Google Analytics API
Pros: Extract data quickly once configured
Cons: Requires web development knowledge and resources, does not solve the data sampling problem, subject to API quota limits
If you have web development resources that can work on the archive project, they can pull the detailed data in your plan using the Google Analytics API directly.
This works similarly to the Google Sheets add-on option mentioned above, but it is a more manual process: your developers script and schedule the API calls themselves.
For information on how to use the API for this project, visit Google's archive information page and review the second bullet, which details various resources and considerations for using the API for this data export project.
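As a rough sketch of what such a pull can look like, here is a minimal Python example using the Reporting API v4. The view ID and key file name are placeholders, and you would loop over the date ranges and reports defined in your archive plan:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file and view ID; substitute your own.
SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account-key.json", scopes=SCOPES)
analytics = build("analyticsreporting", "v4", credentials=creds)

# Request daily sessions for one year; repeat per year/report in your plan.
response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "12345678",
        "dateRanges": [{"startDate": "2018-01-01", "endDate": "2018-12-31"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:date"}],
        "pageSize": 10000,
    }]
}).execute()

for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"][0])
```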
Option 4: Download data to BigQuery (the best option overall)
Pros: Easy to access the data later for reporting, deeper insight into the data, more flexibility in working with it
Cons: Complicated for novices to set up initially, may involve BigQuery fees, may require technical resources to set up, requires involving an additional tool
The main advantage of storing your Universal Analytics data in BigQuery is that BigQuery is a data warehouse that allows you to ask questions about the dataset using SQL queries to get your data very quickly. This is especially useful for accessing this data for reporting later.
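For instance, once the data is in BigQuery, a question like "how many sessions did we get each day?" becomes a quick SQL query. This is a minimal sketch; the project, dataset, and table names are hypothetical and will depend on how your export is structured:

```python
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")  # placeholder project ID

# Hypothetical dataset and table names; adjust to match your export's schema.
query = """
    SELECT date, SUM(sessions) AS sessions
    FROM `your-project-id.ua_archive.ga_sessions`
    GROUP BY date
    ORDER BY date
"""
for row in client.query(query).result():
    print(row.date, row.sessions)
```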
Analytics 360 users
If you’re an Analytics 360 user, Google offers a native export to BigQuery. I recommend this method. See Google’s instructions.
All others
If you're not an Analytics 360 user, you'll need to approach the BigQuery backup differently, because Google doesn't provide a native BigQuery export option in Universal Analytics for non-360 users.
Here are the steps you’ll want to follow:
Step 1: Create a Google API Console project and enable BigQuery.
Log in to the Google APIs Console.
Create a Google API Console project.
Go to the APIs table.
Enable BigQuery.
Step 2: Prepare your project for BigQuery export.
Make sure billing is enabled for your project. You may not have to pay anything, but that will vary based on your usage and data. If prompted, create a billing account and accept the free trial if available.
Validate that billing is enabled: open your project at https://console.cloud.google.com/bigquery and try to create a dataset in the project. Click the blue arrow next to the project name, then click Create Dataset. If you can create the dataset, billing is set up correctly. If there's an error, make sure billing is turned on.
Add the service account to your project: add analytics-processing-dev@system.gserviceaccount.com as a member of the project and make sure the project-level permission is set to Editor (not BigQuery Data Editor). The Editor role is required to export Analytics data to BigQuery.
If you're in the EU, also review the additional requirements.
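If you'd rather validate billing from code than from the console, the BigQuery Python client can attempt the same dataset creation. This is a minimal sketch; the project and dataset IDs are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client(project="your-project-id")  # placeholder project ID
try:
    # Hypothetical dataset ID; creation fails if the project isn't set up.
    client.create_dataset("ua_archive", exists_ok=False)
    print("Dataset created; billing appears to be set up correctly.")
except Exception as err:
    print(f"Could not create dataset; check that billing is enabled: {err}")
```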
Step 3: Set up a free trial of Supermetrics. Similar to the Google Sheets add-on in option 2 above, Supermetrics is a tool that helps non-technical users interact with and use APIs. They offer a 14-day free trial, which is probably all you need for this project, since you’re only downloading Universal Analytics data once (not regularly). Connect your BigQuery data source to the Supermetrics dashboard.
Step 4: In BigQuery, connect to Supermetrics.
Go to BigQuery, then to Data Transfers.
Click + Create Transfer.
Select Supermetrics' Google Analytics connector as the source and click Enroll.
Fill in the transfer details. (See Supermetrics' detailed instructions on how to set up a transfer.)
Under Third-Party Connection, click Connect Source.
Accept the terms.
Click Authorize with your Google data source, then click Sign in with Google.
Sign in with the Google Account you use with this data source. (This does not have to be the same Google Account you use with Supermetrics.)
Click Allow.
Select the accounts you want to include in your reports and define the transfer settings.
Click Submit, then click Save.
Since you only need to transfer Universal Analytics data once, you can also change the transfer schedule to On Demand and then run the transfer now.
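If you prefer, an on-demand run can also be triggered programmatically with the BigQuery Data Transfer client. This is a minimal sketch; the transfer config resource name is a placeholder you would copy from the transfer's details page:

```python
from google.cloud import bigquery_datatransfer
from google.protobuf.timestamp_pb2 import Timestamp

client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder: copy the full resource name from your transfer's details page.
config_name = "projects/123456/locations/us/transferConfigs/your-config-id"

# Ask for a run starting now.
run_time = Timestamp()
run_time.GetCurrentTime()
response = client.start_manual_transfer_runs(
    request={"parent": config_name, "requested_run_time": run_time}
)
print(response)
```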
Phase 3: Make sure you’ve captured everything
Before you consider the project complete, be sure to check the archived data to make sure you’ve captured everything you planned to archive.
On July 1st, you will no longer be able to access Universal Analytics data, either through the API or the interface.