To export your SEO platform data into Google's BigQuery service, where you can run complex SQL queries on large datasets or use the data as the basis for a dashboard in Google Data Studio or another visualization tool, complete the following steps.
We will shortly be publishing a series of powerful Google Data Studio reports which use these BigQuery datasets to help you build and customize a beautiful set of SEO reports in no time at all. The first iteration of these reports will be published in early June 2021.
What types of SEO platform data can be exported into BigQuery?
We have recently added a new BigQuery integration option that allows you to export the following data from your platform projects into a single, dedicated BigQuery project, which you create by following the instructions below.
- Keyword Ranking Data (including tags)
- Share of Search
- Google Search Console
- Google Analytics
- Website crawl
- Backlink data
How often is the data updated in BigQuery?
Every time a new event is created on the platform, we sync that activity to BigQuery as soon as possible. This means that if you create Google Data Studio reports or other visualizations using the data exported from the platform, you can be confident the data is up to date.
Getting Started - How do I create a BigQuery project and Service Key?
N.B.: You only need to create one BigQuery project and Service Key!
We will send all your platform data to this single BigQuery project, but each platform project will get its own dataset, with identical table names across datasets, to make it easy to copy a single Google Data Studio report and connect it to the BigQuery data you want. Each platform project will have a dataset containing the following tables, as illustrated below.
The data structure in your BigQuery instance will be:
BigQuery Project > Dataset > Tables
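This Project > Dataset > Tables hierarchy maps directly onto the fully-qualified table names you use in BigQuery SQL (`project.dataset.table`). As a minimal sketch with hypothetical project, dataset, and table names (your actual names will depend on your platform projects):

```python
def table_ref(project, dataset, table):
    """Build the fully-qualified table name used in BigQuery SQL."""
    return f"{project}.{dataset}.{table}"

def sample_query(project, dataset, table, limit=10):
    """Build a simple SELECT over one exported table."""
    return f"SELECT * FROM `{table_ref(project, dataset, table)}` LIMIT {limit}"

# Hypothetical names: one dataset per platform project,
# one table per data type (e.g. keyword rankings).
print(sample_query("my-seo-exports", "client_site", "keyword_rankings"))
```

You would run a query like this in the BigQuery console, or point a Google Data Studio data source at the table directly.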

(You are responsible for paying your own Google Cloud Platform costs. At the time of writing, the first 1 TB of data processed by BigQuery each month is free, but please check Google's latest BigQuery pricing.)
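As a rough illustration of how that on-demand pricing model works, here is a sketch of a query-cost estimate. The per-TB rate below is a placeholder assumption, not Google's current price; only the "first 1 TB free" tier is taken from the note above.

```python
def estimated_query_cost_usd(bytes_processed, free_tb=1.0, usd_per_tb=5.0):
    """Estimate monthly on-demand query cost.

    usd_per_tb is a placeholder rate; check Google's current
    BigQuery pricing before relying on this number.
    """
    tb = bytes_processed / 2**40           # bytes -> TB (binary)
    billable = max(0.0, tb - free_tb)      # first free_tb TB are free
    return billable * usd_per_tb

# Processing 0.5 TB in a month stays inside the free tier.
print(estimated_query_cost_usd(0.5 * 2**40))  # 0.0
```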
1) Log in to your Google Console
- Log in to your Google account and then go to https://console.cloud.google.com/
2) Create a Service Account

- Go to IAM & Admin > Service Accounts and follow the wizard.
2 a)
- Provide a name and (optionally) a description. Google will automatically create the Service account ID:

2 b)
- Assign the service account you've just created the role of 'BigQuery User':

2 c) Grant users access to this service account
- This step is optional. You can simply click the DONE button here.
3) Create a key file
- From the Service Accounts view (IAM & Admin > Service Accounts), click on the Actions column on the right-hand side and select 'Create key'.

You will be asked whether to create a JSON file or a P12 file:

- Leave 'JSON' selected and click on 'CREATE'. This will download a JSON file to your machine.
- If you now click on IAM & Admin > IAM, you should see your service account listed as a Member with the 'BigQuery User' role assigned to it. If you don't see this, you will need to manually add a Member, making sure the Member name matches the service account and that the 'BigQuery User' role is assigned to it:
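Once the JSON key has downloaded, you can sanity-check it before uploading it to the platform. This is an optional sketch that assumes the standard layout of a Google service-account JSON key; the validation function and its name are our own, not part of any Google library.

```python
import json

# Fields present in a standard Google service-account JSON key.
REQUIRED_FIELDS = {
    "type", "project_id", "private_key_id", "private_key",
    "client_email", "client_id", "auth_uri", "token_uri",
}

def validate_service_account_key(path):
    """Raise ValueError if the file does not look like a service-account key."""
    with open(path) as f:
        key = json.load(f)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError(f"unexpected key type: {key['type']!r}")
    return key
```

If this raises an error, re-download the key from the Service Accounts view rather than editing the file by hand.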

4) Add BigQuery integration to one or more projects in your account
- Log in, go to Settings > Integrations and click on 'Google Big Query':

Select one or more projects to sync with BigQuery:

5) Upload the JSON file
Finally, upload the JSON file you created in Step 3. The platform will then send your project data to BigQuery, and you will be able to take a Google Data Studio template and populate it with your own data by changing the template's data source to your BigQuery instance.