As a base requirement, check which Google Analytics property is in use so that the right data processing setup can be selected on the Adtriba side.

At the moment Adtriba supports the following GA properties and export schemas: Universal Analytics (ga_sessions export) and Google Analytics 4 (events export).

There are two methods of delivering the data:

  • Adtriba reads the data directly from BigQuery (Preferred method)

  • The customer delivers the data in a dedicated S3 bucket

OPTION I - Authorize Adtriba to read the BigQuery dataset with GA data

This is our preferred way of accessing the GA data.

  1. Adtriba will provide the customer with a Google Cloud service account

  2. The service account needs to be added as a BigQuery viewer (READER) on the GA 360 dataset (a minimal sketch follows this list)

  3. The customer sends the following information:

    1. the Google Cloud project ID where the GA BigQuery dataset is located

    2. the GA dataset name

    3. the GA table name (usually ga_sessions or events, depending on which property is being used)

    4. the region of the GA dataset
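
Step 2 can be done in the Cloud Console or via the BigQuery API. Below is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset and service account values are placeholders and should be replaced with the details provided by Adtriba and the customer.

from google.cloud import bigquery

# Placeholders: replace with your project/dataset and the service account
# e-mail address provided by Adtriba.
PROJECT_ID = "your-gcp-project"
DATASET_ID = f"{PROJECT_ID}.analytics_123456"
ADTRIBA_SA = "adtriba-reader@example-project.iam.gserviceaccount.com"

client = bigquery.Client(project=PROJECT_ID)
dataset = client.get_dataset(DATASET_ID)

# Append a READER (BigQuery viewer) entry for the Adtriba service account.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="userByEmail",
        entity_id=ADTRIBA_SA,
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])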

OPTION II - The GA data is delivered on a daily basis to a dedicated S3 bucket

Depending on the GA property in use, a daily data dump with the following specification needs to be placed in an S3 bucket that Adtriba will provide.

Integration of Universal Analytics data

  • Export the latest, full ga_sessions_YYYYMMDD table once per day (a minimal export sketch follows this list)

  • Data format must be JSON

  • Data files must be compressed as GZIP

  • Data can be split into multiple files for optimal performance on both sides

  • Upload the resulting file(s) to the Adtriba S3 bucket
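
As referenced in the list above, the export itself can be automated. The sketch below assumes the google-cloud-bigquery Python client and a Google Cloud Storage staging bucket (all names are placeholders); BigQuery writes newline-delimited JSON, which satisfies the JSON requirement.

from google.cloud import bigquery

PROJECT_ID = "your-gcp-project"   # placeholder
DATASET = "123456"                # your GA BigQuery dataset name
DATE = "20210201"                 # date of the daily export table

client = bigquery.Client(project=PROJECT_ID)
job_config = bigquery.ExtractJobConfig(
    destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON,
    compression=bigquery.Compression.GZIP,
)

# The wildcard lets BigQuery split large tables into multiple files.
client.extract_table(
    f"{PROJECT_ID}.{DATASET}.ga_sessions_{DATE}",
    f"gs://your-staging-bucket/ga_sessions_{DATE}/data_*.json.gz",
    job_config=job_config,
).result()  # wait for the export to finish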

Destination

s3://{BUCKET}/tracking/ga360_ua/dataset={DATASET}/ga_sessions={YYYYMMDD}/

  • The placeholders {BUCKET}, {DATASET} and {YYYYMMDD} must be replaced

  • Adtriba will define the {BUCKET}

  • The value of {DATASET} is the name of the BigQuery dataset where the tables are located; it is provided by the customer

  • The customer/data uploader must replace {YYYYMMDD} with the date used in the name of the export table (an upload sketch follows the file examples below)

Option 1: JSON & one file

s3://{BUCKET}/tracking/ga360_ua/dataset=123456/ga_sessions=20210201/data.json.gz

Option 2: JSON & multiple files

s3://{BUCKET}/tracking/ga360_ua/dataset=123456/ga_sessions=20210201/data_000.json.gz
s3://{BUCKET}/tracking/ga360_ua/dataset=123456/ga_sessions=20210201/data_001.json.gz
s3://{BUCKET}/tracking/ga360_ua/dataset=123456/ga_sessions=20210201/data_002.json.gz
...
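
As mentioned above, here is a minimal upload sketch with boto3, assuming the exported files have been copied to the local machine; the bucket name and file names are placeholders, and the key follows the layout described under Destination.

import boto3

BUCKET = "adtriba-provided-bucket"   # provided by Adtriba
DATASET = "123456"                   # your BigQuery dataset name
DATE = "20210201"                    # date from the export table name

s3 = boto3.client("s3")
prefix = f"tracking/ga360_ua/dataset={DATASET}/ga_sessions={DATE}"

# Upload one or more gzipped JSON files under the expected prefix.
for part in ["data_000.json.gz", "data_001.json.gz"]:
    s3.upload_file(part, BUCKET, f"{prefix}/{part}")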

Integration of Google Analytics 4 data

  • Export the latest, full events_YYYYMMDD table once per day

  • Data format must be JSON

  • Data files must be compressed as GZIP

  • Data can be split into multiple files for optimal performance on both sides

  • Upload it to the Adtriba S3 bucket

Destination

s3://{BUCKET}/tracking/ga360_ga4/dataset={DATASET}/events={YYYYMMDD}/

  • The placeholders {BUCKET}, {DATASET} and {YYYYMMDD} must be replaced

  • Adtriba will define the {BUCKET}

  • The value of {DATASET} is the name of the BigQuery dataset where the tables are located

  • The customer/data uploader must replace {YYYYMMDD} with the date used in the name of the export table

Option 1: JSON & one file

s3://{BUCKET}/tracking/ga360_ga4/dataset=123456/events=20210201/data.json.gz

Option 2: JSON & multiple files

s3://{BUCKET}/tracking/ga360_ga4/dataset=123456/events=20210201/data_000.json.gz
s3://{BUCKET}/tracking/ga360_ga4/dataset=123456/events=20210201/data_001.json.gz
s3://{BUCKET}/tracking/ga360_ga4/dataset=123456/events=20210201/data_002.json.gz
...
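
For GA4 the daily flow is the same as for Universal Analytics, only the table name (events_YYYYMMDD) and the ga360_ga4 key prefix differ. Below is a compact end-to-end sketch under the same assumptions (placeholder names, local copies of the exported files) as the Universal Analytics examples above.

import boto3
from google.cloud import bigquery

PROJECT_ID = "your-gcp-project"      # placeholder
DATASET = "123456"                   # your GA4 BigQuery dataset name
DATE = "20210201"
BUCKET = "adtriba-provided-bucket"   # provided by Adtriba

# 1) Export the daily GA4 events table as gzipped newline-delimited JSON.
bq = bigquery.Client(project=PROJECT_ID)
bq.extract_table(
    f"{PROJECT_ID}.{DATASET}.events_{DATE}",
    f"gs://your-staging-bucket/events_{DATE}/data_*.json.gz",
    job_config=bigquery.ExtractJobConfig(
        destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON,
        compression=bigquery.Compression.GZIP,
    ),
).result()

# 2) After copying the files from the staging bucket to the local machine,
#    upload them to the GA4 destination prefix.
s3 = boto3.client("s3")
prefix = f"tracking/ga360_ga4/dataset={DATASET}/events={DATE}"
for part in ["data_000.json.gz", "data_001.json.gz"]:
    s3.upload_file(part, BUCKET, f"{prefix}/{part}")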
