Import Google Analytics data into Adobe Analytics using Data Sources

On the one hand, Adobe Analytics remains my favorite web analytics tool on the market. The longer I use it, the more I appreciate all the well-thought-out features, from data collection to processing, storage, and analysis. Those features are even more impressive when compared with what Google Analytics has to offer.

And yet, on the other hand, even I can’t avoid having to work with Google Analytics in some way or another. In a large, global company, it is basically unavoidable to find Google Analytics on some small, long forgotten marketing landing page in some market. It gets even worse: Up until last year, I personally had to maintain an inherited Google Analytics instance on a legacy website and app. What a cruel world!

Besides those cases where someone in your company actually wants to use Google Analytics, there are also more forgivable ones. For example, a company may be in the process of migrating from Google Analytics to Adobe Analytics and may end up running both tools in parallel for a short time. Those companies may also be interested in not losing their historical data while obviously wanting to start using the new and better tool as soon as possible.

A widely adopted solution for this is to use Power BI or other business intelligence tools to query both Google Analytics and Adobe Analytics for their data. There is nothing wrong with that approach for larger business reporting cases where Power BI is already in use, but it comes with quite a lot of overhead when you are simply looking for a way to see your (old) Google Analytics data next to the shiny new Adobe Analytics data.

Luckily Adobe provides us with an easy solution to this challenge: Data Sources in Adobe Analytics give us a free and easy way to bring in summary-level data and metrics from Google Analytics (or any other tool, even outside of web analytics) and use it in Analysis Workspace and Report Builder. You may remember my previous post on how this can be used with native Google Search keywords through the Google Search Console.

While this is the approach for this post, it’s not the only way to bring Google Analytics’ data into the Adobe world. For example, we could (probably?) use Google’s BigQuery to export actual row-level event data and use either Adobe Analytics’ insertion APIs or even Adobe’s Experience Platform to import it into Customer Journey Analytics. I’ll maybe save that for another post. Let’s start!

Step 1: Decide on approach

Before we start to implement anything, we need to make up our mind about the scale of this project. With Data Sources, we could connect as many systems to Adobe Analytics as we want, so we need to decide if we are going for a generalized approach (which would work for any kind of data) or a more specific approach (which would only make sense for Google Analytics data). What do I mean by this? I’m glad you asked.

Imagine we want to import only User and Session numbers from Google Analytics. To achieve this, we would need at least one eVar and Success Event to be able to import data. If we follow a very Google Analytics-specific approach, we would name the eVar something like “Google Analytics Property” and set up as many Success Events as needed, so two in this case, named “Google Analytics Users” and “Google Analytics Sessions”. While this approach is completely feasible (and often used!) we would need to create new eVars and Events for any future system we want to connect to Adobe Analytics!

If we follow a more generalized approach instead, we would actually only need one eVar and Event for any system we could ever want to integrate with in the future. The eVar would be named something like “Imported Metric Name” and the Event maybe “Imported Metric” or “Imported Metric Value”. The fact that we would use those for Google Analytics today would not limit us in the future or require us to change the setup. Neat! This is what we are going to use in this post.
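To make the generalized schema concrete, a single imported day would become one row per metric, something like this (the eVar and Event slots here are hypothetical; yours will differ):

```
Date        Imported Metric Name (eVar)   Imported Metric (Event)
12/12/2021  ga:users                      1234
12/12/2021  ga:sessions                   2345
```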

Before we create those two elements, there is one more consideration waiting for us: technically, we could follow either a manual or an automated process. Adobe Analytics can handle data that is manually uploaded to an FTP server or imported through the API. Depending on how long the integration should be in place, and whether data should be back-filled only once or imported continuously, you may not even need to code anything! I’ll outline both approaches below.

Step 2: Create Data Source in Adobe Analytics

Depending on which approach you chose, you first need to create the number of eVars and Events you would like to use. To do this, first head to the Report Suites menu in the Admin area of Adobe Analytics:

Finding the Report Suites menu in Adobe Analytics

From the list of Report Suites, select the ones that should receive data and click Edit Settings (1), then Conversion (2), and then first Conversion Variables (3) and afterwards Success Events (4):

Finding eVar and Event settings in Adobe Analytics

In both menus, find a free, unused slot for any of the eVars or Events you need and name them in a descriptive way. I’ve named my eVar “Imported Metric Name” and the Event “Imported Metric”. With those, go to the Data Sources menu by clicking on Admin (1), “All admin”, and then Data Sources (2):

Finding the Data Sources menu in Adobe Analytics

Make sure you have selected the right Report Suite on the top right of the screen. Then, go to Create (1), select the “Generic” category (2) and type (3), and click “Activate” (4). Don’t worry, that will not activate anything straight away:

Creating a new Data Source in Adobe Analytics

This will open a wizard that takes you through the actual creation of the Data Source. Don’t worry about the warning about potential costs; our approach is completely free. You can match the dimensions and metrics you want to use to the Analytics components, but the choices you make in this wizard largely don’t matter (you can always import other metrics and dimensions afterwards), even if it doesn’t seem that way. The most important page is the last one with the FTP information, which will enable you to manually import data. You can also download the manual upload template on this screen:

Data Sources account information in Adobe Analytics

That’s all we need to do in Adobe Analytics! Now on to the hardest part of this post…

Step 3: Manually export and format data from Google Analytics

I’ll try to keep this part short, mostly to keep my sanity intact. In Google Analytics, you can export data manually by clicking the Export (1) button at the top right, then selecting CSV (2) as the format:

Quick, grab your data and get out of there!

This will give you a CSV file. Of course, we still need to put some work in to clean up an extract like this:

Extracted Google Analytics data. Why, Google?!

If you have downloaded the Adobe Analytics template in the previous step, you already know how Adobe would like our data to look:

Adobe Analytics Data Source Template

First, we need to delete any row with incomplete data in the Google Analytics file (why would they even include rows without a date?). Then, remove the thousands separators from the metrics (again, why?!). Pro tip: use Regex Replace in Visual Studio Code with a search term like “,([0-9]{3})” and a replacement of “$1”. We also need to change the date format to “m/d/yyyy” (search for “([0-9]{4})([0-9]{2})([0-9]{2})”, replace with “$2/$3/$1”). If you follow my generalized approach, you also need to unpivot the rows so there is only one metric and descriptor per row. The last step is to change the separator from commas to tabs, like this:

Finished Adobe Analytics import file
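If this cleanup has to happen more than once, it can also be scripted. Here’s a minimal Python sketch of the same steps, assuming a hypothetical row layout of a yyyymmdd date plus metric name/value pairs (reading and writing the files is left out for brevity):

```python
import re

def clean_ga_row(date_raw, *metrics):
    """Turn one Google Analytics export row into generalized Data Sources rows.

    date_raw: a yyyymmdd string; metrics: (name, value) pairs like ("ga:users", "1,234").
    Returns [] for incomplete rows without a valid date.
    """
    match = re.fullmatch(r"(\d{4})(\d{2})(\d{2})", date_raw or "")
    if not match:
        return []  # drop rows with incomplete data
    # reformat the date to m/d/yyyy as expected by the template
    date_fmt = match.group(2) + "/" + match.group(3) + "/" + match.group(1)
    # strip thousand separators and unpivot to one metric per row
    return [[date_fmt, name, value.replace(",", "")] for name, value in metrics]

rows = clean_ga_row("20211212", ("ga:users", "1,234"), ("ga:sessions", "2,345"))
print(rows)
# [['12/12/2021', 'ga:users', '1234'], ['12/12/2021', 'ga:sessions', '2345']]
```

Writing the result out with Python’s csv module and delimiter="\t" then produces the tab-separated import file.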

With this, we can now upload the data to Adobe Analytics!

Step 4: Manually upload Data Source files to Adobe Analytics

After you’ve formatted the file from Google Analytics (or any data source you want!), we’re good to upload it to Adobe Analytics. In my example, I’ve named the data file “import.txt”. To trigger the actual ingestion into Adobe Analytics, we also need to upload an empty file with the same name but with .fin as the extension, like “import.fin”. With a tool like FileZilla, it’s super easy to just drag and drop the files to the FTP server that we got from Step 2. This is what my uploaded files look like:

Uploaded Data Sources files for Adobe Analytics

After a few moments, those two files will be gone from the directory. If everything went well, the directory is empty and the data is now being ingested. If not, you will see a new folder called “files_with_errors” that now contains the data file, and you will get an email with information on what went wrong.
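By the way, the drag-and-drop in FileZilla could equally be scripted with Python’s built-in ftplib. A minimal sketch (the host and credentials shown are placeholders; use the ones from the Data Sources account information screen):

```python
from ftplib import FTP

def upload_datasource(ftp, data_path):
    """Upload a Data Sources data file plus the empty .fin file that triggers ingestion."""
    fin_path = data_path.rsplit(".", 1)[0] + ".fin"
    open(fin_path, "wb").close()  # the .fin trigger file just needs to exist, empty
    for path in (data_path, fin_path):
        with open(path, "rb") as f:
            # upload the data file first, then the .fin file
            ftp.storbinary("STOR " + path, f)

# Hypothetical connection details from the Data Source wizard:
# with FTP("ftp.omniture.com", "my_user", "my_password") as ftp:
#     upload_datasource(ftp, "import.txt")
```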

Once the processing has finished, your imported data will be available in Analysis Workspace, just like any other metric. If you have followed the general approach of importing data, you could now also create some Calculated Metrics for the imported data:

Google Analytics metrics in Adobe Analytics/Analysis Workspace

If you have the resources to build such a manual process or don’t need frequently updated data, this will give you a lot of value already! The general approach works just as fine for any other data coming from virtually every tool you could think of! But there is even more that we can do…

Step 5: Automated process through Google Analytics and Adobe Analytics APIs

Now, instead of manually uploading those files every day, week, or month, you may want to invest the time to build an automated import from Google Analytics to Adobe Analytics. As I’ve shown in the previous post about the Google Search keywords, Adobe Analytics offers the Data Sources API to automate the import. Now all we need is a small script that does the same as before, but first exports Google Analytics data instead of Search Console data. Should be easy, right?

There is at least one nice thing I can say about Google and Google Analytics: they maintain Python packages for their APIs, while we still have to build things ourselves in the Adobe world. I won’t go into all the details on how to set up API access on Adobe’s or Google’s end (I’ve used this guide for the Google part, which did not prevent me from accidentally creating a GA4 property… Gah…).

To familiarize myself with the Google Analytics API, I ran a simple trended report for Users and Sessions like this:

from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
KEY_FILE_LOCATION = 'client_secret.json'
VIEW_ID = '12345678'

credentials = ServiceAccountCredentials.from_json_keyfile_name(KEY_FILE_LOCATION, SCOPES)
analytics = build('analyticsreporting', 'v4', credentials=credentials)
ga_response = analytics.reports().batchGet(
    body={
        'reportRequests': [
        {
            'viewId': VIEW_ID,
            'dateRanges': [{'startDate': '7daysAgo', 'endDate': 'today'}],
            'metrics': [{'expression': 'ga:sessions'},{'expression': 'ga:users'}],
            'dimensions': [{'name': 'ga:date'}]
        }]
    }
    ).execute()

print(ga_response)

This gives me a response like this:

{
    "reports": [
        {
            "columnHeader": {
                "dimensions": [
                    "ga:date"
                ],
                "metricHeader": {
                    "metricHeaderEntries": [
                        {
                            "name": "ga:sessions",
                            "type": "INTEGER"
                        },
                        {
                            "name": "ga:users",
                            "type": "INTEGER"
                        }
                    ]
                }
            },
            "data": {
                "rows": [
                    {
                        "dimensions": [
                            "20211212"
                        ],
                        "metrics": [
                            {
                                "values": [
                                    "1",
                                    "1"
                                ]
                            }
                        ]
                    }
                ],
                "totals": [
                    {
                        "values": [
                            "1",
                            "1"
                        ]
                    }
                ],
                "rowCount": 1,
                "minimums": [
                    {
                        "values": [
                            "0",
                            "0"
                        ]
                    }
                ],
                "maximums": [
                    {
                        "values": [
                            "1",
                            "1"
                        ]
                    }
                ]
            }
        }
    ]
}

Nice! If we want to use the same code that we’ve used in the past, we need to reformat this response a little. I use this snippet:

result_rows = []

metric_headers = ga_response["reports"][0]["columnHeader"]["metricHeader"]["metricHeaderEntries"]
for row in ga_response["reports"][0]["data"]["rows"]:
    # reformat the yyyymmdd date into mm/dd/yyyy/00/00/00 for Data Sources
    date_raw = row["dimensions"][0]
    date_fmt = date_raw[4:6] + "/" + date_raw[6:8] + "/" + date_raw[0:4] + "/00/00/00"
    # one output row per metric, keyed by the metric's header name
    for i, value in enumerate(row["metrics"][0]["values"]):
        result_rows.append([date_fmt, metric_headers[i]["name"], value])
print(result_rows)

to iterate the result and get it in a generalized format like this:

[["12/12/2021/00/00/00", "ga:sessions", "1"], ["12/12/2021/00/00/00", "ga:users", "1"]]

With this format, we are able to import daily data with Google Analytics’ metric name as the Imported Metric Name and the corresponding value as the Imported Metric. Now all that is left to do is create the import job through the Adobe Analytics API, just like before. This time we need a lot less code:

import requests

# access_token, config, and global_company_id come from the authentication
# setup described in the previous post
dataSources = requests.post(
        "https://api.omniture.com/admin/1.4/rest/?method=DataSources.Get",
        headers={
            "Authorization": "Bearer {}".format(access_token),
            "x-api-key": config["apiKey"],
            "x-proxy-global-company-id": global_company_id
        },
        data={'reportSuiteID': config["report_suite_id"]}
    ).json()

for dataSource in dataSources:
    if dataSource["name"] == "Import Demo":
        dataSourceID = dataSource["id"]
        print("Found Data Source ID")
        break

jobresponse = requests.post(
    "https://api.omniture.com/admin/1.4/rest/?method=DataSources.UploadData",
    headers={
        "Authorization": "Bearer {}".format(access_token),
        "x-api-key": config["apiKey"],
        "x-proxy-global-company-id": global_company_id
    }, 
    json={
        "columns": ['Date', 'Evar 7','Event 13'],
        'reportSuiteID': config["report_suite_id"],
        'dataSourceID':dataSourceID,
        "finished": True,
        "jobName": "Google Analytics import",
        "rows": result_rows
    }
)

This will work as before; you may want to customize the eVar and Event numbers and the name of the Data Source to match your configuration. Once the data has finished processing, it will show up in Analysis Workspace like this:

Manually and automatically uploaded Google Analytics data in Analysis Workspace

Nice! You could decide to either name the metrics differently in the code above or use Classifications in Adobe Analytics to turn something like “ga:users” into a friendly name like “Google Analytics Users”. For this small POC, I’d say we are done!

Wrap up

This was a lot of fun! I hope this post may help you, especially if you are already on Adobe Analytics but have some legacy Google Analytics properties lying around that you don’t really want to touch anymore but somehow need to have data from. After all, even if you still collect data through Google Analytics, you wouldn’t want anyone to touch the interface, right?

Getting this data into Adobe Analytics was super easy. Still, there are some points I’ve found on this journey that I would like Adobe to improve on:

  • While Google maintains Python packages for its APIs, Adobe doesn’t have anything like this. You either have to build things yourself or rely on enthusiasts to build API wrappers.
  • Uploading Data Sources through FTP worked fine for a technical person like me, but a nice web interface (like the Classification uploader) would be super awesome!
  • Customer Journey Analytics has some clear advantages over Adobe Analytics, but to this day doesn’t have anything similar to Data Sources (summary data that doesn’t increment Events, Sessions, and People). This would be great to have too!

I’m considering creating a dedicated Report Suite for imported data at my actual company, which would get rid of any potential naming collisions. Hopefully this gave you some inspiration too! As always, let me know if you found this post useful. See you next time!

Can Google Analytics data be imported in Adobe Analytics?

Absolutely! Besides importing raw data through BigQuery, it is very easy to use Data Sources to import metrics completely for free! This makes it very fast and affordable to consolidate data from Google Analytics and Adobe Analytics.

How can I merge data from Google Analytics and Adobe Analytics?

While you could always merge data manually in Excel or in an automated way in Power BI, it is much easier and more comfortable to import Google Analytics data into Adobe Analytics. If you use Adobe Analytics Data Sources, it is completely free and works retroactively!

Why should I import Google Analytics data into Adobe Analytics?

Consolidating data in Adobe Analytics brings many advantages. It is very comfortable to only use one tool instead of two. In addition to that, you will finally have all your data points in one great Analysis Workspace project in Adobe Analytics. This will make it a lot easier for your stakeholders to pull unified data from all tools on their own.

What helps with migrating from Google Analytics to Adobe Analytics?

Migrating between two web analytics tools like Google Analytics and Adobe Analytics is a big challenge. Luckily, Data Sources in Adobe Analytics can be used to import historic data from Google Analytics for free! As discussed in this article, the process can be done manually or automatically through the APIs.