Summary Report Suites in Adobe Analytics
Ever since Adobe released Analysis Workspace in 2016, customers have been looking for a way to combine data from multiple Report Suites within a single Project in Analysis Workspace. Earlier in 2020 we finally gained the ability to pull data from more than one Report Suite in Workspace, since Report Suites can now be selected at the Panel level instead of for the whole Project.
While this feature is awesome and a huge improvement, users like myself still want to combine multiple Report Suites in a single Freeform Table within the same Panel. Unfortunately, we still have to invest some work to get a view like this, where the table on the right combines data from the two Report Suites on the left:

The screenshot above was taken from a Report Suite which was created to summarize data from other Report Suites. This can be done without any changes to the implementation or additional cost, and with historical data for an unlimited number of source Report Suites. In this post I will describe the process to get such a Summary Report Suite. You can find the complete script on GitHub.
Built-in Analytics features to summarize data
Before we start implementing our own solution for combining data from other Report Suites, let's quickly go over the options Adobe normally recommends for this kind of task. Traditionally, we could choose between setting up a Rollup Report Suite or changing our tracking implementation to use a Global Report Suite. Adobe describes both in their documentation, but I will quickly go over these approaches.
First, let's look at Rollup Report Suites. This is a very old feature that was already available before the times of Analysis Workspace. Rollups allow us to summarize data from up to 40 "real" Report Suites with a daily import job. The imported data is literally what Excel would produce if we summed up different Report Suites: the Unique Visitors are simply added together without any deduplication. If we add more Report Suites to an existing Rollup, historical data is not imported either. On top of that, only 100 events can be imported, and breakdowns and segments are not supported at all. To round things off, Rollups cannot be used in Analysis Workspace. It's a bit of a legacy feature.
Second, there are Global Report Suites. This approach requires you to either send your data to two Report Suites instead of one (incurring additional cost) or send all of your data to one big Report Suite. In either case, historical data cannot be added to the Global Report Suite. In Analytics, you would use Segments to differentiate between your websites or products as you would with more granular Report Suites. Based on those Segments, you could also set up Virtual Report Suites if you don't need the full feature set of a "real" Report Suite (such as the Visit Number dimension).
Adobe would certainly recommend the second approach, since it is the only one that is really supported today. Also, features like Cross-Device Analytics benefit from Global Report Suites, as they include more users and devices for Visitor identification. While this setup certainly is the way to go for completely new Analytics customers, a lot of companies would need to change their implementation. That means losing historical data and forcing users to rely on Segments or Virtual Report Suites. Also, not all data can or should be measured together with everything else. Luckily, there is a middle way.
Summary Report Suites in Analytics
Wouldn't it be awesome if we had a flexible way to summarize our data within Adobe Analytics? Even without Visitor deduplication, having something similar to Rollup Report Suites within Analysis Workspace would be really cool. If it also included the option to pull historical data from an unlimited number of "real" Report Suites for all Dimensions and Events, we would be quite happy.
That is our goal for today. We want to build this using only Adobe Analytics features in an expandable and automated way. To do that, we will utilize the Reporting API to pull data from our source Report Suites, process the data with a bit of Python and reimport it into Analytics using a Data Source.
The script we use for this is quite simple and can be extended to perfectly match what you need. As always, I've put the whole script on GitHub and will go over a basic version of it in this post. But before we can import any data, we need to make some preparations in Analytics. In summary, we need to follow these steps:
- Create a new Report Suite in Analytics to receive our summarized data
- Create a Data Source to import data
- Create a Python script to export data from Analytics using the Reporting API and reimport it using the Data Source
- Set up some automation to run the script however often we like
Let’s get our hands dirty!
Creating the Report Suite and Data Source
To get started, head over to the Admin section of Adobe Analytics and create a new Report Suite. Give it a descriptive name and select the settings however you like. If you want to combine data from multiple time zones, you should select the leading timezone when you create this new Summary Report Suite.
Next, we should change some settings. We are going to use a Data Source to import data, which means we cannot use Traffic Variables (props). I recommend you disable them in the Report Suite settings like this:

Now we need to decide which Conversion Variables (eVars) we want to use. We need at least one eVar for the import to work, so I recommend you set it to contain the Source Report Suite ID like this:

This is the minimal setup. Since we will only import summary data, you need to include all the dimensions you want to use in this new Summary Report Suite. For example, if you want to break down your data by device type, you need to set up an eVar and include it in the export. If you want to see your Unique Visitors at different granularities like daily, weekly, or monthly, you should also create an eVar to hold that granularity, because Analytics would otherwise just sum up your Visitors and thereby inflate the numbers.
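To see why summed Unique Visitors inflate, consider this toy calculation (all numbers are hypothetical):

```python
# Hypothetical daily Unique Visitor counts for one week.
daily_uv = [100, 120, 90, 110, 95, 80, 105]

# Summing the daily values counts a returning visitor once per day...
summed_weekly = sum(daily_uv)
print(summed_weekly)  # 700

# ...but the true weekly Unique Visitors could be far lower if most
# visitors return on several days. A granularity eVar lets you import
# a separately computed weekly value instead of letting Analytics sum
# the daily rows.
true_weekly_uv = 250  # hypothetical deduplicated weekly count
print(summed_weekly > true_weekly_uv)  # the naive sum overstates UVs
```

This is exactly the inflation the granularity eVar guards against: reports at weekly or monthly granularity should only use rows imported at that granularity.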
The next thing on our list are Success Events. These will hold the actual numbers we are going to import, so we need one Event for each metric in our upload. We can disable or hide all other Events. If we wanted to import Unique Visitors, Visits, and Page Views, we could set up our Success Events like this:

That’s all we need for our Report Suite to work. To create the import, we head over to the Data Sources section in the Analytics Admin area. Create a new Data Source with the “Generic Summary Data Only” setting:

It doesn’t matter much which settings you choose in the wizard. The most important setting is the name, since we need to use it later. This is all we need to do in Analytics!
Building the import script
If you've followed this blog for a while, it should be no surprise that we are going to use Python for our script and build on top of the work we've already done in the past. As always with these technical posts, you can find the complete script on GitHub.
Before we start coding, let’s plan what we need to do:
- Authenticate to the Adobe Analytics API
- Fetch Metrics from a definable list of Source Report Suites
- Combine the results into a single array of rows to import
- Upload that array using the Data Source we created before
We start by defining our imports and config for the script:
import datetime
import requests
import sys
import jwt
import httplib2
config = {
    "apiKey": "3ec159485be87ed8fk6f9g37j79d67153b31e6",
    "technicalAccountId": "6JCD048F50A6495F35C8D9D4D2@techacct.adobe.com",
    "orgId": "25DB24210614E744C980A8A7@AdobeOrg",
    "secret": "d033109-fd7a71ba2-489-9cf455-f2f87f4298ab",
    "metascopes": "ent_analytics_bulk_ingest_sdk",
    "imsHost": "ims-na1.adobelogin.com",
    "imsExchange": "https://ims-na1.adobelogin.com/ims/exchange/jwt",
    "discoveryUrl": "https://analytics.adobe.io/discovery/me",
    "key": b'-----BEGIN PRIVATE KEY-----\nMIIEvAIBADAN7wGu1P3aNA3yjqGA==\n-----END PRIVATE KEY-----',
    "startdate": (datetime.datetime.today() - datetime.timedelta(days=8)).strftime("%Y-%m-%d"),
    "enddate": (datetime.datetime.today() - datetime.timedelta(days=1)).strftime("%Y-%m-%d"),
    "sourcersids": ["suite1", "suite2"],
    "targetrsid": "suite3",
    "datasourcename": "Summary Import"
}
Everything up to the "key" entry should be familiar from previous posts: it is the data we need for authentication and for talking to the Adobe Analytics API. The interesting part starts with "startdate", which defines the start date for the imported data. In this example, the import starts 8 days back from today, which you can adjust by changing "days=8" to something like "days=90" for 90 days. "enddate" follows the same logic and defines the end date, which is set to yesterday in this example (today minus one day). The "sourcersids" list holds the IDs of the Source Report Suites we are going to query for data; feel free to extend it to include more Report Suites. "targetrsid" holds the ID of the Report Suite we created above, which will receive the data. Finally, "datasourcename" contains the name of the Data Source we created before.
To handle the authentication to the Adobe APIs, we are using this code from the Adobe documentation:
def get_jwt_token(config):
    return jwt.encode({
        "exp": datetime.datetime.utcnow() + datetime.timedelta(seconds=30),
        "iss": config["orgId"],
        "sub": config["technicalAccountId"],
        "https://{}/s/{}".format(config["imsHost"], config["metascopes"]): True,
        "aud": "https://{}/c/{}".format(config["imsHost"], config["apiKey"])
    }, config["key"], algorithm='RS256')

def get_access_token(config, jwt_token):
    post_body = {
        "client_id": config["apiKey"],
        "client_secret": config["secret"],
        "jwt_token": jwt_token
    }
    response = requests.post(config["imsExchange"], data=post_body)
    return response.json()["access_token"]

def get_first_global_company_id(config, access_token):
    response = requests.get(
        config["discoveryUrl"],
        headers={
            "Authorization": "Bearer {}".format(access_token),
            "x-api-key": config["apiKey"]
        }
    )
    return response.json().get("imsOrgs")[0].get("companies")[0].get("globalCompanyId")

jwt_token = get_jwt_token(config)
access_token = get_access_token(config, jwt_token)
global_company_id = get_first_global_company_id(config, access_token)
Now we are authenticated. Let's dive into how to get data from the Adobe Analytics Reporting API. We will be using the 2.0 API, since it is quite fast and flexible regarding things like Filters, Segments, and Calculated Metrics. A loop to get data at daily granularity for our list of Report Suites looks like this:
resultrows = []
for rsid in config["sourcersids"]:
    print("Fetching data for", rsid)
    result = requests.post(
        "https://analytics.adobe.io/api/" + global_company_id + "/reports",
        headers={
            "Authorization": "Bearer {}".format(access_token),
            "x-api-key": config["apiKey"],
            "x-proxy-global-company-id": global_company_id
        },
        json={
            "rsid": rsid,
            "globalFilters": [
                {
                    "type": "dateRange",
                    "dateRange": config["startdate"] + "T00:00:00.000/" + config["enddate"] + "T23:59:59.999"
                }
            ],
            "metricContainer": {
                "metrics": [
                    {
                        "columnId": "Visitors",
                        "id": "metrics/visitors"
                    },
                    {
                        "columnId": "Visits",
                        "id": "metrics/visits"
                    },
                    {
                        "columnId": "Page Views",
                        "id": "metrics/pageviews"
                    }
                ]
            },
            "dimension": "variables/daterangeday",
            "settings": {
                "dimensionSort": "asc",
                "limit": "50000"
            }
        }
    ).json()
    for row in result["rows"]:
        values = []
        for value in row["data"]:
            values.append(str(value))
        date = datetime.datetime.strptime(row["value"], "%b %d, %Y").strftime("%m/%d/%Y/00/00/00")
        values.insert(0, rsid)
        values.insert(0, date)
        resultrows.append(values)
There is quite a bit to unpack here, so let's go through it. First we define the resultrows array that will hold our combined result rows from all our Source Report Suites, which we then start iterating over.
The headers define the basic structure of our request to the Analytics API, but there is not much of interest there. The json payload defines the actual request, which is much more interesting. I won't go over all the details of how the Analytics 2.0 API works, but this JSON requests a daily trended report (the "variables/daterangeday" dimension) with Unique Visitors, Visits, and Page Views as metrics (the metricContainer) for the date range we set at the beginning (the globalFilters) from the current Report Suite (the rsid field). If you feel unsure about how to create this request structure, there is a simple way to get it directly from Analysis Workspace.
The response from our request is then parsed as JSON, which lets us iterate over the result rows. In our simple example, we just add the current row of data to the resultset, converting each value to a string. Since the Reporting API gives us a date value like "Aug 1, 2020", we need to convert it to a format the Data Sources API understands. If you needed to convert your imported data to a different timezone, this would be the place to do that as well. Finally, we prepend the Report Suite ID and then the date (so the date ends up in the first column) and append the row to the combined resultset.
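If you did need to shift the imported timestamps into a different timezone, a minimal sketch using Python's zoneinfo module could look like the following. The timezone names and the helper function are my own choices for illustration, assuming a source suite reporting in US Pacific time and a target suite in Central European time:

```python
import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

def convert_date(value, source_tz="America/Los_Angeles", target_tz="Europe/Berlin"):
    """Parse a Reporting API date like 'Aug 1, 2020', interpret it in the
    source suite's timezone, and shift it into the target suite's timezone.
    Returns the MM/DD/YYYY/HH/MM/SS format Data Sources expects."""
    naive = datetime.datetime.strptime(value, "%b %d, %Y")
    localized = naive.replace(tzinfo=ZoneInfo(source_tz))
    shifted = localized.astimezone(ZoneInfo(target_tz))
    return shifted.strftime("%m/%d/%Y/%H/%M/%S")

print(convert_date("Aug 1, 2020"))  # 08/01/2020/09/00/00
```

Note that shifting daily summary rows like this moves them by a fixed offset within the day; for large offsets a row could land on a different calendar day, so think about whether your summary data should be shifted at all.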
We need to query the Analytics API one more time to get the ID of the Data Source we created before. To do this, we list the available Data Sources in our Target Report Suite and compare each name to the one we gave our Data Source above. If we find a match, we save the ID to a variable:
dataSources = requests.post(
    "https://api.omniture.com/admin/1.4/rest/?method=DataSources.Get",
    headers={
        "Authorization": "Bearer {}".format(access_token),
        "x-api-key": config["apiKey"],
        "x-proxy-global-company-id": global_company_id
    },
    data={'reportSuiteID': config["targetrsid"]}
).json()

dataSourceID = None
for dataSource in dataSources:
    if dataSource["name"] == config["datasourcename"]:
        dataSourceID = dataSource["id"]
        print("Found Data Source ID")
        break
if dataSourceID is None:
    sys.exit("Data Source not found, check the datasourcename in the config")
That’s all we need. With the combined resultset and the Data Source ID, we are able to send the data to our new Report Suite using the DataSources.UploadData method from the Analytics API:
jobresponse = requests.post(
    "https://api.omniture.com/admin/1.4/rest/?method=DataSources.UploadData",
    headers={
        "Authorization": "Bearer {}".format(access_token),
        "x-api-key": config["apiKey"],
        "x-proxy-global-company-id": global_company_id
    },
    json={
        "columns": ["Date", "Evar 1", "Event 1", "Event 2", "Event 3"],
        "reportSuiteID": config["targetrsid"],
        "dataSourceID": dataSourceID,
        "finished": True,
        "jobName": "Summary Import",
        "rows": resultrows
    }
)
print(jobresponse.json())
Pay close attention to the "columns" field. This is where we define the structure of our upload, which must match the columns of the resultset we built above. In our example, we upload the Date, eVar 1 (which holds the Report Suite ID), and Events 1 through 3 for Unique Visitors, Visits, and Page Views. If we want to include more Events or eVars, or use different numbers, we need to change them both here and in the resultset.
If we want to identify our upload later on, we can give it a special name via the "jobName" field. The last line prints the result, which should be "True" if everything goes well.
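A single DataSources.UploadData call shouldn't carry an unbounded number of rows, so for a long historical backfill it can make sense to split resultrows into several calls. A minimal sketch of the batching logic (the batch size here is an illustrative choice, not a documented API limit):

```python
def batch_rows(rows, batch_size=5000):
    """Yield successive batches of rows; batch_size is an
    illustrative choice, not a documented limit."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

# Each batch would go into its own DataSources.UploadData call,
# setting "finished": True only on the last one so Analytics
# treats the batches as a single job.
rows = [["08/01/2020/00/00/00", "suite1", "100", "120", "300"]] * 12000
batches = list(batch_rows(rows))
print(len(batches))       # 3 batches
print(len(batches[-1]))   # 2000 rows in the final batch
```

The loop sending the batches would then wrap the upload request shown above, passing each batch as "rows".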
That's all we need. If we run this script, it fetches our data and uploads it to our Target Report Suite. You could run it once with a long date range to import historical data, then create a daily cron job that imports only the last day. If you ever need to change or extend the imported set of data, just change the script and run the import again!
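One way to switch between the one-off backfill and the daily cron run without editing the script is to pass the lookback window on the command line. This is a sketch of my own; the "--days" argument name and the date_range helper are not part of the original script:

```python
import argparse
import datetime

def date_range(days_back, today=None):
    """Return (startdate, enddate) strings for an import going
    days_back days into the past, ending yesterday."""
    today = today or datetime.datetime.today()
    start = (today - datetime.timedelta(days=days_back)).strftime("%Y-%m-%d")
    end = (today - datetime.timedelta(days=1)).strftime("%Y-%m-%d")
    return start, end

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Summary Report Suite import")
    parser.add_argument("--days", type=int, default=1,
                        help="days back to import, e.g. 90 for a backfill")
    args = parser.parse_args()
    startdate, enddate = date_range(args.days)
    print(startdate, enddate)  # would replace config["startdate"]/["enddate"]
```

A daily cron entry could then call the script with the default, while `--days 90` handles the initial backfill.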
Using our combined data in Workspace
Now that we have our data from multiple Report Suites imported into Adobe Analytics, we can use our newly created Summary Report Suite to report on it. We can use all the features we know and love from Workspace with our custom eVars and Events.
Since we only imported summarized data, there are a few things we need to keep in mind:
- Unique Visitors and Visits are not deduplicated, neither across Source Report Suites nor across days and weeks. Analytics will happily sum your daily Unique Visitors if you use them at weekly granularity. To tackle this, you could create another eVar to hold the granularity and filter your reports by it for those two metrics. You should generally not use the Total and Grand Total for those metrics.
- Segments only work as filters now. You can create them, but they will only operate at hit level.
- Built-in Metrics won't work at all. You should rely only on the Metrics you import.
- The same goes for built-in Dimensions like Countries, Devices, etc., unless you manually import them as breakdowns.
Right now we only have Report Suite IDs to distinguish between the sources, which is not very user friendly. You could either include more information about the source in another eVar or use Classifications for metadata. I highly recommend the latter, since it saves eVars and allows you to change or extend the information later on. If you are summing up Report Suites that hold different products or countries, you could include that information as classification dimensions.
All things considered, this is a pretty neat solution to combine data from multiple Report Suites in Adobe Analytics. It’s flexible, retroactive, and completely free! Personally, I’m quite happy when I now see a graph like this:

I hope Adobe is already working on a "real" solution to this business need. I would love to be able to create a Virtual Report Suite that combines data from multiple Report Suites with Visitor deduplication (and the Visit Number dimension!). But until we get there, we can use this neat little hack.
Frequently asked questions
How can I combine data from multiple Report Suites in Adobe Analytics?
There are multiple options. You can always bring your metrics into another tool like Excel or Power BI, but that will likely not scale well. Instead, you can pull your data out of Adobe Analytics and re-import it with a Data Source. That gives you a combined view in the same tool for free! Of course, you could always use Adobe's Customer Journey Analytics for the same purpose.
Should I use Rollup Report Suites?
No, not really. Rollup Report Suites are very old and cumbersome to use. If you want to combine data from multiple Report Suites in Adobe Analytics, either use a Summary Report Suite with Data Sources or consider Adobe's Customer Journey Analytics.
Are Data Sources hard to use?
No, they are super easy to set up and use via APIs or manual uploads. For example, they can be used to combine data from multiple Report Suites.

German Analyst and Data Scientist working in and writing about (Web) Analytics and Online Marketing Tech.