Adobe Analytics is, without a doubt, the most mature Digital Analytics solution we can buy today. We know and love it for the flexible data collection, superior data model and processing, as well as the phenomenal Analysis Workspace interface. I have yet to find a use case I can’t cover with the tool or a platform where it can’t be implemented.
While Adobe Analytics works perfectly for the data we collect from digital experiences, companies often use many more digital tools to manage advertisements, analyze search performance, post to social media platforms, etc. Most of those tools have some form of integrated reporting feature, which can hold crucial information to monitor a company’s offsite presence.
With growing online activity, this can create a situation where people have to switch between many tools several times per day. To help manage this complexity, companies often consider either an existing Business Intelligence solution, like Power BI or Tableau, or even a dedicated solution just for marketing data, like a Marketing Data Warehouse. Ironically, this attempt to reduce complexity often increases it by introducing even more tools.
In this post, I want to explore a slightly different approach: Instead of introducing a dedicated reporting tool next to an already existing Adobe Analytics instance, why not bring all the other data into Analytics and use it as a central information hub? Ideally, we could have an integrated Analysis Workspace Project like this:
As you can see, it is very much possible! To achieve this, we are going to utilize Data Sources and a dedicated Data Warehouse Report Suite. Let’s start!
Common ways to bring data into Adobe Analytics
Before we start, let me quickly mention how we normally bring data into Adobe Analytics. In most cases, customers will use the available JavaScript libraries or SDKs to send data directly to Adobe Analytics. On a technical level, those Adobe-provided methods leverage the Data Insertion API, which can also be used directly for interactive experiences (as I’ve demonstrated in my post about Voice Assistants) or for post-hoc batch events. With the new Web SDK and Edge Network, we even have one more endpoint to send data to.
While those methods can be used to bring the actual event data into Adobe Analytics, we have another option to enrich the already collected data through Classifications. This is the best way to bring metadata, like product information or user attributes, into Analytics after the data has already been collected. There are some super easy ways to automate the process, for example through Google Sheets. A special form of this is the Customer Attributes feature, which is purpose-made for user-related metadata.
On top of those two well-known methods, we have one more option: Data Sources. With Data Sources, we can bring any summary-level data we want into Adobe Analytics (or even combine data from multiple Report Suites directly in Analytics), from every source we can imagine. As I’ve shown in my post about bringing Google Analytics data into Adobe Analytics, the process can be anything from manual imports to completely automated uploads through an API, or even a mix of both. And the best part: It is completely free! While we need to pay for the conventional event data described above, there is no cost associated with data imported through Data Sources. Sounds like the ideal approach for our experiment today!
Creating a Data Warehouse Report Suite in Adobe Analytics
As the first step, we go over to the Admin section in Adobe Analytics and create a new Report Suite. We don’t need to use any template, as this Report Suite will be pretty different from a normal one (if you remember my post about Summary Report Suites, you already know that I like my special-purpose Report Suites quite a bit!).
After the Report Suite has been created, we can think of the actual data we want to import. For my demonstration, I want to go with the dimensions listed below, each configured as an eVar in the Admin section:
- The data source type, like “Imported” for external systems or “Summarized” for data from other Adobe Analytics Report Suites
- The source system, like “Google Ads”, “Facebook”, or “Twitter” (if your company is still brave enough for that)
- The traffic channel, like “Search”, “Display”, “Social”, etc.
- The traffic or ad type, like “Paid” or “Organic”
- The campaign name
- The ad or post name
While those cover a broad range of use cases already, you are completely free to extend the list to whatever you need. Other dimensions like keyword match types, targeting, or audiences could very well be additional eVars. As a personal recommendation, I would always try to keep those dimensions as abstract as possible so they can be used for all required source systems. For my example, my list of eVars looks like this:
As you can see, I’ve changed the Expiration from Visit to Hit. That’s not technically necessary (the expiration is ignored for imported data) but helps with my OCD when it comes to consistent definitions across Report Suites.
Next up, we should configure our Success Events. Those will be needed to capture the actual metrics we want to import with the dimensions created above. Again, I’m trying to be as abstract as possible here, as we will be able to use Analysis Workspace’s amazing features to differentiate between systems later on. With this, my desired metrics for now are:
Pretty straightforward, right? This is how it looks in the interface:
Besides the names, there’s very little I’ve changed from the default settings. To spice things up, I’ve inverted the polarity of the Cost metric (assuming high cost is bad) and switched it to the Currency type. Easy, right? Now, let’s get ready to import data!
Creating a Data Warehouse Data Source in Adobe Analytics
To be able to import our data, we need to create a Data Source first. This acts as the destination to which data is imported. In the Admin section, head over to the Data Sources section and create a new Generic Summary Data Source, as shown below:
Clicking “Activate” does not actually activate anything (that would be far too simple!) but opens the creation wizard. You can name your Data Source in the first step (and ignore the note about additional fees, which is confusing since it doesn’t apply here), then go on to step 2. Here you need to, for whatever reason, name at least one metric. This field holds no real meaning at all; you can just put in a sarcastic value like me:
You can then choose any actual event in step 3 (again, it doesn’t matter; just choose anything or nothing, but make sure you don’t check the renaming field). You can again get creative in step 4, where the input once more doesn’t matter:
Map it to anything you fancy in step 5, briefly check the summary on step 6, then go on to the actually relevant page 7:
You can download a not very helpful template here, but the actually important part is the FTP account. We will need this later, so make sure to note it down. Don’t worry if you have closed the wizard already, you can always view the information again later. That’s it for this step!
Creating the Data Sources Input File for Adobe Analytics
Now, let’s create the file we want to import into Adobe Analytics. On your computer, create two files: a .txt file with the actual data and an empty file with the same name but a .fin extension. The .fin file lets Adobe know when the import has been uploaded successfully. The pair can look something like this:
Now, edit the Import.txt file with an editor of your choice; I found Visual Studio Code pretty easy to handle. To start, we need to list the Dimensions and Events in the header row. The columns need to match your configured eVars and Success Events, eVar 1 through 6 and Event 1 through 4 in my case. Note that values need to be separated with tabs:
Date Evar 1 Evar 2 Evar 3 Evar 4 Evar 5 Evar 6 Event 1 Event 2 Event 3 Event 4
As you might expect, we now need to fill those columns according to our setup. In my example, I have inserted some demo data for four days and an imaginary campaign running on Facebook and Google Ads:
As you can see, I will never get a job in marketing, but the file is pretty straightforward. The date column needs to be in the (wrong) US format, followed by the actual data in some more tab-separated columns. You can also use Excel to create those files, but be aware that it might mess with the format.
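If you prefer to script the file creation, here is a minimal Python sketch. All dimension values, dates, and metric numbers below are made-up demo data, and the column layout assumes the six eVars and four Success Events configured earlier:

```python
import csv
from datetime import date, timedelta

# Column layout must match the Report Suite configuration:
# Date, Evar 1-6, Event 1-4.
header = ["Date", "Evar 1", "Evar 2", "Evar 3", "Evar 4", "Evar 5",
          "Evar 6", "Event 1", "Event 2", "Event 3", "Event 4"]

# Hypothetical demo rows: one imaginary campaign on two platforms,
# spread over four days.
start = date(2023, 5, 1)
rows = []
for day in range(4):
    d = (start + timedelta(days=day)).strftime("%m/%d/%Y")  # US date format
    rows.append([d, "Imported", "Google Ads", "Search", "Paid",
                 "Demo Campaign", "Demo Ad", "1000", "50", "40", "12.34"])
    rows.append([d, "Imported", "Facebook", "Social", "Paid",
                 "Demo Campaign", "Demo Post", "800", "30", "25", "9.87"])

# Write the tab-separated data file...
with open("Import.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(header)
    writer.writerows(rows)

# ...and the empty .fin file that signals the upload is complete.
open("Import.fin", "w").close()
```

Since the script writes plain tab-separated text, it sidesteps the formatting quirks Excel can introduce.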
Next, you want to upload the files to the FTP server from the last step of the wizard. I like to use Filezilla, but any other client will work too. Just be aware that you need to first upload the .txt file, then the .fin file so Adobe Analytics knows the upload is completed. If there is something wrong with your file, you will get an email from Adobe which tells you what went wrong. If you get an error about the wrong number of columns, double check that you have used a tab separator everywhere and all rows contain all fields!
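The upload order can also be scripted with Python’s standard ftplib. This is a rough sketch; the host and credentials are placeholders for the values shown in the wizard’s last step:

```python
from ftplib import FTP

def upload_import(host, user, password,
                  data_file="Import.txt", fin_file="Import.fin"):
    """Upload a Data Sources file pair in the required order."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        # Order matters: the data file goes first...
        with open(data_file, "rb") as f:
            ftp.storbinary(f"STOR {data_file}", f)
        # ...then the empty .fin file, which tells Adobe Analytics
        # that the upload is complete.
        with open(fin_file, "rb") as f:
            ftp.storbinary(f"STOR {fin_file}", f)

# Example call with placeholder credentials from the wizard:
# upload_import("ftp.example.com", "my_ftp_user", "my_ftp_password")
```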
A few moments after the .fin file has been uploaded, you will notice that both files disappear from the FTP as they get imported. Once that has happened, we need to wait a little while until the data becomes visible in Analytics. Now, let’s check how it looks!
Visualizing Imported Data in Adobe Analytics
Now it’s time to validate our import and take a look at our data in Analysis Workspace! The good news: There is absolutely no difference between imported data and “real” data, it shows up in exactly the same way our usual data does! And because there is very little processing required to query the data, your reports will be blazing fast! Here’s how it looks for me:
Ain’t it beautiful? Our manually created and uploaded file makes the data appear in Analysis Workspace as if it was originally collected by Adobe! Now we can see precisely how our metrics look in total and, in the lower table, per platform. Looks like we’re only advertising to bots on Facebook, how useful to know! And since we’re in Analysis Workspace, we can create nice things like Calculated Metrics from our imported data, like in my last column here, where I quickly created a Cost per Click metric:
Of course we can also use any of our imported dimensions as a filter, for example a little something like this:
With all that amazing functionality, it is now quite simple to provide our stakeholders with a nice, interactive dashboard like the one teased at the beginning. Note how the dimensions are used as drop-down filters at the top, providing actual interactivity. The trend lines in the line chart also change with the selection:
With all of that done, it is now time for the…
What a fun project! I think we have proven quite impressively that Adobe Analytics can absolutely be used as a Data Warehouse to unify reporting from many different data sources. The manual import makes it super easy to start and leaves the option to automate the whole process through the API, potentially using tools like Supermetrics. Not bad for a completely free option!
Using Adobe Analytics this way is of course not the core functionality, so there are a few things to keep in mind. For example, there is no Visitor or Visit de-duplication, so the reach of each platform is just summed up in our example. This might make it necessary to create daily, weekly, and monthly versions of metrics and teach users when to use which. The same is true for most BI solutions too, but it should be kept in mind. Similarly, correcting falsely ingested data can be done (through uploading another file with negative metrics) but is more tedious than going into the database of a proper DWH.
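To illustrate the correction approach, here is a hypothetical sketch that negates the metric columns of a previously uploaded row, so that re-uploading the result cancels it out (the row values mirror the demo data from earlier):

```python
# A row that was ingested with wrong numbers (hypothetical demo values);
# the first seven columns are the date and dimensions, the rest are metrics.
bad_row = ["05/01/2023", "Imported", "Facebook", "Social", "Paid",
           "Demo Campaign", "Demo Post", "800", "30", "25", "9.87"]

# Keep the dimension columns identical and negate every metric column,
# so the summed totals in Analytics return to their pre-import state.
correction = bad_row[:7] + [str(-float(v)) for v in bad_row[7:]]

# The correction row, ready for another tab-separated upload file.
print("\t".join(correction))
```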
With this amazing functionality, Adobe Analytics still has a bit of an advantage over its future successor Adobe Customer Journey Analytics, where there is no comparable feature today. Good to know our good old Adobe Analytics still has some unique tricks up its sleeve!
As always, I’m curious to hear which applications you can think of. Let me know and have a great day!