The market for digital analytics solutions is going through some big changes. Google continues to annoy customers by removing features from the universally hated GA4, which helps companies like Amplitude, Piano, and Piwik Pro gain traction in a diversifying market. On the receiving end, vendors are eagerly trying to differentiate themselves by claiming that “product analytics” or “marketing analytics” can only really be done with their tools. Adobe has been quite busy defending their market leadership with their own new tool, Customer Journey Analytics. While I was one of the very first customers of this next-gen Adobe Analytics, my general recommendation has been to stay with (or buy) Adobe Analytics instead, as CJA was still missing some important features for anyone wanting to replace AA. However, this recommendation has now finally changed. As you can see in my comparison post, my general recommendation is now to buy Customer Journey […]
Tag: Web Analytics
2022 Offsite Content Roundup
Welcome to the first of potentially many posts in a newly founded yearly tradition. At some point during the year, I noticed that I don’t have a continuously updated list of the things I’ve been up to outside of this blog. And even besides that, it is sometimes hard to keep up with all the content that eventually makes it onto the blog. To help with both, I’m planning to release a post like this every year, potentially in the last week of the year. I’m going to try my best to remember all the conferences, events, webinars, and posts on other sites (or even update a draft post throughout the year) and put them in some sort of order. Thank you for another great year on this blog and I hope you enjoy all the different pieces of content! I have a YouTube channel now! Writing blog posts definitely is my go-to […]
Using Adobe Analytics as a free Data Warehouse
Adobe Analytics is, without a doubt, the most mature Digital Analytics solution we can buy today. We know and love it for the flexible data collection, superior data model and processing, as well as the phenomenal Analysis Workspace interface. I have yet to find a use case I can’t cover with the tool or a platform where it can’t be implemented. While Adobe Analytics works perfectly for the data we collect from digital experiences, companies often use many more digital tools to manage advertisements, analyze search performance, post to social media platforms, etc. Most of those tools have some form of integrated reporting feature, which can hold crucial information to monitor a company’s offsite presence. With growing online activity, this can create a situation where the people working with many tools have to switch between tools many times per day. To help manage this complexity, companies often consider using either […]
Understanding and getting the most out of Activity Map in Adobe Analytics
Adobe Analytics offers a metric ton of useful features which make website tracking super easy. Next to built-in dimensions like Referrer Type or Entry and Exit Page, there are many clever metrics like Entries and Exits for any given dimension. Today, I want to show you one of my favorite features and explain why I love it so much: Activity Map! Activity Map, also known as Click Map, helps Adobe Analytics users understand where website visitors, you guessed it, click on a given page. It comes with a bunch of dedicated dimensions, its very own user interface, and is even super easy to implement through Adobe Launch. In this post I’m going to cover some basics of Activity Map, explain why you should use the dedicated interface more often, and show how you can customize it for your very own use cases. And once you’re done reading this post, watch Jenn […]
Should you buy Adobe Analytics or Customer Journey Analytics for Web Analytics use cases today?
The market for web analytics tools is going through some big changes right now. On one hand, Google has officially announced that it will sunset Universal Analytics (also known as “the ‘good’ Google Analytics”) next year in favor of the universally hated GA4. On the other hand, Adobe is heavily promoting their new tool Customer Journey Analytics, both to new and existing Adobe customers. There is no doubt that, at some unannounced point in the future, Customer Journey Analytics will become Adobe’s only analytics tool. However, there is one important detail about Customer Journey Analytics that we need to keep in mind: It is not a dedicated web analytics tool like good old Adobe Analytics. Customer Journey Analytics is built to handle data from all sources you can think of, not just websites and apps. Because of that, Adobe Analytics still has some unique features, like the Visitor Profile, that are […]
Understanding and utilizing the Event Object in Adobe Launch
Adobe Launch provides a lot of comfort and clever features to make tag management super easy. Quite a few of those features, like the amazing Constant Data Element, are provided directly in the Launch UI. However, there are a bunch of nice surprises waiting for us once we start exploring the more backend-y technical capabilities. One place where we can catch a glimpse of what Launch has in store for us is in the configuration of Custom Code Actions from the Core Extension. All we have to do is hover our mouse over the Execute globally tooltip: Adobe points us towards three technical variables which we can use in our implementations. One of them is, in my experience, especially useful: The event object. But what is it and why should you use it? Let’s start exploring! What is the Event Object in Adobe Launch? As you are probably aware, […]
Creating Marketing Channel Stacking and Pathing Reports from Adobe Analytics with Power BI
Ever since my very first attendance, Adobe Summit has been my number-one source of inspiration for new things to try out in Adobe Analytics. When the world’s leading practitioners and product experts from Adobe come together to share their knowledge, there is a lot to learn for everyone. This year, Eric Matisoff invited me to share a visualization I created in Analysis Workspace as part of the Analytics Rockstars session. However, the true Rockstar content in that session was the Tips & Tricks shared by Jenn Kunz using Excel with the Flow viz in Workspace. A follow-up conversation on Measure Slack then unveiled some improvements using Data Warehouse and reminded me of an approach of my own that I want to share today. Some years back I used Adobe Analytics’ Data Feeds with Elasticsearch and Grafana to analyze marketing performance beyond what Adobe Analytics has to offer. While that was a […]
Cookie-less Server Side Tracking with Adobe Customer Journey Analytics
If there is one big hot topic in digital analytics right now (besides the unfortunate sunset of Google Analytics 3 and GDPR news), it quite possibly is the recent trend of what many call server side tracking. Currently, server side tracking is an obligatory agenda item at every analytics conference, and virtually every vendor of analytics or tag management systems is working on a way to serve the rising demand. However, while there is a lot of talk around the topic, there is no shared definition in our industry of what server side tracking actually is. Jim Gordon has assembled a nice overview of what people might mean when they talk about any of the underlying concepts. In my personal experience, people usually refer to a form of server side tag management, often using Google’s server side tag manager, that still uses some logic in the client’s browser. Adobe has […]
Why you should use the Constant Data Element more often in Adobe Launch
Adobe Experience Platform Data Collection Tags, better known as Adobe Launch, is the most prevalent Tag Management System for Adobe Experience Cloud customers. It is nicely integrated with the available Adobe Solutions and is offered completely for free to those customers. But while it is best enjoyed as fuel for other solutions, it is a beast of its own with some unique features and mechanics. I’ve already written a few posts about Adobe Launch’s particular features, like the ability to use call traces between rules or how rules can be executed exactly once. Today I want to focus on a specific Data Element, called the Constant Data Element Type. It was first created by Jan Exner as a dedicated Launch Extension but was (thankfully!) later built into the Core Extension. I’m going to show you three of my favorite use cases for this, in my opinion, shamefully overlooked gem. You […]
Exploring Adobe Launch Server Side (aka Adobe Experience Platform Data Collection Event Forwarding)
In digital analytics, there has been a trend lately to move data collection away from the client towards a server side implementation. In most cases, companies try to circumvent technical restrictions like Apple’s “Intelligent” Tracking Prevention, make collected data more consistent across analytics tools and marketing platforms, or hide their non-GDPR-compliant setups from their users. This trend also brings some (but not all) elements of tag management to the server side, as Jim Gordon described well. In most scenarios, data and events are collected from the client (like a website or app) using a tag manager. Instead of sending events directly to, for example, Adobe Analytics, Google Ads, Facebook, etc., from the browser, they are first sent to a common endpoint that collects, enriches, and forwards data to the desired tools. This common endpoint is usually referred to as a server side tag manager and is implemented in a first-party […]
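To make that collect-enrich-forward pattern a bit more tangible, here is a minimal Python sketch of such a first-party collection endpoint. This is not Adobe’s Event Forwarding product, just an illustration of the general idea; the route, field names, and destination URLs are all made up.

```python
# Minimal sketch of the collect-enrich-forward pattern described above.
# Not Adobe's Event Forwarding product; endpoints, fields, and destinations
# are hypothetical examples.
import time

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical downstream tools; real vendors have their own APIs and auth.
DESTINATIONS = [
    "https://analytics.example.com/collect",
    "https://ads.example.com/conversion",
]

@app.route("/collect", methods=["POST"])
def collect():
    event = request.get_json(force=True) or {}
    # Server-side enrichment: add fields the browser does not have to send.
    event["server_timestamp"] = int(time.time())
    event["client_ip"] = request.headers.get("X-Forwarded-For", request.remote_addr)
    # Fan the enriched event out to every configured destination.
    for url in DESTINATIONS:
        try:
            requests.post(url, json=event, timeout=2)
        except requests.RequestException:
            pass  # a production setup would queue and retry instead
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    app.run(port=8080)
```

Because such an endpoint runs on the company’s own (sub)domain, the browser only ever talks to a first-party server; the managed products discussed in this post provide the same pattern without you having to operate the server yourself.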
Should you really build an Adobe Launch Extension?
If you are using any solutions from the Adobe Experience Cloud family of tools, chances are you are also using Adobe’s Tag Management System, Adobe Launch. Launch works like many other Tag Management Systems (TMS), in that it can bring tools and code to a website without the need to change the source code of the website it is running on. This allows the users of the TMS, typically digital analytics or digital marketing teams, to be somewhat independent from IT and development teams when they want to bring technology, like onsite analytics or marketing pixels, to a page. In Adobe Launch, Extensions are used to bring the actual functionality to the websites that it is running on. A good example for this is the Core Extension: This Extension allows us to run JavaScript code on the website (using the Custom Code Data Element or Action), listen for events like […]
5 Awesome Adobe Analytics Classification Use Cases
Adobe Analytics has a rich set of features that help businesses collect and report on data in the most efficient ways. I’ve written a lot of posts already on how it can be configured to collect data in better ways or how the interfaces can be used to their full extent. Today I want to show you some use cases for one of my favorite features, as it can help you in every stage of an analytics project: Dimension Classifications. Using Classifications, at least in my opinion, is a big step towards making the best use of Adobe Analytics. It can help simplify implementations, manage resources (like the number of used dimensions), surprise users with great new features, or even help you correct errors in data collection retroactively. But before I start with the use cases, let’s make sure we are on the same page by asking… What are Classifications in […]
Import Google Analytics data into Adobe Analytics using Data Sources
On one hand, Adobe Analytics remains my favorite web analytics tool on the market. The longer I use it, the more I appreciate all the well thought-out features, from data collection to processing, storage, and analysis. Those features are even more impressive when compared with what Google Analytics has to offer. And yet, on the other hand, even I can’t avoid having to work with Google Analytics in some way or another. In a large, global company, it is basically unavoidable to find Google Analytics on some small, long forgotten marketing landing page in some market. It gets even worse: Up until last year, I personally had to maintain an inherited Google Analytics instance on a legacy website and app. What a cruel world! Besides those cases, where someone in your company actually wants to use Google Analytics, there are also more forgivable cases. For example, a company may be […]
Keep track of goals using the Linearity Indicator in Adobe’s Analysis Workspace
There is a universal truth in life: Inspiration always strikes when and where you least expect it. The same happened to me the other day, when I was reading High Output Management by former Intel CEO Andrew Grove. While the book is definitely worth reading for anyone interested in management, analysts can benefit just as much from reading it to get inspiration for valuable performance indicators and visualizations. Quite early in the book, Grove presents one of his favorite visualizations to track progress towards specific goals: The linearity indicator. This chart shows the current progress towards a set target and where the performance might be heading. Here is his example for a hiring target from the book: My initial reaction was “wow, this is super cool and simple to understand”. If the current progress is above the linear progress, we’re in good shape to reach our goals. If it is […]
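In formula form (my own notation, not Grove’s), the comparison behind the linearity indicator boils down to two numbers per day:

```latex
\text{linear pace}(d) = T \cdot \frac{d}{D}
\qquad\qquad
\text{actual}(d) = \sum_{i=1}^{d} x_i
```

where T is the target for the period, D is the number of days (or weeks) in that period, and the x_i are the daily results. Whenever actual(d) sits above the linear pace, we are ahead of schedule; if it falls below, we need to catch up.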
Monitor Adobe Analytics usage for free with Power BI and Python
Adobe Analytics is, without a doubt, the most complete and feature-rich product for Web Analytics and Reporting on the market. I won’t even try to list all its features, since I would definitely forget some or would have to update the list in a few months as new functionality is released. And while I, as an analyst and power user, love to have all those great tools available, they create a challenge for me in my other role as an analytics admin. All of those features bring complexity to the everyday work of our business users. For example, when Analysis Workspace was released in 2016, it meant that users had to learn a new interface to get the most value out of Adobe Analytics. But as an admin who knows their users, I have a strong feeling that some people still use the old Reports & Analytics interface in 2021. […]
Web Analytics with Adobe’s Customer Journey Analytics, Part 8: A new home
This post is the eighth and last post of the eight-part series Web Analytics with Adobe’s Customer Journey Analytics, showing how web sites can be analyzed better using Adobe’s next evolution of Adobe Analytics. In the previous post, we created the connection from Experience Platform to Customer Journey Analytics. In this post, we are going to take a look at our web analytics data and explore some use cases. Believe it or not, this series of posts is almost finished! Starting with nothing, we have created a sophisticated schema for our data in Experience Platform, created a tracking implementation using the new Web SDK, enriched our data in Query Service, and pulled all that data into Customer Journey Analytics. If you have been following since the start of the series, I want to say: Thank you, and I hope you enjoyed the ride! Now it is time for the finale, where […]
Web Analytics with Adobe’s Customer Journey Analytics, Part 7: Customer Journey Analytics Backend Configuration
This post is the seventh post of the eight-part series Web Analytics with Adobe’s Customer Journey Analytics, showing how web sites can be analyzed better using Adobe’s next evolution of Adobe Analytics. In the previous post, we enriched our basic web analytics data with some advanced fields in Query Service. In this post, we are creating the connection from Customer Journey Analytics to Experience Platform. At this point in the series, we have a world-class dataset of web analytics data in Experience Platform, ready to be analyzed. I’m personally very proud of the things we were able to achieve in Query Service, especially with the pathing dimensions. With all of that, we have even more than what normal Adobe Analytics would give us! With all the data enriched, we now have only one step left before we can start analyzing our digital users’ behavior. First, we need to pull data […]
Visualizing Adobe Analytics Report Suites for free with Python and Power BI
Adobe Analytics is super flexible in the way it can be set up to exactly match all requirements of business users, analysts, and developers. A crucial part of any implementation is the creation and configuration of the Report Suites, which can be seen as the backend databases of Adobe Analytics that hold the events sent to it. In theory (and practice in some setups), each and every Report Suite can have a completely individual set of variables and metrics. However, having the option to create an individual configuration of dimensions and events for each Report Suite comes with a hefty long-term cost. For example, each and every setup needs to be implemented in Adobe Launch, where the on-page data layer needs to be matched to the dimensions and metrics of the Report Suite. If every Report Suite is configured differently, a lot of work needs to be put […]
Web Analytics with Adobe’s Customer Journey Analytics, Part 6: Advanced Data Processing in Query Service
This post is the sixth post of the eight-part series Web Analytics with Adobe’s Customer Journey Analytics, showing how web sites can be analyzed better using Adobe’s next evolution of Adobe Analytics. In the previous post, we took a look at processing some basic data we need for our web analytics use case utilizing Query Service in Experience Platform. In this post, we are adding some advanced fields to our data in Query Service. I think it’s fair to say that even with just the information from the previous part, we could have a very useful web analytics tool already. But if you know me, you know that I like to take things to the next level wherever I can, especially if it involves writing code. And is SQL not some sort of code too? Entry and exit page were a nice start last time, but we have some fields still […]
Web Analytics with Adobe’s Customer Journey Analytics, Part 5: Basic Data Processing in Query Service
This post is the fifth post of the eight-part series Web Analytics with Adobe’s Customer Journey Analytics, showing how web sites can be analyzed better using Adobe’s next evolution of Adobe Analytics. In the previous post, we took a look at doing the implementation using Adobe Launch, the Adobe Web SDK, and the Client Data Layer. In this post, we are going to process some basic data we need for our web analytics use case utilizing Query Service in Experience Platform. This series of posts is coming along quite nicely. If you followed all the previous posts until now, you will have a functioning Web SDK implementation that tracks your data into Experience Platform following the Experience Data Schema we have tailor-made for our use case. Nice! Now we are ready to feed our data into Customer Journey Analytics, right? Well, we could. If we are just interested in the plain […]
Web Analytics with Adobe’s Customer Journey Analytics, Part 4: Capturing Data with Web SDK (Alloy)
This post is the fourth post of the eight-part series Web Analytics with Adobe’s Customer Journey Analytics, showing how web sites can be analyzed better using Adobe’s next evolution of Adobe Analytics. In the previous post, we took a look at our business questions and how we can structure our data most effectively. In this post, we are doing the actual implementation using Adobe Launch, the Adobe Web SDK, and the Client Data Layer. On our way to creating a full-scope, front-to-back implementation of Customer Journey Analytics to track a web site, we are now ready to think about our actual implementation. Since we have the data structure in place and already have an awesome Experience Event Schema, we just need some actual data. The logical choice to feed data to the Adobe stack is, of course, to utilize their client-side tools as well. Specifically, we are going to use Adobe Launch […]
Web Analytics with Adobe’s Customer Journey Analytics, Part 3: Data Structure in Experience Platform
This post is the third post of the eight-part series Web Analytics with Adobe’s Customer Journey Analytics, showing how web sites can be analyzed better using Adobe’s next evolution of Adobe Analytics. In the previous post, we took a look at the different possible solution architectures we can use to bring data into Customer Journey Analytics and decided on the best one. In this post, we will take a look at our actual business questions and how we can structure our data most effectively. From the last post we already know that we want to track data using only the new Adobe Web SDK going forward. To make that work, we need to create a schema in Experience Platform first, which defines the structure of the data that we want to capture. While some people (sometimes myself included) see schema management as one of the more tedious tasks in Platform, I […]
Web Analytics with Adobe’s Customer Journey Analytics, Part 2: System Architecture in Experience Platform
This post is the second post of the eight-part series Web Analytics with Adobe’s Customer Journey Analytics, showing how web sites can be analyzed better using Adobe’s next evolution of Adobe Analytics. In the previous post, we discussed the motivation and scope of this project and why, eventually, existing Adobe Analytics customers will start moving to Adobe’s Customer Journey Analytics. In this post, we will take a look at the different possible solution designs we can use to bring data into Customer Journey Analytics and decide on the best one. Adobe’s Customer Journey Analytics is built on Adobe’s brand new Experience Platform. With that, it is very flexible in terms of how data can be brought into the tool. Depending on the setup, it may seem very easy to bring data in quickly. However, all that flexibility also means we have many ways to deviate from the ideal path, so we […]
Execute a rule only once in Adobe Launch
When you are managing an implementation of Adobe tools like Analytics, chances are you are using Adobe Experience Platform Data Collection, formerly known as Adobe Launch, Adobe Dynamic Tag Management, or the Adobe Tag Manager (and no, this is not an SEO text, it’s the actual list of names. I will still call it Launch for the near future) to implement other tags as well. While Launch is great for making the implementation of Adobe’s own tools very fast and easy, managing other tools is not always so straightforward. A common requirement for those 3rd Party tags is to fire a certain tag or pixel only once. What once really means (once per session, user, day, year?) might differ from tag to tag, so as a result it can be surprisingly difficult to fulfill those requirements reliably and consistently. On top of those varying definitions of once, Adobe Launch has […]
Web Analytics with Adobe’s Customer Journey Analytics, Part 1: Goodbye Adobe Analytics, my Old Friend
This post is the first post of the eight-part series Web Analytics with Adobe’s Customer Journey Analytics, showing how web sites can be analyzed better using Adobe’s next evolution of Adobe Analytics. In this part, we discuss the motivation and scope of this project and why, eventually, both existing Adobe Analytics customers and new customers will start moving to Adobe’s Customer Journey Analytics. If you found this article, chances are high you work in or adjacent to the field of digital analytics or web analytics. It doesn’t really matter if you are an existing Adobe Analytics user, on the Google stack, or just looking for your very first web analytics tool. If you have been following the trends and discussions in our industry recently, you will likely already have caught on to the massive changes that both our industry and Adobe’s products are going through. With changes to privacy requirements and cookie […]
The Visitor Profile: Adobe Analytics’ Big Advantage
We live in some very exciting times for our industry. There is a lot going on in the analytics space with Adobe’s brand new Experience Platform, Customer Journey Analytics, Web SDK, and Launch Server Side. All of those innovations will fundamentally change how we track data and how we process it once it has been collected. But since I got the opportunity to try out most of those exciting things myself, people often ask me why I still love Adobe Analytics as much as I do. My answer to that usually covers multiple areas. For example, I like how the AppMeasurement library in Launch helps me to collect data efficiently. Analytics’ Processing Rules and Marketing Channels are another great tool to enrich our events after collection. In a previous post, I already explained why I love prop-type dimensions quite a lot, since they can provide us with metadata on […]
Building the ultimate auto tracking implementation with Adobe Experience Platform and Web SDK
I think it’s no secret that a lot of companies, agencies, and analysts dread the amount of effort it takes to implement a sophisticated analytics tool like Adobe Analytics. That may come in part from the correlation between company size (and thereby business complexity) and choice of analytics tool, but it is quite clear that implementing Adobe Analytics in a way that fully utilizes both its countless features and everything that can be collected from a page is a challenge for even the most experienced specialists. This is where other tools like Google Analytics or smaller solutions like Matomo have their place. If your use case and business situation are right, they may be a quicker solution for you. The simplicity is quite tempting but would not be enough for larger businesses. That leads to a funny situation when people from agencies or small companies join a large corporate […]
Using Flow and Fallout Visualizations like a Rockstar in Adobe Analytics
It’s no secret: I love Analysis Workspace. In fact, I think it is the main advantage Adobe Analytics has over Google Analytics. That is because Workspace allows for seamless collaboration between analysts, marketeers, product owners, and other business stakeholders. With enough enablement, there is no difference in which tools different groups of analytics users would use: It’s always the best one! Workspace is the perfect combination of sophisticated functionality and an appealing user interface. But because of this user-friendly interface, not every advanced function or use case is immediately apparent to every user. This can lead to funny situations, where experienced analysts never really use certain parts of Workspace that could save them a lot of work. In today’s post we will take a close look at two of the most undervalued features: The Flow and Fallout visualizations. While they seem quite similar in functionality and trivial to understand on […]
Announcing the open collection of Adobe Analytics best practices
Imagine a situation like this: You are facing a new challenge when using or implementing Adobe Analytics. What do you do? If you are like me, you first check out the documentation to make sure you’ve understood the available features correctly. Then, you start researching blog posts and articles around your topic to see if and how anyone has solved this before. If you are still unsure, you might ask some people on Twitter, LinkedIn, or the Measure Chat. As a last resort, you might even reach out to Client Care and ask for help. It’s easy to see why this approach is not ideal. First, it’s not easy to know if the way you approach a task is still the best way or if new solutions exist. Depending on which pages you found when researching, you might end up with an outdated solution or contradicting approaches by different authors […]
(Time-)Normalize Performance over time in Adobe Analytics’s Analysis Workspace
In Digital Analytics, one of the most common requests from business stakeholders is to compare the performance of two or more items on our websites, like marketing campaigns or content pages. While it is immediately obvious why this comparison is important to the business, it quite often leads to graphs like this, where the analyst tries to visualize performance over time: This solution is technically correct but makes it hard to really compare how the two pages perform against each other. They went public on different dates, and while Page A is rather stable in terms of traffic, Page B got a boost around the middle of its time online. So, how do we make this simpler? When enjoying my free time between jobs, I caught up on some older videos from the Superweek Analytics Summit’s YouTube Channel. In 2019, Tim Wilson demonstrated how to align dates […]
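To illustrate the underlying idea outside of Analysis Workspace, here is a tiny Python sketch that shifts each page’s daily series so that day 0 is its first day online. The page names and numbers are invented; the point is only the re-indexing step.

```python
# Time-normalization sketch: instead of plotting pages on calendar dates,
# re-index each series so that day 0 is the page's first day with traffic.
# Data is made up for illustration.
page_a = {"2023-01-01": 120, "2023-01-02": 130, "2023-01-03": 125}
page_b = {"2023-02-10": 40, "2023-02-11": 90, "2023-02-12": 150}

def normalize(series):
    # Sort by calendar date, then number the values as day 0, 1, 2, ...
    ordered = [value for _date, value in sorted(series.items())]
    return dict(enumerate(ordered))

print(normalize(page_a))  # {0: 120, 1: 130, 2: 125}
print(normalize(page_b))  # {0: 40, 1: 90, 2: 150}
```

Plotted over this shared day index, both pages start at day 0 and their growth curves become directly comparable.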
Please, stop comparing Adobe Analytics to Google Analytics
This post is going to be a deviation from the “normal” content on this blog. Its purpose is to address one of the questions I received most often from a lot of people reading my posts. The title might already give away what that question is: “Frederik, in your opinion, should companies buy Adobe Analytics or Google Analytics?” And I think there is something fundamentally wrong with this question. I think the above question can only be answered through some absurd level of generalization that does not do justice to either tool. There are some agencies or consultants who end up doing this comparison to either appear neutral and independent, or drive SEO traffic to their own sites. This annoyed me to a point where I started writing this post to have my personal answer ready at hand in the future. Bear with me on this one. To be able […]
Cool Approximate Count Distinct Use Cases – Adobe Analytics Tips
One of the things that really sets Adobe Analytics apart from other solutions is the ability to create sophisticated Calculated Metrics and Segments on the fly. You don’t need to be a highly trained Analyst or Data Scientist to create your very own set of Measures and Dimensions unique to your business question. The best thing for me personally is that we can create those metrics from the same interface where we do our day-to-day analysis and reporting. Whether we want to quickly create an average or build advanced time series analysis dashboards, it’s all right there at our fingertips. Today I want to tell you about one of my personal favorite functions, called Approximate Count Distinct. This functionality allows us to count how many different values from a dimension we tracked and use that number in both Calculated Metrics and Segments (making this function the closest we […]
Privacy-centered Analytics with Matomo and Adobe’s Customer Journey Analytics
Legal Disclaimer: Data Privacy is a diverse and ever-changing topic. This makes it nearly impossible to give reliable recommendations to a broad audience. Please consult your company’s legal department on whether the ideas described here are feasible under your jurisdiction. If there has been one predominant topic in the web analytics space for the last couple of years, it surely is data privacy. GDPR is a thing in Europe, COPPA in the US, ITP on planet Apple, and cookie consent banners on every website. Conducting a safe data collection practice as a global business has become more and more challenging, pushing businesses to be more and more careful. Because of this landscape, a lot of businesses are looking for a “bullet-proof” way to analyze website users’ behavior. While Google Analytics is a data privacy nightmare, tools like Matomo (formerly Piwik) try to justify their existence by claiming to be more privacy […]
Why I still love Props – Confessions of an Analyst
Experts are supposed to know everything about a certain topic. And not only are they supposed to know things, they are also expected to behave in the best way possible. Their past decisions are the benchmark for how to assess future situations and judge what to do. But all experts have their little secrets, where they deviate from the gold standard and do something that is outdated, unpopular, or straight-out embarrassing. This post is about one of the things that I still do today and seldom talk about because it became unpopular a while ago. So here it is: I still use Props in all my Adobe Analytics implementations. But not only do I use them, I secretly love them! Both Adobe themselves and veterans like Jan “Props must die” Exner advise against using them any more. So this is my confession to the world […]
Retention Analysis in Adobe Analytics – Part 2: Custom Segments and Metrics
User Retention is crucial to any digital offering. If you optimize your offering to a point where users come back on their own, you can not only save on marketing cost but also engage your existing users more. This makes retention analysis a prime example of how digital analytics can provide tangible business value. In the previous post, we used Cohort Tables and some built-in features of Adobe Analytics to analyze User Retention. But there is a lot more Adobe Analytics has to offer once we start using Segments and Calculated Metrics. In this post we are going to build our very own Segments to see how many of our Users we are able to retain. Based on those Segments we will then define some Calculated Metrics to make our lives even easier. I’ve also put the results on the Open Adobe Analytics Components Repository. Let’s start building! Simple User […]
Retention Analysis in Adobe Analytics – Part 1: Cohort Tables and Builtin Functionality
User Retention is crucial to any digital offering. If you optimize your offering to a point where users come back on their own, you can not only save on marketing cost but also engage your existing users more. This makes retention analysis a prime example of how digital analytics can provide tangible business value. In this post, we are going to take a look at how we can analyze user retention with the most advanced digital analytics tool, Adobe Analytics. We are going to start in this post with the built-in analytics dimensions and metrics, then take a look at cohort tables, and in the next post even build our own Segments and Calculated Metrics to help us understand retention. Let’s get started! Built-in Retention Metrics and Dimension To start things off, we will take a quick look at what Adobe Analytics has to offer out-of-the-box to help us understand […]
Time Series Analysis through Moving Averages – Statistics in Adobe Analytics
In what has become one of the most-read series on this blog, I am showing some examples of what Adobe Analytics has to offer in terms of statistical analysis. In the previous posts we took a look at simple averages and standard deviations, regression analysis, and even forecasting. In this post we are going to use a variation of the simple mean called the moving average. When dealing with time series data we might encounter what is called “noisy data”. Instead of showing as a steady line, our KPIs might go up and down from day to day, making it hard for us to judge where the general trend is headed. One way of solving this is through the regression modeling we did before, which gives us a straight approximation line. But what we can also do is average the data for a defined window along our series, which is […]
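As a quick reference, the trailing moving average over a window of k days can be written as

```latex
\mathrm{MA}_k(t) = \frac{1}{k} \sum_{i=0}^{k-1} x_{t-i}
```

so each day t is replaced by the average of itself and the k − 1 days before it. Larger windows smooth more of the day-to-day noise but also react more slowly to real changes in the trend.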
Summary Report Suites in Adobe Analytics
Ever since Adobe released Analysis Workspace in 2016, customers were looking for a way to combine data from multiple Report Suites within a single Project in Analysis Workspace. Earlier in 2020 we were finally able to pull data from more than one Report Suite in Workspace, since Report Suites could now be selected on a Panel level instead of for the whole project. While this feature is awesome and a huge improvement, users like myself still wanted to combine multiple Report Suites in only one Freeform Table within the same Panel. Unfortunately, we still have to invest some work to get a view like this, where the right Table combines data from the two Report Suites on the left: The screenshot above was taken from a Report Suite which was created to summarize data from other Report Suites. This can be done without any changes to the implementation or additional […]
Advanced Time Series Analysis through Linear Regression – Statistics in Adobe Analytics
Previously in this little series, we took a look at how we can describe our trended data by using the statistical Mean and Standard Deviation. While this works quite well for data that doesn’t change much over time, it is rather limited when it comes to taking trends into account. With this post, we are doing something about that issue by using Linear Regression techniques. At the end of this post, you will get an Analysis Workspace project like below, where we can judge trends in data and see changes over time: Let’s get our hands dirty! Limitations of Mean and Standard Deviation Before we start, I want to explain the problem outlined above a bit better. Please consider the following graph I generated with the Workspace from the previous post and some demo data: What we see is a clear trend in our data, since our daily Unique Visitors are […]
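As a refresher on the math behind such a trend line: simple linear regression fits a straight line by least squares,

```latex
\hat{y} = a + b\,x, \qquad
b = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}, \qquad
a = \bar{y} - b\,\bar{x}
```

with x as the day index and y as the metric (for example daily Unique Visitors). The sign and size of the slope b tell us whether the trend is heading up or down and how quickly, which is exactly what the Mean alone cannot express.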
Simple Time Series Analysis through Standard Deviation – Statistics in Adobe Analytics
In my last post, we took a look at how Descriptive Statistical Analysis can help us understand our site performance using the simple Mean. I introduced the concept of conditional counters to help us identify our top- and bottom-performing sites. Today we are going to extend our knowledge of descriptive statistical methods by using Standard Deviation on trended data and applying conditional counters to it as well, but with a new spin. If conditional counters are new to you, it might help to check out that last post! Like last time, we are setting ourselves a goal for this post. At the end, we want to have a nice workspace to help us understand our trended data better. We need a way to judge if the fluctuation in our data is within an expected range and how often it is not. This is what we are going to build: Let’s […]
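As a quick reminder of the statistic involved: for n daily values with mean x̄, the standard deviation is

```latex
\sigma = \sqrt{\frac{1}{n}\sum_{i=1}^{n} \left(x_i - \bar{x}\right)^2}
```

(or with n − 1 in the denominator for the sample version). One simple way to define the “expected range” is the band x̄ ± k·σ, often with k = 1 or 2; the conditional counters then count how many days fall outside that band.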
Simple Mean and Conditional Counters – Statistics in Adobe Analytics
In my last post, we took a look at how we can predict the future through Regression Analysis with Adobe Analytics and visualize it in Analysis Workspace. While that was quite an advanced post, there are a lot of things we can do using basic statistical analysis. This is what we are going to look at in this post, exploring some ways to describe our data in a standardized way. At the end of this post, we want to describe our relative page performance for a website like this, showing us top- and low-performing pages and how many there are of both: Describing ranked website performance relative to the Mean This first part will show how we can level up our ranked reports. Let’s pretend we want to judge how certain pages on our website are performing. To do this, we might start with a simple table containing our Page […]
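To make the idea of a conditional counter concrete outside of Adobe Analytics, here is a small plain-Python sketch with invented page data: compare each page against the mean and count how many land above and below it.

```python
# Conditional counter sketch: classify pages relative to the mean of a metric.
# Page names and view counts are made up for illustration.
page_views = {"home": 5400, "pricing": 2100, "blog": 900, "careers": 300, "imprint": 80}

mean = sum(page_views.values()) / len(page_views)

top_performers = sum(1 for views in page_views.values() if views > mean)
low_performers = sum(1 for views in page_views.values() if views <= mean)

print(f"mean: {mean:.0f}, top performers: {top_performers}, low performers: {low_performers}")
```

In Adobe Analytics the same logic would live in a Calculated Metric rather than a script, but the principle is the same: a condition evaluated per row that yields a 1 or a 0, which can then be summed.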
Predictive Regression Analysis – Statistics in Adobe Analytics
Adobe Analytics is awesome for analyzing historical data. Besides Segments, Drilldowns, or Derived Metrics, it also offers some advanced statistical functions like Regression Analysis. Here are some examples of the different regression models that are available today: It would be really cool if we could use this functionality to predict the future with some regression models! This is what this article is going to describe by using advanced calculated metrics. In the end, we want to have a graph like this, with the historical and future data in the same visualization: We will go through the whole process of generating a metric like the one shown above. If you just want the result, you can scroll down to the bottom of this article, where I show the complete metric. Let’s start! Statistics 101: Simple Linear Regression in Adobe Analytics To start things off, let’s remind ourselves what regression analysis does. To keep […]
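The underlying model is the familiar straight line,

```latex
\hat{y}(x) = a + b\,x
```

where x is the day index and y is the metric we care about. Once a (intercept) and b (slope) have been estimated from the historical data, plugging in day indices beyond today yields the predicted values that make up the future part of the combined visualization.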
So, is Web Analytics your dream job?
I’ve been working in Web Analytics for over a decade. During that time I had the pleasure of meeting a lot of people: Analysts, product owners, marketeers, architects, developers, and so on. I hired a bunch of them, applied to others myself, onboarded and trained a whole lot over the years. No matter who I’ve been talking to, sooner or later, one type of question would always come up: Will this be fun? Am I going to be okay? There are a lot of articles out there focused on the skills needed to start with Web Analytics. As always, Google can help you find those (or go to Julien’s Blog if you want a recommendation). There are also some talking about the necessary mindset. With this one, I will try to give you an impression of the qualities I observed while talking to Web Analysts who love what they do. […]
Generating more business value with the Adobe Analytics dashboards App
The Adobe Analytics dashboards App has been out for some days now. It has been one of the most demanded features among Analytics users for years. Personally, I had to disappoint my business users for quite some time whenever they asked for an App. So naturally, I was quite happy when it finally came out. Before the app arrived, we had to build workarounds to enable people to take Analytics data wherever they go. At my company we utilized Power BI to pull data from Analytics and offer it in some form of mobile app. That was a huge pain, since we had to rebuild things we already had in Analysis Workspace and maintain two products. We also had to make huge compromises regarding interactivity with data and visualizations. I’m very happy we don’t need to do that any more! One of the concerns I had before I gained Beta […]
Trying out the new Adobe Analytics App
Adobe Analytics still is the most complete solution for Digital Analytics. But for years, there has been one thing missing: A mature way to use dashboards on the go, without using your computer. While Analytics is usable on mobile browsers on a technical level, it is not the best user experience for either Analysts or Business Users. This is why a real Mobile App has been one of the most requested features over the years. And guess what: Adobe just released one! Who this App is made for There is one important thing to know about this new App before diving into the features and interface. Let’s ask ourselves first who the target audience for this app is, because it most likely is not primarily made for you if you are an Analyst. It is not made to offer the same feature set that Analysis Workspace offers and I’m […]
Building an Enterprise Grade OpenSource Web Analytics System – Part 7: Analytics Dashboard
This is the seventh part of a seven-part series explaining how to build an Enterprise Grade OpenSource Web Analytics System. In this post we are building an Analytics Dashboard in Kibana for our data in Elasticsearch. In the previous post we built the connection from Kafka to Elasticsearch and Clickhouse to store the data. If you are new to this series it might help to start with the first post. We have come a long way in this series. We built everything from the client implementation with Snowplow to the processing and enrichment pipelines with Kafka and Python and stored all the data in Elasticsearch. Now it is time to make that data accessible in an appealing way to analysts and business users. The obvious solution for Elasticsearch is Kibana, which is developed by the same company and is designed to work perfectly with Elasticsearch! Web Analytics Dashboard in Kibana In Kibana, […]
Building an Enterprise Grade OpenSource Web Analytics System – Part 6: Data Storage
This is the sixth part of a seven-part series explaining how to build an Enterprise Grade OpenSource Web Analytics System. In this post we are taking a brief look at what we can do with the data we collected and processed with Clickhouse. In the previous post we built a persisted visitor profile for our visitors with Python and Redis. If you are new to this series it might help to start with the first post. During this series we defined multiple topics within Kafka. Now we have different levels of processing and persistence available. If we want to keep any of it, we should put it in persistent storage like a Data Lake with Hadoop or a database. For this project, we are using Elasticsearch and dipping our toes into a database called Clickhouse for fun! Feeding Data into Elasticsearch From the previous part, we have a nice Kafka […]
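To give a rough idea of what the Kafka-to-Elasticsearch step can look like in Python, here is a minimal consumer sketch. The topic name, index naming scheme, hosts, and the date field are assumptions for illustration; the series itself may well use different tooling (such as Kafka Connect) for this step.

```python
# Sketch: read processed events from a Kafka topic and index them in Elasticsearch.
# Topic name, index naming, hosts, and fields are hypothetical.
import json

from elasticsearch import Elasticsearch  # pip install elasticsearch
from kafka import KafkaConsumer          # pip install kafka-python

consumer = KafkaConsumer(
    "tracking-processed",  # hypothetical topic holding the processed events
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
es = Elasticsearch("http://localhost:9200")

for message in consumer:
    event = message.value
    # One document per event; daily indices keep index sizes manageable.
    index_name = "webanalytics-" + event.get("date", "unknown")
    es.index(index=index_name, document=event)
```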
Building an Enterprise Grade OpenSource Web Analytics System – Part 5: Visitor Profile
This is the fifth part of a seven-part series explaining how to build an Enterprise Grade OpenSource Web Analytics System. In this post we are going to build a visitor profile with Python and Redis to persist some of the data we track. In the last post we processed the raw data using Python and wrote it back to Kafka. If you are new to this series it might help to start with the first post. Now that we have a nice processed version of our events, we want to remember certain things about our users. To do this, we are going to create a Visitor Profile in Redis as high-performance storage. The process for persisting values will look like this: Building our Visitor Profile First things first in this part: we are setting up a little helper script that will take our processed tracking events and flatten them. It looks […]
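Here is a minimal Python sketch of those two steps, flattening a nested event and updating a per-visitor hash in Redis. The field names and the profile layout are invented for illustration; the actual helper script in the post may look different.

```python
# Sketch: flatten a nested tracking event, then update a visitor profile hash in Redis.
# Field names and profile layout are hypothetical.
import redis  # pip install redis

def flatten(event, parent_key="", sep="."):
    # Turn {"page": {"name": "home"}} into {"page.name": "home"}.
    items = {}
    for key, value in event.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

event = {"visitor_id": "abc-123", "page": {"name": "home"}, "timestamp": 1700000000}
flat = flatten(event)

# One Redis hash per visitor acts as the profile: counters plus "last seen" fields.
profile_key = f"profile:{flat['visitor_id']}"
r.hincrby(profile_key, "page_views", 1)
r.hset(profile_key, mapping={"last_page": flat["page.name"], "last_seen": flat["timestamp"]})
```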
Building an Enterprise Grade OpenSource Web Analytics System – Part 4: Data Processing
This is the fourth part of a seven-part series explaining how to build an Enterprise Grade OpenSource Web Analytics System. In this post we are building the processing layer to work with our raw log lines. In the last post we used Nginx and Filebeat to write our tracking events to Kafka. If you are new to this series it might help to start with the first post. At this point in the series, we have a lot of raw tracking events in our Kafka topic. We could already use this topic to store the raw log lines in our Hadoop cluster or a database. But it will make our life much easier later on if we do some additional processing first. Since Python is the data science language today, we will be using that language. The result will then be written to another Kafka topic for further processing […]
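A minimal sketch of that processing step in Python could look like the following: consume the raw lines from one Kafka topic, pull out the fields we care about, and publish the result to a second topic. The topic names and the assumption that each raw line is simply the request URL of the tracking call are made up for illustration.

```python
# Sketch: read raw tracking requests from Kafka, parse them, write processed events back.
# Topic names and the parsing logic are hypothetical.
import json
from urllib.parse import parse_qs, urlparse

from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

consumer = KafkaConsumer("tracking-raw", bootstrap_servers="localhost:9092")
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)

for message in consumer:
    raw_line = message.value.decode("utf-8")
    # Assume the raw line is the request URL of the tracking call; turn its
    # query parameters into a flat event dictionary.
    query = parse_qs(urlparse(raw_line).query)
    event = {key: values[0] for key, values in query.items()}
    producer.send("tracking-processed", event)
```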
Building an Enterprise Grade OpenSource Web Analytics System – Part 3: Data Collection
This is the third part of a seven-part series explaining how to build an Enterprise Grade OpenSource Web Analytics System. In this post we are setting up the tracking backend with Nginx and Filebeat. In the last post we took care of the client side implementation of Snowplow Analytics. If you are new to this series it might help to start with the first post. Now that we have a lot of data being sent from our clients, we need to build a backend to take care of all the events we want to collect. Since we are sending our requests unencoded via GET, we can just configure our web server to write all requests to a logfile and send them off to the processing layer. Configuring Nginx with Filebeat In our last project we used a configuration just like the one we need. As web server, we used and will […]
Building an Enterprise Grade OpenSource Web Analytics System – Part 2: Client Tracking
This is the second part of a seven-part series explaining how to build an Enterprise Grade OpenSource Web Analytics System. In this post we are setting up the Client Tracking using the JavaScript tracker from Snowplow Analytics. In the last post we took a look at the system architecture that we are going to build. If you are new to this series it might help to start with the first post. When building a mature Web Analytics system yourself, the first step is to build some functionality into your app or website to enable sending events to the backend analytics system. This is called client side tracking, since we rely on the application to send us events instead of looking at logfiles alone. For this series we are going to look at website tracking specifically, but the same principles apply to mobile apps or even server side tracking. Almost every mature […]
Building an Enterprise Grade OpenSource Web Analytics System – Part 1: Architecture
Some time ago I wrote a little series on how to amp up your log analytics activities. Ever since then I wanted to start another project building a fully fledged Analytics system with client side tracking and unlimited scalability out of OpenSource components. This is what this series is about, since I had some time to kill during Easter in isolation. This time, we will be using a tracker on the browser or mobile app of our users instead of logfiles alone, which is called client side tracking. That will give us a lot more information about our visitors and allow for some cool new use cases. It is also similar to how tools like Adobe Analytics or Google Analytics work. The data we collect then has to be processed and stored for analysis and future use. As a client side tracker, we will be using the Snowplow tracker. […]
Analysis Workspace Hacks (AGE) – Metric Targets
This is a post in the Adam-Greco-Edition (AGE) series of posts. They aim to iterate on some great posts by Adam Greco, showing some different approaches to achieve similar things. In another great post, Adam Greco showed how we can have Metric Targets in Analysis Workspace. His approach includes setting up a Data Source to import Goals to a Custom Event. This is a very nice approach, but has some serious limitations. Because it utilizes Data Sources, all their limitations apply (see documentation). Most importantly, data cannot be deleted or changed once it has been imported. Also, we need to sacrifice a Custom Event for every Goal we set. The setup is also very involved and not suited for non-techie people. What I would like to have is a Goal Metric that does not use valuable Custom Events, is changeable over time, and is understandable and usable by non-technical users. As […]
Analysis Workspace Hacks (AGE) – Average Daily Unique Visitors
This is a post in the Adam-Greco-Edition (AGE) series of posts. They aim to iterate on some great posts by Adam Greco, showing some different approaches to achieve similar things. In one of his posts Adam Greco shows a way to replicate the Daily Unique Visitors Metric from Reports & Analytics in Analysis Workspace. His approach involves creating a Calculated Metric for a given time range, summing up the Visitors for each day. There are some limitations to that approach. The obvious one is that we need a new metric for each date range we want to analyze; we can’t use a 7-day Metric if there are 8 days to analyze. Second, Visitors are not deduplicated but summed up over all days in the reporting window (just as in the old interface); so a Visitor visiting our site three times would be counted as three Visitors. Last, the name could […]
Analysis Workspace Hacks – Next and Previous Page Report
Analysis Workspace is the most capable solution for Web Analysts today. It allows us to switch between building a Dashboard or old-school Report or something in the middle on the fly. It has surpassed the old Reports & Analytics Interface in functionality and workflow effectiveness and leaves you longing for it once you start using different solutions. But there is one thing that is not that awesome in Analysis Workspace yet: Pathing. Once you activate Pathing for a custom prop, the old interface gives you Next and Previous Reports for that prop, just like with the Page Dimension: As a result we get a nice table with the Next or Previous Dimension Items for a given Item. Hacking Analysis Workspace’s Flow Visualizations The closest thing to that functionality is the Flow Visualization in Analysis Workspace. It allows us to see a Flow of Users between Dimension Items or even across […]
Analysis Workspace Hacks – Link Events on Page Reports
Adobe Analytics gives us two types of events to use for our tracking implementation. With Page Tracking (calling s.t() in Websites or trackState() in Apps) we are supposed to measure when a page has been viewed. If we want to measure interactions on a given page, we would use Custom Link Tracking (s.tl() in Web and trackAction() in Apps) for that. The reasoning behind that is quite simple. If there was only one function, we would either end up with increased Page Views for every on-page event or have to take care of the distinction ourselves by using valuable props or eVars. So from a simplicity standpoint this approach makes a lot of sense. But there is one problem: When using Custom Link Tracking, you cannot set a pageName for that call. Adobe Analytics just ignores whatever you set for the pageName, because pageNames only make sense in the […]
Building your own Web Analytics from Log Files – Part 6: Conclusion
This is the sixth part of the six-part series “Building your own Web Analytics from Log Files”. In this series we built a rather sophisticated logging and tracking functionality for our website. We used OpenResty to identify and fingerprint our users via cookies, stored that information in log files, which were shipped to Elasticsearch and visualized with Kibana. Web Analytics democratized By using those techniques, we are able to use what we already have (log file processing) to answer questions about our users. Under the best conditions this doesn’t even lead to a bigger technical footprint. This way we can have deep insights into our user behavior without external tools. Even as a startup or hobby developer you are now able to put the user first on your digital platforms. Next steps While this series is done for now, we have a starting point to build our platform further. With some frontend […]
Building your own Web Analytics from Log Files – Part 5: Building our first Dashboard
This is the fifth part of the six-part series “Building your own Web Analytics from Log Files”. At this part of the series we have our log files in Elasticsearch with indices like “custom-filebeat-tracking-logs-7.4.0-2020.01.03”. The first thing to do is set up a Kibana index pattern for this. Kibana Configuration In Kibana we go to Management -> Index Patterns -> Create index pattern. As Index pattern we use “custom-filebeat-tracking-logs-*”, which gives us all the indices with our daily index pattern. In the next step, we set the Time Filter field name to “@timestamp”. This is the timestamp that marks the point where Filebeat indexed the document. This is fine for now; we click “Create index pattern” and are done with this part! Checking our Data Now, let’s head to the Discover section in Kibana and look at our index pattern. And there it is: Our log entries show up like we wanted: This […]
Building your own Web Analytics from Log Files – Part 4: Data Collection and Processing
This is the fourth part of the six-part series “Building your own Web Analytics from Log Files”. Legal Disclaimer: This post describes how to identify and track the users on your website using cookies, IP addresses, and browser fingerprinting. The information and process described here may be subject to data privacy regulations under your legislation. It is your responsibility to comply with all regulations. Please educate yourself if things like GDPR apply to your use case (which is very likely), and act responsibly. In the last part we built a configuration for OpenResty to generate user and session IDs and store them in browser cookies. Now we need a way to actually log and collect those IDs together with the requests our web server handles. OpenResty Configuration To be able to log our custom variables we need to announce them to Nginx. This is done right in the server part of […]
Building your own Web Analytics from Log Files – Part 3: Setting up Nginx with OpenResty
This is the third part of the six-part series “Building your own Web Analytics from Log Files”. Legal Disclaimer: This post describes how to identify and track the users on your website using cookies and browser fingerprinting. The information and process described here may be subject to data privacy regulations under your legislation. It is your responsibility to comply with all regulations. Please educate yourself if things like GDPR apply to your use case (which is very likely), and act responsibly. Identifying Users and Sessions One of our goals for this project is to be able to tell how many people are using our site. This means we need a way to differentiate between the users on our site. One approach would be to look at the IP addresses of our users. This is not very precise since all devices with the same internet connection share an IP address. Especially for […]
Building your own Web Analytics from Log Files – Part 2: Architecture
This is the second part of the six-part series “Building your own Web Analytics from Log Files”. Architecture Overview To start off this series, let’s remember what we want to achieve: We want to enable a deeper understanding of our website users by enriching and processing the log files we already collect. This article looks at the components we need for this and how to make our life as easy as possible. To achieve our goal, we need to teach our web server to identify our users, store information about their activity in the log files, ship those files to storage, and make the data actionable with a way of visualizing it. Because I believe in Open Source Software, we will look at our options in that category. Another requirement is to introduce as few components as possible and keep scalability in mind. Choosing our Web Server The first part of our […]
Building your own Web Analytics from Log Files – Part 1: Motivation
This is the first part of the six-part series “Building your own Web Analytics from Log Files”. What is Web Analytics As the owner or administrator of a website, you will go through different phases of maturity. When you are just starting with a hobby or web project, you will most likely care about the technical setup and gaining traction. Once everything is up and running, you will start asking yourself questions like How many People are using my website? How many of those are new Visitors? Which page on my website attracts the most (new) Visitors? Those questions are Web Analytics questions. It is what Web Analysts spend their time on to deliver value to the business behind it. To achieve that, we most commonly use tools like Piwik (Matomo), Google Analytics, or Adobe Analytics. Those tools rely on some JavaScript code that needs to be integrated on a website […]