User Retention is crucial to any digital offering. If you optimize your offering to a point where users come back on their own, you can not only save on marketing costs but also engage your existing users more. This makes retention analysis a prime example of how digital analytics can provide tangible business value. In this post, we are going to take a look at how we can analyze user retention with the most advanced digital analytics tool, Adobe Analytics. We are going to start with the built-in analytics dimensions and metrics, then take a look at cohort tables, and in the next post even build our own Segments and Calculated Metrics to help us understand retention. Let’s get started! Built-in Retention Metrics and Dimensions To start things off, we will take a quick look at what Adobe Analytics has to offer out-of-the-box to help us understand […]
Adobe Analytics Introduction: Terms and Concepts
This is one of several posts aiming to give an introduction to Adobe Analytics. They are intended as both tutorials and references for future use. While there already are a lot of good sources for this, some are quite dated and miss connections to recently released features and enhancements. In this post, I will explain some general things that are helpful to know when starting with Adobe Analytics. We will go over different interfaces to analyze data, explain Dimensions, Metrics, and Events, and name some common integrations. Know what you are looking at: Dimensions One of the most important building blocks of Adobe Analytics is Dimensions. With Dimensions, we capture descriptive values on our websites or in our apps. Many people call them variables when explaining the general concept. On a website we might record the name of a certain page in a dimension. This would allow us to report […]
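To make the idea of a Dimension more concrete, here is a minimal AppMeasurement sketch. It assumes an already configured “s” tracking object; the values and variable numbers are purely illustrative, not taken from the post.

```javascript
// Minimal sketch, assuming AppMeasurement's "s" object is already configured
// for your report suite. Values and variable numbers are made up.
s.pageName = "Homepage";     // built-in dimension: the name of the current page
s.prop1 = "Blog";            // custom traffic dimension (prop)
s.eVar1 = "Spring Campaign"; // custom conversion dimension (eVar)
s.t();                       // send a page view carrying these dimension values
```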
Integrating Adobe Target and Adobe Analytics into Voice Assistants
With digital experiences on the rise, interactive Voice Assistants like Amazon Alexa, Google Home or Apple’s Siri keep gaining popularity. Companies now need to meet their customers’ expectations and allow them to interact with their brand however they like. Those new possibilities require a clear strategy to avoid wasting time and resources on products that nobody actually uses. Adobe Analytics can help us understand digital experiences better and drive value through customer feedback. With Adobe Target for personalization and experimentation, nothing can stop you from delivering the right experience at the right time. This article describes how both Analytics and Target can be integrated into Voice Assistants’ backend systems to track and test how users are interacting with your App. We will be using a direct integration with the Experience Cloud ID Service to sync identifiers and use them for Analytics and Target. Target will then be used to personalize […]
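As a rough illustration of the first step, the sketch below requests an Experience Cloud ID (ECID) from the ID Service inside a voice app backend. It is only a sketch under assumptions: the org ID is a placeholder, Node.js 18+ with the global fetch API is assumed, and the exact parameters should be verified against the ID Service documentation.

```javascript
// Hedged sketch: fetch an ECID from the Experience Cloud ID Service.
// ORG_ID is a placeholder; replace it with your own Experience Cloud Org ID.
const ORG_ID = "1234567890ABCDEF@AdobeOrg";

async function getExperienceCloudId() {
  const url = `https://dpm.demdex.net/id?d_orgid=${encodeURIComponent(ORG_ID)}&d_ver=2`;
  const response = await fetch(url);
  const data = await response.json();
  // "d_mid" is the ECID; store it against the assistant's user ID and pass it
  // along with later Analytics and Target requests from the backend.
  return data.d_mid;
}

getExperienceCloudId().then((mid) => console.log("ECID:", mid));
```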
Getting Google Search Keywords into Adobe Analytics
While Adobe Analytics is a much more mature solution compared to Google Analytics, the latter has always had an advantage when it comes to Search Keywords. It shouldn’t surprise us that the company that offers both the search engine and the analytics tool has some integration between them. While it was easy to get search keywords from the target URL in the past, those times have been gone for years now. Ever since then, businesses have been struggling to know what their visitors were initially looking for when they came to their webpage. This article outlines a couple of ways to achieve this in Adobe Analytics. For this post we will take a look at the integration Adobe offers to Analytics Prime customers, called Advertising Analytics. Right after that we are going to build our own integration based on the same method to get some insight into Google Ads performance. To […]
Tutorial: Real time Product Recommendations with Adobe Target
Adobe Target Premium can be used for sophisticated product or content recommendations. Building on the last post, we are going to create our very own recommendation engine based on user behavior in real time. I assume you have read that post; if not, feel free to go back and read it first! Starting with recommendations As we know from the last blog post, Adobe Target is well equipped to personalize content in real time. To do this, we sent information about the products and users to Target in mbox requests. Now we are going to use the same concept to give real product recommendations! To get started, we need to understand a few basic concepts. First, Target must know about your products. We can either send details with the mbox requests or upload them in different types of feeds. What we end up with is a catalog of products with some information. […]
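As a hedged illustration of sending product details with an mbox request, the sketch below uses at.js’s getOffer/applyOffer calls with Recommendations entity parameters. The mbox name and entity values are placeholders, not taken from the original post.

```javascript
// Sketch only: mbox name and entity values are made up. Entity parameters
// (entity.id, entity.name, ...) feed Target's product catalog for Recommendations.
adobe.target.getOffer({
  mbox: "productPage",
  params: {
    "entity.id": "SKU-12345",
    "entity.name": "Trail Running Shoe",
    "entity.categoryId": "shoes",
    "entity.pageUrl": "https://www.example.com/p/sku-12345",
    "entity.value": "89.99"
  },
  success: function (offer) {
    // Render the returned recommendation offer into the mbox.
    adobe.target.applyOffer({ mbox: "productPage", offer: offer });
  },
  error: function (status, error) {
    console.log("Target request failed:", status, error);
  }
});
```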
Tutorial: Real time dynamic Personalization with Adobe Target
Adobe Target is part of the Adobe Experience Cloud. It can be used for A/B Testing and Personalization of Websites, Apps and Server Side Applications. This article describes how it can be used to create dynamic Experiences based on the user profile and mbox parameters. Personalization is not a feature, it’s obligatory Some years ago, the internet was no more than a collection of static pages. It was perceived as the world’s largest library, so websites were created the same way as a library: Books don’t change their content over time, or react to the reader. Much like in a library, search indices like Google helped to navigate the web and find the best content. Over time, websites started to make better use of what computers are capable of. Animated images and blinking text competed for the user’s attention. If you got an email, a human voice would tell you […]
Analysis Workspace Hacks (AGE) – Metric Targets
This is a post in the Adam-Greco-Edition (AGE) series of posts. They aim to iterate on some great posts by Adam Greco, showing some different approaches to achieve similar things. In another great post, Adam Greco showed how we can have Metric Targets in Analysis Workspace. His approach includes setting up a Data Source to import Goals to a Custom Event. This is a very nice approach, but it has some serious limitations. Because it utilizes Data Sources, all their limitations apply (see documentation). Most importantly, data cannot be deleted or changed once it has been imported. Also, we need to sacrifice a Custom Event for every Goal we set. The setup is also very involved and not suited to non-technical users. What I would like to have is a Goal Metric that does not use valuable Custom Events, is changeable over time, and is understandable and usable by non-technical users. As […]
Analysis Workspace Hacks (AGE) – Average Daily Unique Visitors
This is a post in the Adam-Greco-Edition (AGE) series of posts. They aim to iterate on some great posts by Adam Greco, showing some different approaches to achieve similar things. In one of his posts, Adam Greco shows a way to replicate the Daily Unique Visitors Metric from Reports & Analytics in Analysis Workspace. His approach involves creating a Calculated Metric for a given time range, summing up the Visitors for each day. There are some limitations to that approach. The obvious one is that we need a new metric for each date range we want to analyze; we can’t use a 7-day Metric if there are 8 days to analyze. Second, Visitors are not deduplicated but summed up over all days in the reporting window (just as in the old interface); so a Visitor visiting our site on three different days would be counted as three Visitors. Last, the name could […]
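To make that limitation concrete, here is a short sketch of the idea behind such a Calculated Metric (the notation is mine, not from Adam Greco’s post): for an N-day window it averages the per-day Unique Visitors, so a returning Visitor is counted once for every day they visit.

```latex
\text{Avg. Daily Unique Visitors} \;=\; \frac{1}{N} \sum_{d=1}^{N} \text{UV}_d
\qquad\text{where}\qquad
\sum_{d=1}^{N} \text{UV}_d \;\ge\; \text{UV}_{\text{window}}
```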
Analysis Workspace Hacks – Next and Previous Page Report
Analysis Workspace is the most capable solution for Web Analysts today. It allows us to switch on the fly between building a Dashboard, an old-school Report, or something in between. It has surpassed the old Reports & Analytics Interface in functionality and workflow effectiveness and leaves you longing for it once you start using different solutions. But there is one thing that is not that awesome in Analysis Workspace yet: Pathing. Once you activate Pathing for a custom prop, the old interface gives you Next and Previous Reports for that prop, just like with the Page Dimension: As a result, we get a nice table with the Next or Previous Dimension Items for a given Item. Hacking Analysis Workspace’s Flow Visualizations The closest thing to that functionality is the Flow Visualization in Analysis Workspace. It allows us to see a Flow of Users between Dimension Items or even across […]
Analysis Workspace Hacks – Link Events on Page Reports
Adobe Analytics gives us two types of tracking calls to use for our implementation. With Page Tracking (calling s.t() on websites or trackState() in Apps) we are supposed to measure when a page has been viewed. If we want to measure interactions on a given page, we would use Custom Link Tracking (s.tl() on websites and trackAction() in Apps) for that. The reasoning behind that is quite simple. If there was only one function, we would either end up with increased Page Views for every on-page event or have to take care of the distinction ourselves by using valuable props or eVars. So from a simplicity standpoint this approach makes a lot of sense. But there is one problem: When using Custom Link Tracking, you cannot set a pageName for that call. Adobe Analytics just ignores whatever you set for the pageName, because pageNames only make sense in the […]
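A minimal sketch of the two call types, assuming a configured AppMeasurement “s” object (names and values are made up):

```javascript
// Page Tracking: counts a Page View and records the pageName dimension.
s.pageName = "checkout:step 1";
s.t();

// Custom Link Tracking: counts no Page View. The first argument is the link
// element (or true), the second the link type ("o" = custom link), the third
// the link name. Any pageName set for this call is ignored by Adobe Analytics.
s.tl(true, "o", "checkout:apply coupon");
```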
Building your own Web Analytics from Log Files – Part 6: Conclusion
This is the sixth part of the six-part series “Building your own Web Analytics from Log Files”. In this series we built a rather sophisticated logging and tracking functionality for our website. We used OpenResty to identify and fingerprint our users via cookies, stored that information in log files, which were shipped to Elasticsearch and visualized with Kibana. Web Analytics democratized By using those techniques, we are able to use what we already have (log file processing) to answer questions about our users. Under the best conditions this doesn’t even lead to a bigger technical footprint. This way we can have deep insights into our user behavior without external tools. Even as a startup or hobby developer, you are now able to put the user first on your digital platforms. Next steps While this series is done for now, we have a starting point to build our platform further. With some frontend […]
Building your own Web Analytics from Log Files – Part 5: Building our first Dashboard
This is the fifth part of the six-part series “Building your own Web Analytics from Log Files”. At this point in the series, we have our log files in Elasticsearch with indices like “custom-filebeat-tracking-logs-7.4.0-2020.01.03”. The first thing to do is set up a Kibana index pattern for them. Kibana Configuration In Kibana we go to Management -> Index Patterns -> Create index pattern. As the index pattern we use “custom-filebeat-tracking-logs-*”, which matches all the indices that follow our daily naming scheme. In the next step, we set the Time Filter field name to “@timestamp”. This is the timestamp that marks the point when Filebeat indexed the document. This is fine for now; we click “Create index pattern” and are done with this part! Checking our Data Now, let’s head to the Discover section in Kibana and look at our index pattern. And there it is: Our log entries show up just like we wanted: This […]
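As an alternative to the UI steps described above (not part of the original post), the same index pattern could also be created through Kibana’s saved objects API. The sketch below assumes Node.js 18+ with global fetch, a locally reachable Kibana without authentication, and that the API path matches your Kibana version.

```javascript
// Hedged sketch: create the index pattern via Kibana's saved objects API.
// KIBANA_URL is a placeholder; secured setups will need credentials as well.
const KIBANA_URL = "http://localhost:5601";

async function createIndexPattern() {
  const response = await fetch(`${KIBANA_URL}/api/saved_objects/index-pattern`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "kbn-xsrf": "true" // Kibana requires this header for write requests
    },
    body: JSON.stringify({
      attributes: {
        title: "custom-filebeat-tracking-logs-*",
        timeFieldName: "@timestamp"
      }
    })
  });
  console.log("Kibana responded with status", response.status);
}

createIndexPattern();
```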
Building your own Web Analytics from Log Files – Part 4: Data Collection and Processing
This is the fourth part of the six-part series “Building your own Web Analytics from Log Files”. Legal Disclaimer: This post describes how to identify and track the users on your website using cookies, IP addresses and browser fingerprinting. The information and process described here may be subject to data privacy regulations under your legislation. It is your responsibility to comply with all regulations. Please educate yourself on whether things like GDPR apply to your use case (which is very likely), and act responsibly. In the last part we built a configuration for OpenResty to generate user and session IDs and store them in browser cookies. Now we need a way to actually log and collect those IDs together with the requests our web server handles. OpenResty Configuration To be able to log our custom variables, we need to announce them to Nginx. This is done right in the server part of […]
Building your own Web Analytics from Log Files – Part 3: Setting up Nginx with OpenResty
This is the third part of the six-part series “Building your own Web Analytics from Log Files”. Legal Disclaimer: This post describes how to identify and track the users on your website using cookies and browser fingerprinting. The information and process described here may be subject to data privacy regulations under your legislation. It is your responsibility to comply with all regulations. Please educate yourself on whether things like GDPR apply to your use case (which is very likely), and act responsibly. Identifying Users and Sessions One of our goals for this project is to be able to tell how many people are using our site. This means we need a way to differentiate between the users on our site. One approach would be to look at the IP addresses of our users. This is not very precise, since all devices on the same internet connection share an IP address. Especially for […]
Building your own Web Analytics from Log Files – Part 2: Architecture
This is the second part of the six-part series “Building your own Web Analytics from Log Files”. Architecture Overview To start off this series, let’s remember what we want to achieve: We want to enable a deeper understanding of our website users by enriching and processing the log files we already collect. This article looks at the components we need for this and how to make our life as easy as possible. To achieve our goal, we need to teach our web server to identify our users, store information about their activity in the log files, ship those files to storage, and make the data actionable with a way of visualizing it. Because I believe in Open Source Software, we will look at our options in that category. Another requirement is to introduce as few components as possible and keep scalability in mind. Choosing our Web Server The first part of our […]
Building your own Web Analytics from Log Files – Part 1: Motivation
This is the first part of the six-part series “Building your own Web Analytics from Log Files”. What is Web Analytics As the owner or administrator of a website, you will go through different phases of maturity. When you are just starting with a hobby or web project, you will most likely care about the technical setup and gaining traction. Once everything is up and running, you will start asking yourself questions like: How many people are using my website? How many of those are new Visitors? Which page on my website attracts the most (new) Visitors? Those questions are Web Analytics questions. It is what Web Analysts spend their time on to deliver value to the business behind it. To achieve that, we most commonly use tools like Piwik (Matomo), Google Analytics, or Adobe Analytics. Those tools rely on some JavaScript code that needs to be integrated into a website […]