
24.8.17

Analytics Regression & Validation


As part of any website migration project, it is critical that analytics continues to track as expected. When a website migrates to a completely new CMS, with new content, pages and URLs, tracking that relied on the code and content of the previous CMS may no longer fire.

In this post, I am sharing the process our data team goes through as part of our regression and validation exercises.

1) Audit what is currently in Analytics before the change

At twentysix, we ensure that a representative from the data team is present from the start of a project and available throughout the delivery cycle. This means the analytics audit is done at the beginning and, where required, appropriate data can be extracted to help inform project decisions.

A good technical analytics audit should identify not just what is currently being tracked (and what should continue to track post-migration), but also any legacy issues already present in the set-up that can be addressed at the same time.

example – analytics audit

In a nutshell, our technical audit looks at three main components:

Acquisition

  • Are channels reporting as expected? This means checking for misconfiguration of custom channel settings, misattribution of traffic, inaccurate campaign UTM parameters, self-referral issues, considerations for multiple domains and so on.
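Consistent UTM tagging is one of the easier items to enforce. As a minimal sketch (the helper name and the campaign values below are illustrative, not from any particular project), a small function can build campaign URLs so parameters are always spelled and ordered consistently:

```javascript
// Hypothetical helper: build a campaign URL with consistent UTM
// parameters, so acquisition channels report as expected.
function buildCampaignUrl(baseUrl, { source, medium, campaign }) {
  const url = new URL(baseUrl);
  url.searchParams.set('utm_source', source);
  url.searchParams.set('utm_medium', medium);
  url.searchParams.set('utm_campaign', campaign);
  return url.toString();
}

const tagged = buildCampaignUrl('https://www.example.com/offer', {
  source: 'newsletter',
  medium: 'email',
  campaign: 'spring_sale',
});
// tagged → "https://www.example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale"
```

Centralising tagging like this avoids the misattribution that hand-typed, inconsistently cased UTM values cause.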

Behaviour

  • Here, we look at how pages are currently reporting: whether hits are taking too long to fire, whether query parameters are unnecessarily fragmenting page reports, whether filters could be better utilised, what events are currently being tracked, and whether any non-interaction events have been configured as interaction events and are unintentionally impacting bounce rate.
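As a sketch of one common fix for fragmented page reports, a GTM-style custom JavaScript variable can strip noisy query parameters before the page path is recorded (the parameter names dropped below are assumptions for illustration; GTM custom JavaScript variables must be ES5, hence the `var` style):

```javascript
// Sketch: normalise the page path by dropping query parameters that
// only fragment reports, so /about?sessionid=123 and /about report
// as one page. Parameter names here are illustrative assumptions.
function cleanPagePath(path, search) {
  var keep = [];
  var params = search.replace(/^\?/, '').split('&').filter(Boolean);
  for (var i = 0; i < params.length; i++) {
    var name = params[i].split('=')[0];
    if (name !== 'sessionid' && name !== 'fbclid') {
      keep.push(params[i]); // meaningful parameters are preserved
    }
  }
  return keep.length ? path + '?' + keep.join('&') : path;
}

cleanPagePath('/about', '?sessionid=123');        // → "/about"
cleanPagePath('/search', '?q=shoes&fbclid=xyz');  // → "/search?q=shoes"
```

The same result can often be achieved with view-level "Exclude URL Query Parameters" settings; a variable like this is useful when the list needs to differ per tag.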

Conversions

  • As part of this, we look at how the website measures conversions. If it is a lead-gen website, we check how goals have been implemented and whether they are better off as events rather than destination URLs. If the website is transactional, we inspect the ecommerce implementation: segmenting total revenue to check for discrepancies in figures, checking currencies, duplicate transactions and so on.
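One of the checks above, duplicate transactions, often traces back to a confirmation page that re-fires its purchase tracking on reload. A minimal sketch of a guard (names are illustrative; a production version would persist state in a cookie or localStorage rather than in memory):

```javascript
// Illustrative sketch: only push a transaction into the dataLayer
// once per transaction ID, so a confirmation-page reload cannot
// inflate ecommerce revenue with duplicates.
var seenTransactions = {};

function pushTransactionOnce(dataLayer, transaction) {
  if (seenTransactions[transaction.id]) {
    return false; // already sent - skip the duplicate
  }
  seenTransactions[transaction.id] = true;
  dataLayer.push({
    event: 'purchase',
    transactionId: transaction.id,
    transactionTotal: transaction.total
  });
  return true;
}

var dl = [];
pushTransactionOnce(dl, { id: 'T-1001', total: 49.99 }); // pushed
pushTransactionOnce(dl, { id: 'T-1001', total: 49.99 }); // skipped
// dl.length → 1
```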

2) Review Google Tag Manager Configuration

For websites already using Google Tag Manager, we also review how it is currently configured.

This helps us to:

  • Identify how certain tags have been implemented
  • Understand which tags will need to be re-configured post-migration
  • Include considerations for third-party marketing tags (such as AdWords conversion tags, DoubleClick Floodlight tags or affiliate tags)
example – floodlight tag values1

When performing a GTM review, we will look at how the following are currently being utilised:

  • Pageview tags
  • Event Tags
  • Utility Tags (these are tags that contain custom requirements to enable other tags to fire)
  • Triggers (checking that the present triggers are still relevant post-migration; otherwise, they are flagged for re-configuration)
  • Variables (ensuring current variables can continue to output the expected values post-launch)
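As a sketch of the kind of defensive variable this review looks for, the function below reads a value the new CMS is expected to push into the dataLayer and falls back to a default, so downstream tags never receive `undefined` (the `pageCategory` key is an assumption; GTM custom JavaScript variables must be ES5):

```javascript
// Sketch of a defensive GTM-style custom JavaScript variable: scan the
// dataLayer from the most recent push backwards for an assumed
// 'pageCategory' key, defaulting to 'unknown' if it was never pushed.
function getPageCategory(dataLayer) {
  for (var i = dataLayer.length - 1; i >= 0; i--) {
    if (dataLayer[i] && dataLayer[i].pageCategory) {
      return dataLayer[i].pageCategory;
    }
  }
  return 'unknown';
}

getPageCategory([{ event: 'gtm.js' }, { pageCategory: 'blog' }]); // → "blog"
getPageCategory([{ event: 'gtm.js' }]);                           // → "unknown"
```

In GTM itself a built-in Data Layer Variable with a default value achieves the same thing; the point is that variables should degrade gracefully if the new CMS pushes values later, or not at all.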

3) Configuration Plan

Findings from the above exercises are then compiled into a configuration planning document. This gives us clear visibility of what is required and outlines our findings from the analytics and Tag Manager audits.

This configuration plan will also include:

  • Account / Property / View set-up (do these need cleaning up? Do we continue with the current structure, or will we have to move to a new property? Do we need to set up a view just for testing?)
  • User access management
  • Requirements for custom reports / dashboards
  • Current infrastructure set-up: e.g. whether any third-party integrations need to be considered as part of the migration (e.g. forms connected with Eloqua)

4) Working with Developers

Instead of waiting until the end of the project delivery cycle to tackle analytics requirements, we work in parallel with the development stream for a more efficient delivery. This reduces back and forth, and developers can take particular requirements into account earlier on.

At this stage we ensure:

  • The data team has appropriate access to staging environments to test tracking
  • The data team supplies developer briefs to assist with tracking implementation
  • Points are agreed within the delivery schedule to give the data team ample time to test on staging, and again when the site goes live

5) Performing the regression & validation exercise

Now we begin the regression & validation exercise! Here, a trained data architect performs a series of tests using defined actions that trigger the tracking required by the configuration plan. The resulting values are recorded in a version-controlled regression log to track errors, bugs and changes.

In particular, the values we test for are within:

  • the dataLayer
  • the analytics hit (whether pageview, event, transaction etc)
  • the Google Tag Manager preview – to check that the expected tags trigger appropriately
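As a sketch of what we validate in the dataLayer, an enhanced ecommerce purchase push takes the shape below (field names follow Google Analytics' enhanced ecommerce conventions; the transaction values are illustrative), together with the check that line items sum to the reported revenue:

```javascript
// Sketch of an enhanced ecommerce purchase push, plus the regression
// check that individual products total up to ecommerce revenue.
var purchase = {
  event: 'purchase',
  ecommerce: {
    purchase: {
      actionField: { id: 'T-1001', revenue: '35.98', currency: 'GBP' },
      products: [
        { id: 'SKU-1', name: 'Mug',     price: '7.99',  quantity: 2 },
        { id: 'SKU-2', name: 'T-shirt', price: '20.00', quantity: 1 }
      ]
    }
  }
};

function lineItemTotal(products) {
  return products.reduce(function (sum, p) {
    return sum + parseFloat(p.price) * p.quantity;
  }, 0);
}

var total = lineItemTotal(purchase.ecommerce.purchase.products).toFixed(2);
// total → "35.98", matching actionField.revenue
```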
example – enhanced ecommerce dataLayer and analytics transaction hit

There are several types of actions that we test, including:

  • Form submits (ensuring events fire only on successfully validated form completions)
  • Test transactions (e.g. single or multiple products, ensuring individual product totals sum to the ecommerce revenue)
  • Interactions (ensuring downloads, video views and other actions appropriately fire the expected events)
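A minimal sketch of the first of these actions, assuming a hypothetical `formSubmit` event name and a deliberately simplified validation rule: the event is pushed only when validation passes, so invalid or bounced submissions never count as conversions:

```javascript
// Illustrative sketch: fire a form-completion event only after
// validation succeeds. Event and field names are assumptions, and the
// email check is deliberately simplistic for the example.
function trackFormSubmit(dataLayer, form) {
  var isValid = form.email !== '' && form.email.indexOf('@') > -1;
  if (!isValid) {
    return false; // invalid submission - no event fired
  }
  dataLayer.push({
    event: 'formSubmit',
    formName: form.name,
    eventCategory: 'lead',
    eventAction: 'submit'
  });
  return true;
}

var dl = [];
trackFormSubmit(dl, { name: 'contact', email: 'user@example.com' }); // pushed
trackFormSubmit(dl, { name: 'contact', email: '' });                 // skipped
// dl.length → 1
```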
example – search event – analytics event

The regression exercise is done at least twice – once in staging and again once the website goes live.

6) Continuous monitoring

It is recommended to monitor the data coming into analytics for at least a few weeks after migration.

There may be teething problems post-launch, so it is a good idea to set up custom alerts to keep an eye on key areas in your analytics reports.

I hope you’ve found this post useful!

Maximize business opportunity with data-driven decision making. Find out more about our data team.

Trent Y. is the Analytics & Conversion Director at twentysix. Analytics, usability and conversion strategies. Loves coffee, interested in how data and technology can positively impact our lives. “Learn fast, learn often.”
