Peanut Butter & Jelly Sandwich

If you want to test with Google Optimize, you must pair it with Google Analytics (it is a prerequisite). They are like peanut butter & jelly – way better together. Having Optimize test data in Analytics allows for deep-dive analysis of your experiments that is not possible in Optimize reporting alone.
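
For context, one documented way to do the pairing with analytics.js is the Optimize plugin: you load the Optimize container via a require call on the same tracker that sends your Analytics data. A minimal sketch (the property ID and container ID below are placeholders for your own):

```typescript
// Deploying Optimize through analytics.js (a sketch, not the only way).
// UA-XXXXXXXX-Y and GTM-XXXXXXX are placeholders for your own Analytics
// property and Optimize container IDs.
declare function ga(...args: unknown[]): void;

ga('create', 'UA-XXXXXXXX-Y', 'auto');
ga('require', 'GTM-XXXXXXX'); // loads the linked Optimize container
ga('send', 'pageview');
```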

First, the basics

Google Optimize allows you to test web page variants and see how they perform against each other relative to an objective that you specify in a test hypothesis. Optimize reports the results of your test so you can tell which variant is better. You create Optimize website customizations in what’s called an “experience”, which can be either a test or a personalization intended to achieve a desired outcome.

NOTE: Optimize documentation interchangeably uses the terms “test” and “experiment”.

Tests or personalizations

A personalization is a set of changes made to your digital property for a specific group of users. Unlike tests, personalizations can run indefinitely and do not include variants; they are a single set of changes served to all users who meet the targeting conditions.

Tests include two or more variants (including the original) and come in three varieties: the A/B test (sometimes referred to as an A/B/n test), the redirect test (also known as a split URL test), and the multivariate (MVT) test.

Ideally you would identify what to turn into a personalization via winning Optimize test variants, but there is no test prerequisite for a personalization. If it is already clear that a change for a target population of users will improve your measured success, launch a personalization straight away.
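
Whether you run a test or a personalization, the experience is evaluated when the Optimize container activates – by default on page load. On dynamic pages you can instead configure a custom activation event in the Optimize UI and fire it yourself when the page changes. A minimal sketch, assuming the experience's activation event is named optimize.activate:

```typescript
// Re-evaluate experience targeting after a dynamic page change.
// 'optimize.activate' is an assumed name; it must match the custom
// activation event configured for the experience in the Optimize UI.
const w = window as typeof window & {
  dataLayer?: Array<Record<string, unknown>>;
};
w.dataLayer = w.dataLayer || [];
w.dataLayer.push({ event: 'optimize.activate' });
```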

Linking an Analytics view to an Optimize experience

Because all metrics reported in Optimize are first processed by Analytics and then pushed to Optimize, the Analytics view you link to Optimize governs which user data appears in your reported experience population. For example, if you exclude users browsing from your company network in the linked Analytics view, the related experience reports will also exclude those users. Keep in mind, however, that view filters affect reporting, not serving: a user who would be filtered out of the linked view – but otherwise meets the test targeting conditions – will still see a test variant (unless you exclude them through targeting, such as an Optimize 360 audience that does not include them); their data simply will not be reported.

The potential objectives in your Optimize experience are made up of the Analytics goals configured in the linked view, plus system objectives measured in that same view (e.g., bounces, pageviews, session duration).
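
For example, an event goal configured in the linked view becomes selectable as a custom objective for the experience. A hypothetical sketch of a hit such a goal might match (the category and action names here are made up):

```typescript
declare function ga(...args: unknown[]): void;

// Hypothetical event: an Analytics event goal in the linked view matching
// Category "engagement" / Action "signup" could then be selected as the
// experience objective in Optimize.
ga('send', 'event', 'engagement', 'signup');
```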

Optimize test data as dimensions in Analytics

Optimize test data is available in a few session-scoped Analytics dimensions. However – unlike other session-scoped dimensions – if a user is included in multiple tests in the same session, the test dimensions will carry a value for each test rather than one value overwriting another. Test dimension values are set on all user sessions that meet the test targeting conditions (“Experiment sessions”).

Test dimensions are available as Analytics secondary dimensions in reports, and can be leveraged in advanced segments as well as custom reports. Test dimensions in Analytics include:

  1. Experiment Name – the name you give your experience in Optimize.
  2. Experiment ID – a unique ID available in the “Measurement” section of the Optimize experience details page (e.g., J5QqakdjQr24IQjfDIbTWw).
  3. Experiment ID with Variant – the Experiment ID with the Variant ID appended to it (colon-delimited), available in the “Variants” section of the Optimize experience details page (e.g., J5QqakdjQr24IQjfDIbTWw:1).
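
If you need to know client-side which variant a visitor was served (say, to forward it to another tool), the Optimize container exposes a small lookup helper. A minimal sketch, reusing the example Experiment ID above:

```typescript
// window.google_optimize is set by the Optimize container once it loads.
declare const google_optimize:
  | { get(experimentId: string): string | undefined }
  | undefined;

const EXPERIMENT_ID = 'J5QqakdjQr24IQjfDIbTWw'; // "Measurement" section value

// Returns '0' for the original, '1' for the first variant, etc., or
// undefined if the visitor is not in the experiment (or the container
// has not loaded yet).
const variant =
  typeof google_optimize !== 'undefined'
    ? google_optimize.get(EXPERIMENT_ID)
    : undefined;
console.log(variant);
```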

Optimize personalization data as events in Analytics

For personalizations, if you have the measurement checkbox enabled, Optimize also sends non-interactive event hits to Google Analytics with each experiment hit (these count against Optimize 360 hit limits). The following Analytics event tracking fields are populated by Optimize:

  • eventCategory = “Google Optimize”
  • eventAction = {Experience ID} – This ID is available in the “Measurement” section of the Optimize experience details (e.g., J5QqakdjQr24IQjfDIbTWw).
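
Optimize fires these hits automatically – you never send them yourself. For illustration only, each hit is roughly equivalent to a non-interaction event like this sketch:

```typescript
declare function ga(...args: unknown[]): void;

// Approximately the shape of Optimize's automatic personalization hit:
// category "Google Optimize", action = the Experience ID, flagged as
// non-interactive so it does not affect bounce rate.
ga('send', 'event', 'Google Optimize', 'J5QqakdjQr24IQjfDIbTWw', {
  nonInteraction: true,
});
```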

Why doesn’t my Analytics & Optimize test data match?

Assuming a best-practices Optimize snippet installation, there are usually three primary causes of Analytics & Optimize reported test data not matching: sampling, reporting delays, or nuances in conversion rate calculations.

Sampling

Optimize data is not sampled, but Google Analytics report data very well may be – which can be a source of difference in reported test data. If you are a paid-license Analytics 360 customer, you can request an unsampled Analytics Experiments report. If you use the free Analytics product, read Google Analytics Sampling and Row Limits 101 to understand how & when sampling might impact your reported Analytics data, including Optimize test dimensions & personalization events. Note that sampling is more common if your report date range includes today or if you overlay an advanced segment on the report.
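
One way to check whether a given query comes back sampled is the Analytics Reporting API v4, which returns sampling metadata alongside the rows. A sketch, assuming you have an OAuth2 access token with the analytics.readonly scope and that ga:experimentCombination is the API name for the “Experiment ID with Variant” dimension:

```typescript
// Placeholders: substitute your own view ID and access token.
const VIEW_ID = '123456789';
const ACCESS_TOKEN = 'ya29.example-token';

async function checkExperimentReportSampling(): Promise<void> {
  const res = await fetch(
    'https://analyticsreporting.googleapis.com/v4/reports:batchGet',
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${ACCESS_TOKEN}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        reportRequests: [
          {
            viewId: VIEW_ID,
            dateRanges: [{ startDate: '30daysAgo', endDate: 'yesterday' }],
            dimensions: [{ name: 'ga:experimentCombination' }],
            metrics: [{ expression: 'ga:sessions' }],
          },
        ],
      }),
    },
  );
  const { reports } = await res.json();
  // samplesReadCounts is only present in the response when sampling occurred.
  const sampled = Boolean(reports?.[0]?.data?.samplesReadCounts);
  console.log(sampled ? 'Report is sampled' : 'Report is unsampled');
}

checkExperimentReportSampling().catch(console.error);
```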

Reporting delays

As noted above, everything reported in Optimize is first processed by Analytics and then pushed to Optimize. This process can take up to 12 hours, so you will usually see more test sessions in Analytics than in Optimize. Also, when you end a test, data collected between the last push and the moment you end the test will not make it into Optimize, but it will be available in Analytics.

Conversion rate calculations

The conversion rates you see in Optimize are modeled conversion rates (Bayesian magic is at play). However, the conversion rates you see for the same test in Analytics are literal observed conversion rates (conversions / test sessions).
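
A quick worked example with made-up numbers shows why the two figures rarely match exactly:

```typescript
// Hypothetical counts for one variant.
const conversions = 120;
const experimentSessions = 2400;

// Analytics reports the literal observed rate:
const observedRate = conversions / experimentSessions; // 0.05 → 5.0%

// Optimize instead reports a modeled (Bayesian) rate, typically shown as a
// range, so it will rarely equal the observed 5.0% exactly.
console.log(`Observed conversion rate: ${(observedRate * 100).toFixed(1)}%`);
```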

Additional reference links

How the Optimize runtime works
Optimize measurement methodology
Optimize snippet install best practices
Optimize server-side experiments
Optimize glossary of terms
Measuring Google Optimize with Google Analytics