A/B testing

Overview

After you set up your A/B test in Sitefinity CMS, you track and measure its results in the automatically generated reports in Sitefinity Insight. Once you start the A/B test and any of the page variations or the original page is visited, Sitefinity Insight collects data and displays detailed reports about the current state of the test results:

  • Number of visitors exposed to the test and to each of the page variations
  • Number of goal completions and the goal completion rate for each of the page variations
  • Improvement in conversion rate of each variation
  • Statistical significance
  • Winner variation, if any

For details about A/B tests and how to manage them in Sitefinity CMS, see Work with A/B tests.

IMPORTANT Your connection to Sitefinity CMS must be properly configured, so that Sitefinity Insight can collect data. If the connection is lost, the A/B test may still be running, but Sitefinity Insight does not track and accumulate data. In this case, you are also unable to manage the A/B test. For more information, see Troubleshoot connection issues.

A/B tests reports

To see a list of all A/B tests reports, open the A/B tests page.
Tests are ordered by the date when the reports for the respective test were first generated, that is, when any of the page variations for the test was first visited and data started accumulating.

You can choose whether to see all tests or only tests with one of the following statuses:

  • Active
    The test is started and is accumulating data.
  • Stopped
    The test is on hold and no new data is accumulated at the moment.
    Once you resume the test, Sitefinity Insight continues to collect data.
  • Ended
    A page variation has already been published as the default content for the page. You can still examine the data accumulated for this test up until it ended.

NOTE: You can also access the A/B test report for a specific page from Sitefinity CMS.
Next to each page for which you have set up an A/B test, click the Results link to go directly to the respective A/B report.

Summary

For each A/B test, you see the following summary:

  • Whether there is a winner page variation
  • The date when the test started
  • For how many days the test has been active

    NOTE: If you stopped the test for some time, this metric displays the total number of days during which the test was active, both before you stopped it and after you resumed it.

  • Total visits for all page variations
  • Target audience
    You can see this field only if you have created the A/B test with visitor segments.

The Goal completion rate graph visualizes how the goal completion rate changes over time for each page variation. Hover over a point on the graph to drill into the details of the rate for a specific date. In addition, if you added more than one goal, you can select the goal whose completion rates you want to analyze.
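This is not Sitefinity Insight's implementation, but a minimal sketch of one plausible aggregation behind such a graph, assuming hypothetical per-visit records that hold a date, a variation name, and whether the goal was completed:

    from collections import defaultdict

    # Hypothetical visit records: (date, variation, goal_completed)
    visits = [
        ("2024-03-01", "Original", False), ("2024-03-01", "Original", True),
        ("2024-03-01", "Variation B", True), ("2024-03-02", "Variation B", False),
    ]

    totals = defaultdict(lambda: [0, 0])      # (date, variation) -> [completions, visitors]
    for day, variation, completed in visits:
        totals[(day, variation)][1] += 1
        totals[(day, variation)][0] += int(completed)

    for (day, variation), (completions, visitors) in sorted(totals.items()):
        rate = completions / visitors * 100   # goal completion rate for that day and variation
        print(f"{day}  {variation:<12} {rate:5.1f}%")

Each printed row corresponds to one point on such a graph: a date, a page variation, and its goal completion rate for that date.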

Results and statistical significance

The A/B test results give you the key metrics you need to decide which page variation to use as the default. You can use the winner variation, if there is one, or any other variation, depending on what you aim to achieve with your scenario.

The following is a list of all metrics displayed for each variation, together with a detailed description of each.

Goal completion rate

The ratio of goal completions to visitors on a specific page variation, that is, the percentage of visitors who completed the goal.

Goal completions

The number of visitors on this page variation who completed the goal you set.

Visitors

The overall number of visitors on this page variation.

Improvement

The Improvement percentage is calculated relative to the Baseline, that is, the goal completion rate of the original page (a worked sketch follows this entry).

NOTE: The Improvement percentage can be negative if the page variation performs worse than the original.

In some cases, this metric may show:

  • Yes
    The original page has visitors but no goal completions. This means that any goal completions on page variations are an improvement.
  • No
    The original page and its variations have visitors but none completed the goal.
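Sitefinity Insight calculates this value for you; the following is only a minimal sketch, with hypothetical counts, of the standard way an improvement percentage is derived from the baseline and variation rates:

    # Hypothetical visitor and goal completion counts
    baseline_visitors, baseline_completions = 1000, 50     # original page
    variation_visitors, variation_completions = 980, 63    # page variation

    # Goal completion rate = goal completions / visitors
    baseline_rate = baseline_completions / baseline_visitors
    variation_rate = variation_completions / variation_visitors

    # Improvement is the relative change of the variation's rate over the baseline rate.
    improvement = (variation_rate - baseline_rate) / baseline_rate * 100
    print(f"Improvement: {improvement:.1f}%")   # negative when the variation performs worse

Note that when the original page has no goal completions, the baseline rate is zero and a relative percentage cannot be computed, which is why the report falls back to the Yes and No values described above.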
Statistical significance

Statistical significance is based on both the number of visitors and the conversion rate, which makes it an appropriate measure of how reliable the test results are. It is calculated using the p-value, a number between 0 and 1. The confidence level aimed for is at least 95%, that is, a p-value of 0.05 is the benchmark for significance of the test results (an illustrative calculation follows this list). Therefore:

  • High significance: p-value ≤ 0.05 or ≥ 0.95
    Based on industry-standard best practices, you can take the test results as credible for making your decision.

  • Low significance: p-value between 0.05 and 0.95
    The improvement, if any, is statistically insignificant, given the number of conversions and visitors. You can still decide on a winner based on the conversion rate, but you are taking a risk, because the results will most probably change in the future.
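Sitefinity Insight computes the p-value internally. Purely as an illustration of how a p-value relates to the thresholds above, the sketch below uses a standard pooled two-proportion z-test with hypothetical counts; the actual method used by Sitefinity Insight may differ:

    from math import sqrt, erfc

    def one_sided_p_value(visitors_a, conversions_a, visitors_b, conversions_b):
        # One-sided p-value (pooled two-proportion z-test) for variation B beating baseline A
        rate_a = conversions_a / visitors_a
        rate_b = conversions_b / visitors_b
        pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
        se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
        z = (rate_b - rate_a) / se
        return 0.5 * erfc(z / sqrt(2))   # equals 1 - Phi(z) for the standard normal distribution

    p = one_sided_p_value(1000, 50, 980, 63)   # hypothetical counts
    significant = p <= 0.05 or p >= 0.95       # the thresholds described above
    print(f"p = {p:.3f}: {'High' if significant else 'Low'} significance")

In this one-sided framing, a p-value close to 0 indicates that the variation is performing significantly better than the original, and a value close to 1 indicates that it is performing significantly worse; both are statistically significant outcomes.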
Winner

The winner of the A/B test is determined no earlier than two weeks after the test starts, so that enough data is accumulated. If your test lasts less than two weeks, you decide on a winner yourself, based on the results so far.

NOTE: Winner variation is identified only for primary goals.

A winner is identified only when there are results with high statistical significance (see the sketch after this list):

  • The variation with the highest Improvement percentage
  • The original page, in case all variations have a negative Improvement percentage
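As a recap of these rules, here is a minimal, purely illustrative sketch of the selection logic; the function, its inputs, and the sample values are hypothetical and not part of Sitefinity Insight:

    from datetime import date, timedelta

    def pick_winner(start_date, today, improvements, high_significance):
        # improvements: variation name -> Improvement % over the original page
        if today - start_date < timedelta(weeks=2) or not high_significance:
            return None                                # too early, or results not significant
        best = max(improvements, key=improvements.get)
        return best if improvements[best] > 0 else "Original page"

    winner = pick_winner(date(2024, 3, 1), date(2024, 3, 20),
                         {"Variation B": 12.4, "Variation C": -3.1},
                         high_significance=True)
    print(winner)   # "Variation B" in this example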
