When a new A/B split test goes live, multiple factors are at play, and they can result in reports that look like these:
Why does this happen?
Reports show so much variation in the beginning because, with few visitors, even a handful of additional conversions produces a large swing in the reported percentage (and the default view of Visual Website Optimizer reports is the percentage view). As more data is collected, the reports converge toward the true conversion rate. The screenshot below illustrates how this works.
You may see apparent winners early in the test because that is what the data says at that moment. However, this initial data is based on the actions of only a few visitors and is not a true representation of your site's overall traffic. Additionally, in the real world, A/B test reports can vary in the beginning because regular visitors see something new on the website and temporarily behave differently.
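This early volatility is easy to reproduce in a quick simulation. The sketch below (a hypothetical illustration, not VWO's actual reporting code) assumes a "true" conversion rate of 5% and tracks the observed cumulative rate at a few checkpoints: the early estimates swing widely, while the later ones settle near the real rate.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

TRUE_RATE = 0.05               # assumed real conversion rate (5%)
CHECKPOINTS = [50, 500, 5000]  # visitor counts at which we read the report

conversions = 0
observed = {}
for n in range(1, max(CHECKPOINTS) + 1):
    # each visitor converts with probability TRUE_RATE
    conversions += random.random() < TRUE_RATE
    if n in CHECKPOINTS:
        observed[n] = conversions / n

for n, rate in observed.items():
    print(f"after {n:>5} visitors: observed conversion rate {rate:.2%}")
```

Run it a few times with different seeds: the 50-visitor figure jumps around from run to run, while the 5,000-visitor figure stays close to 5%, which is exactly why early reports look erratic.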
How to avoid making incorrect decisions
Here are some general guidelines to ensure you don’t make wrong decisions:
- Use our A/B test duration calculator to work out how long you should run your test. Even if your test reaches 99% significance, let it run for at least a week.
- If you see very large variance in the beginning, look at the Day-wise report instead of the Cumulative report to get a better understanding of what's happening:
- To be completely sure, wait for tests to achieve 99% significance.
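To make the 99% significance guideline concrete, here is a minimal sketch of how significance for a control/variation pair is commonly computed, using a pooled two-proportion z-test. The function name and the example numbers are hypothetical, and this is a standard textbook formula rather than VWO's exact internal calculation.

```python
from math import sqrt, erf

def significance(conv_a, n_a, conv_b, n_b):
    """Return the two-sided confidence (as a percentage) that the
    variation's conversion rate differs from the control's,
    via a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return (1 - p_value) * 100

# Hypothetical example: 50/1000 conversions on control, 70/1000 on variation.
# This lands around 94% significance -- a tempting "winner" that still
# falls short of the 99% threshold, so the test should keep running.
print(f"{significance(50, 1000, 70, 1000):.1f}%")
```

Note how a result that looks like a clear win (7% vs. 5%) can still sit well below 99% significance; that gap is precisely what the guideline above protects against.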