The Complete Guide to A/B Testing
What is A/B Testing?
Businesses want visitors to take an action (a conversion) on their website, and the rate at which a site drives these actions is called its "conversion rate."
A/B testing is the practice of showing two variants of the same webpage to different segments of website visitors at the same time and comparing which variant drives more conversions. The one that yields more conversions wins!
The metrics for conversion are unique to each website. For ecommerce, it may be the sale of products, while for B2B, it may be the generation of qualified leads.
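As a concrete illustration of the arithmetic behind a conversion rate (the visitor and purchase counts below are made up):

```python
# Conversion rate is simply the share of visitors who complete the goal action.
def conversion_rate(conversions, visitors):
    """Return the conversion rate as a percentage."""
    return 100 * conversions / visitors

# Hypothetical example: 250 purchases out of 10,000 visitors
print(conversion_rate(250, 10_000))  # 2.5 (%)
```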
How Does A/B Testing Work?
A well-planned, data-driven A/B test can make your marketing efforts more profitable by pinpointing the most important elements of your website through testing and then combining the winning changes. Broadly, it includes the following steps:
- Step 1: Performing research
Use website analytics tools such as Google Analytics, heatmaps, surveys, and user tests to collect data on visitor behavior and track metrics. Make observations to identify the problem area in your conversion funnel and find out what is stopping visitors from converting.
- Step 2: Hypothesis formulation
Based on the insights from your research, build a hypothesis aimed at increasing conversions.
- Step 3: Creating a variation
Create a variation based on your hypothesis, and A/B test it against the existing version. Calculate the test duration keeping in mind your monthly visitors, current conversion rate, and the expected change in the conversion rate. (Use our Bayesian Calculator here.)
- Step 4: Testing
Kick off the test and wait for the stipulated time to achieve a statistically significant result.
- Step 5: Analyzing results and drawing conclusions
Analyze the test results and, if the variation succeeds, deploy it. If the test remains inconclusive, draw insights from it and apply them to your subsequent tests.
A/B testing lets you systematically work through each part of your website to improve conversions.
Watch this webinar to learn how to scale your testing program.
What Can You A/B Test?
Every element on your website that can influence visitor behavior and conversion rate can and should be A/B tested. This could be:
- Subheadlines
- Paragraph text
- Call-to-action text
- Call-to-action buttons
- Content near the fold
- Social proof
- Media mentions
- Awards and badges
A/B testing is not limited to minor tweaks such as CTA button color, text, or headlines. You can test an entire page's design, layout, content, and even functionality.
Why Should You A/B Test?
A B2B business may be loaded with unqualified leads, an ecommerce business may be struggling with a high cart abandonment rate, while a media and publishing business may be dealing with low reader engagement.
Let’s see why A/B testing is one of the most effective ways to deal with all these problems:
- Solve visitor pain points: Use data gathered through visitor behavior analysis tools such as heatmaps, Google Analytics, and surveys to solve your visitors’ pain points. This stands true for all businesses, be it ecommerce, travel, SaaS, education, or media and publishing.
- Get more conversions by investing less: The cost of acquiring paid traffic can be huge. A/B testing lets you make the most of your existing traffic and helps you increase conversions without having to spend on acquiring new traffic. Plus, the ROI from A/B testing can be substantial, with minor changes resulting in a significant increase in conversions.
- Reduce bounce rate: With A/B testing, you can test multiple variations of an element of your website until you find the best possible version. This improves your content quality, encouraging visitors to spend more time on your website and reducing bounce rates.
- Make low risk modifications: Make minor, incremental changes to your website with A/B testing instead of getting the entire site redesigned. This can reduce the risk of jeopardizing your current conversion rate. A/B testing lets you target your resources for maximum output with minimal modifications, resulting in increased ROI.
- Data driven: Because A/B testing is completely data driven with no room for guesswork, gut feelings, or instincts, you can easily determine a “winner” and a “loser” based on metrics like time spent on the page, number of demo requests, cart abandonment rate, click-through rate, and so on.
- Redesigning your website: Redesigning can range from a minor CTA text or color tweak to a complete revamp of the website. The decision to implement one version or the other should always be backed by data-driven A/B testing. Do not quit testing once the design is finalized. As the new version goes live, test other elements of your webpage to make sure the most engaging version is being served to visitors.
- Changing the product pricing: Perform an A/B test when you plan to remove or update your product prices. You do not know whether your visitors will react positively to the change, and A/B testing is one way to ascertain which way the scale will tilt.
- Feature change: Never change a feature or service on your website without A/B testing, especially if the change affects customer data or the purchase funnel. Changes made without testing may or may not pay off; testing first makes the outcome far more predictable.
Common Mistakes While A/B Testing
- Not planning your optimization roadmap well:
  - Invalid hypothesis: In A/B testing, a hypothesis is formulated before conducting the test. All subsequent steps depend on it: what should be changed, what the expected outcome is, why, and so on. If you start with the wrong hypothesis, the probability of the test succeeding decreases.
  - "Taking others' word for it": Sure, someone else changed their sign-up flow and saw a 30% uplift in conversions. But that is their test result, based on their traffic, their hypothesis, and their goals. Here’s why you should not implement someone else’s test results as-is on your website: first, no two websites are the same, so what worked for them might not work for you; and second, there is no way to know whether the results they reported are accurate.
- Testing too many elements together: Industry experts caution against running too many tests at the same time. Testing too many elements of a website together makes it difficult to pinpoint which element influenced the success or failure of the test most. Prioritization is indispensable for successful A/B testing.
- Ignoring statistical significance: If gut feelings or personal opinions find their way into hypothesis formulation or goal setting for an A/B test, it is most likely to fail. To have an A/B test succeed, or to have a failed A/B test that gives you valuable insights for your next test, your test should be guided by statistically significant data from the start. Learn more about statistical significance here.
- Using low traffic: Businesses often end up testing the wrong traffic. A/B testing should be done with an appropriate volume of traffic to get significant results. Using low traffic for testing increases the chances of getting inconclusive results. Learn how to calculate the percentage of your traffic to be included in an A/B test here.
- Wrong timing: Based on your traffic and goals, run A/B tests for a certain length of time so that they achieve statistical significance. Running a test for too long or too short a period can result in the test failing or producing insignificant results. Just because one version of your website appears to be winning within the first few days of the test does not mean you should call it off early and declare a winner. Learn how long you should run your test here.
- Failing to conduct follow-up experiments: A/B testing is an iterative process, with each test building upon the results of the previous ones. Many businesses give up on A/B testing after the first test fails. But to improve the chances of your next test succeeding with statistically significant results, you should draw insights from your previous tests while planning and deploying the next one.
Watch this webinar to learn how to use learnings and insights from failed A/B tests to create winning tests.
- Failing to consider external factors: Tests should be run over comparable periods to produce meaningful results. It is wrong to compare website traffic on the days it gets the highest traffic against the days it witnesses the lowest because of external factors such as sales, holidays, and so on. Because the comparison is not made between like periods, the chances of reaching an insignificant conclusion increase.
Use VWO’s A/B Test Significance Calculator to know if the results your test achieved were significant or not.
- Wrong tool: With A/B testing gaining popularity, multiple low-cost tools have also come up. Not all of these tools are equally good. Some tools drastically slow down your site, while others are not closely integrated with necessary qualitative tools (heatmaps, session recordings, and so on) leading to data deterioration. A/B testing with such faulty tools can risk your test’s success from the start.
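Several of the mistakes above (ignoring significance, using low traffic, stopping tests early) come down to the same underlying statistics. As an illustrative sketch, one common frequentist significance check, the two-proportion z-test, can be computed with nothing but the Python standard library; the visitor and conversion counts here are made up:

```python
from statistics import NormalDist

def ab_test_p_value(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided p-value for the difference between two conversion rates
    (two-proportion z-test using the pooled conversion rate)."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = (pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Control converts 200/10,000 (2.0%); variation converts 260/10,000 (2.6%).
p = ab_test_p_value(200, 10_000, 260, 10_000)
print(f"p-value = {p:.4f}")  # well below 0.05 -> significant at the 95% level
```

With the same 2.0% control, a variation converting only 205/10,000 yields a p-value far above 0.05 — an inconclusive result, which is exactly why peeking at small early samples and declaring a winner is dangerous.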
Industry Specific A/B Testing
- A/B testing for media and publishing: Some goals of a media and publishing business may be to increase readership and audience, to increase the time visitors spend on the website, or to boost articles and other content pieces with social sharing, and so on. You may try testing variations of email sign-up modals, recommended content, social sharing buttons, highlighted subscription offers, and other promotional options.
- A/B testing for ecommerce: Improve conversions through A/B testing by increasing the average order value, optimizing your checkout funnel, reducing cart abandonment rate, and so on. You may try testing: the way shipping cost is displayed and where, if and how free shipping is highlighted, text and color tweaks on the payment or checkout page, the visibility of reviews or ratings, etc.
Showpo, a pure-play online women’s fashion company, A/B tested its shipping strip along with other elements to create a better customer experience, increasing its revenue by 6.09% in the process. Read more here.
- A/B testing for travel: Increase the number of successful bookings on your website or mobile app, your revenue from ancillary purchases, and much more through A/B testing. You may try testing your home page search modals, search results page, ancillary product presentation, your checkout progress bar, and so on.
Djoser, a Dutch travel agency, increased its bookings by 33% through dedicated A/B testing. Read more here.
- A/B testing for B2B tech: Generate high-quality leads for your sales team, increase the number of free trial requests, attract your target buyers, and perform other such actions by testing and polishing important elements of your demand generation engine. To provide the best user experience and to improve conversions, you may try testing your lead form components, free trial sign-up flow, home page messaging, CTA text, social proof on the home page, and so on.
Learn how POSist, a leading SaaS-based restaurant management platform with more than 5,000 customers at over 100 locations across 6 countries, increased its demo requests by following a continuous and iterative approach to A/B testing. Read more here.