
Rules-of-thumb for A/B and Multivariate tests


I was recently interviewed on the Unbounce blog as a conversion hero. In the interview, I shared a few rules of thumb for A/B and multivariate testing which you may find helpful. I developed these heuristics while observing and advising on hundreds of tests created by VWO users. So, in this post, I will paraphrase and expand on some of the things I shared in the interview.

A/B or Multivariate, which test methodology to choose?

Three main criteria will help you choose between A/B testing and multivariate testing (MVT):

  • Traffic on the test page: MVT requires a lot of traffic to produce any significant results
  • Design resources available: MVT requires fewer design resources
  • Objectives of the test: MVT is used for optimizing an existing design, while an A/B test is used for optimizing conversions by testing a completely new design

Quoting from the interview, here is an elaboration on these three factors:

The eligibility criterion for each method is, of course, traffic. You should not attempt MVT if you don’t have enough traffic on the site. But assuming traffic isn’t a constraint, MVT works best when you are hyper-optimizing. That is, when your aim is to squeeze the last drop of conversion rate juice from your existing design. On the other hand, A/B testing should be used when you want to test completely different designs and ideas. Ideally, an organization should run lots of MVT tests followed by a few large A/B tests.

MVT typically requires fewer design resources than large-scale A/B test changes. Moreover, as I said, if the objective is to optimize an existing design, MVT (or a single-element change) is the way to go. But if you want to make radical changes to the page (say, a layout change, a theme change, etc.), you should go with A/B testing.
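To see why MVT demands so much more traffic: each element you vary multiplies the number of combinations, and every combination needs enough visitors for a statistically significant read. Here is a minimal Python sketch using the textbook normal-approximation sample-size formula with illustrative numbers (this is not VWO’s actual significance engine; the baseline rate, lift, and element counts below are hypothetical):

```python
from statistics import NormalDist

def sample_size_per_variation(p_baseline, mde_relative, alpha=0.05, power=0.8):
    """Visitors needed per variation to detect a relative lift (mde_relative)
    over a baseline conversion rate, via the standard two-sided
    two-proportion normal-approximation formula."""
    p1 = p_baseline
    p2 = p1 * (1 + mde_relative)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # e.g. 1.96 for alpha=0.05
    z_beta = NormalDist().inv_cdf(power)            # e.g. 0.84 for 80% power
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p2 - p1) ** 2)
    return int(n) + 1

# A simple A/B test: detect a +20% lift on a 5% baseline conversion rate.
n = sample_size_per_variation(0.05, 0.20)

# An MVT with 3 headlines x 3 images x 2 buttons = 18 combinations,
# each of which needs roughly that many visitors on its own.
combinations = 3 * 3 * 2
total_mvt_traffic = combinations * n
```

With these illustrative numbers, the A/B test needs a few thousand visitors per variation, while the 18-combination MVT needs roughly 18 times that in total, which is exactly the "lots of traffic" constraint above.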

Best methodology to start with?

Undoubtedly, if you are just getting started with testing and conversion rate optimization, you should go with a simple A/B test. Multivariate testing is a complex methodology, and it is easy to draw erroneous conclusions from it. From the interview:

For starters, I always recommend beginning with small-step changes in order to truly appreciate the value of testing. Ideally, they should pick a sweet spot on their page (ideal candidates: call-to-action, headline, and image) and optimize it with a simple A/B test. Only once they get the hang of the whole process should they attempt MVT or a large-scale A/B test.

What to test and what not to?

Of course, what you test on a page depends on the specific site and the objectives of the test. But if you are looking for rules of thumb about the most common elements on a page that can be tested, here they are:

  • The King: Call-to-Action (your main button)
  • The Queen: Headline
  • Others: text copy, images, number of form fields, number of steps in funnel, required vs. optional steps, number of elements on page, amount of text on page, layout (left vs. right kind of tests)

As far as what not to test is concerned, it is best to avoid testing:

  • Pricing: very risky, and potentially illegal. You shouldn’t offer the exact same service/product at different price points.
  • Trivial elements on the site: every element being tested on a page should have a hypothesis for why you are including it in the test. For example, without a specific reason, you shouldn’t add page elements (say, a footer or header) to the test and expect the conversion rate to improve magically! You need to be convinced that a particular site element has a high chance of impacting the conversion rate.

What kind of surprises can you expect while doing A/B testing?

Technically, no winning variation in a test should come as a surprise, because there had to be a specific hypothesis for why you included it in the test. Nevertheless, sometimes the test results are contrary to what was expected. That is, a variation won hands-down when one expected it to lose significantly (or vice versa). Here are a few (real-world) examples of such surprises:

A recent test was very surprising: it found that removing a secure icon from the page actually increased conversions by 400%. Another surprising result was that simply adding a human photo on a homepage can potentially double the conversion rate.
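A surprising lift is only worth acting on if it is statistically significant. As a sanity check, here is a minimal two-proportion z-test sketch in Python (a textbook formula with illustrative visitor and conversion counts, not the statistics engine VWO itself uses):

```python
from statistics import NormalDist

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test on conversion counts.
    Returns (relative_lift, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical numbers mirroring a 400% lift: control converts 50 of 1,000
# visitors (5%), the variation 250 of 1,000 (25%).
lift, p_value = ab_significance(50, 1000, 250, 1000)
```

With 1,000 visitors per variation, a jump from 5% to 25% yields a vanishingly small p-value, so the lift is real rather than noise; the same lift on a handful of visitors would not clear the bar.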

One of the test results on our homepage goes against the standard advice of having a ‘Signup’ button prominently featured on the homepage. We found that a ‘Signup’ button actually decreased eventual sign-ups, and ‘Watch a short video’ worked much better, because after watching the video, visitors were sure of what they were signing up for. (We had a ‘Signup’ button on the video page, by the way.)

I hope you liked the interview snippets.
If you want to read the full interview, head over to Conversion Heroes Part 3: Split Testing – An Interview with Paras Chopra.

Paras Chopra, Founder and Chairman of VWO