
A/B Testing for Beginners: Creating a Strong Hypothesis that Gets Results

Knowing how to set up an A/B test is not enough. As Peep Laja of ConversionXL rightly points out:

The success of your tests depends on how viable your hypotheses are.

A strong A/B testing hypothesis is the only way to maximize your odds of hitting a glorious win. But what exactly is a 'strong hypothesis'? How does it increase your chances of a win? And how do experts derive such hypotheses? These are the questions I'll be addressing in this article.

But before we dig deeper, let's cover the basics in brief.

A hypothesis is essentially a cause-and-effect statement that often follows a simple, established syntax:

Changing (element tested) from _____ to _____ will increase/decrease (a conversion metric).

This statement is a theory that can be proved or disproved. It documents how you expect a change made on a website/web page to increase or decrease a conversion metric. Remember that the impact of your change must be measurable in quantifiable terms. Here's a good hypothesis statement, for example:

Changing the headline from 'Grab your tickets now!' to 'Tickets selling out soon – only 50 left!' will increase online ticket sales.

Just because you follow the above syntax to formulate a hypothesis doesn't mean you've got a winning one.
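Because a hypothesis must predict a quantifiable effect, you can check it statistically once the test has run. Here's a minimal sketch using a standard two-proportion z-test (the function name and all numbers are made up for illustration; dedicated testing tools do this for you):

```python
from math import erf, sqrt

def conversion_lift_significance(control_conv, control_visits,
                                 variant_conv, variant_visits):
    """Two-proportion z-test: returns (relative_lift, one_sided_p_value)
    for the hypothesis that the variant converts better than the control."""
    p1 = control_conv / control_visits
    p2 = variant_conv / variant_visits
    # Pooled conversion rate across both groups, used for the standard error
    pooled = (control_conv + variant_conv) / (control_visits + variant_visits)
    se = sqrt(pooled * (1 - pooled) * (1 / control_visits + 1 / variant_visits))
    z = (p2 - p1) / se
    # One-sided p-value via the standard normal CDF
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))
    return (p2 - p1) / p1, p_value
```

With, say, 500 conversions from 10,000 control visitors against 600 from 10,000 variant visitors, this reports a ~20% relative lift with a p-value well below 0.05, i.e. a statistically significant win under the usual threshold.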

Formulating a Winning Hypothesis

Some essential elements that make a solid hypothesis are:

1. They aim to alter customer behavior, either positively or negatively.

A strong hypothesis is often based on a psychological principle that triggers a reaction in people. It changes the way people perceive your offer/brand, which alters their behavior during the test. In the headline example above, the urgency of the new message is why the variation headline is expected to outperform the original.

If you haven't already, I highly recommend reading Robert Cialdini's book, Influence, to acquaint yourself with the six psychological principles of persuasion.

2. They focus on deriving customer learning from tests.

When pros develop a hypothesis, their focus is on the big picture. Dr. Flint McGlaughlin reveals their secret sauce:

Dr. Flint McGlaughlin quote

Your test might give you any result; what's important is to analyze why your visitors behaved in a certain way.

As a data-driven marketer, you might sometimes find it difficult to make peace with negative lifts. But if a test reveals a significant customer learning, it can pave the way for huge lifts in the future.

For example, when Michael Aagaard, conversion copywriter at Content Verve, conducted a series of tests on a client's website, he learned that even when the word 'spam' (with its negative connotation) is used in a positive context, it still reduces conversions.

Negative lift

But when he framed the privacy statement around the more reassuring word 'guarantee,' conversions increased.

Failed test gave way to a positive lift

You can read the complete case study here.

3. They are derived from at least some evidence.

Many avenues can be explored to find good hypotheses. The image below, by the popular conversion expert Craig Sullivan, shows several ways in which you can collect meaningful insights.

Getting the right insights for your hypotheses

a. Usability Testing

In simple words: sit and observe how your test participants use your website. Take notes on where they get stuck or confused. Form hypotheses to improve these situations, and let A/B tests decide whether they work for you.

Usability testing

This method gives you exceptional insights into your customers' workarounds and struggles in using your website. Using neutral language is crucial for this method to work effectively. Framing a question wrongly can subconsciously sway the responses you get from test participants. And if they don't respond with brutal honesty, it will cost you conversions.

We’ve been working on this website for the past six months, can you tell me the positive and negative points of this website?

Can be reframed as:

Remember this activity is not to score how good you are, but how good our website is. Do what feels natural and speak your mind aloud. If you get stuck anywhere, it’s okay. Just say what you feel.

The first question can make participants self-conscious and hesitant to express themselves, as they might feel that saying anything negative could hurt your feelings. Write down the exact questions you would like to ask during your research.

Asking questions related to the homepage, checkout, pricing page, and navigation reveals great insights. Some sample questions you can ask:

Was it easy to find what you were looking for?
Were the words/vocab used to define categories/sub-categories clear to you?
Do you have any suggestions to improve our website navigation?
Does our website look credible to you?
Is our pricing clear?
Is there anything else you’d like to know before signing up with us?
Will you shop with us again? Why/why not?
Do you think the form has any confusing/unnecessary input fields?

Present your tasks in the form of situations. Instead of saying directly, 'Track your order on the website,' give them a real situation your customers might face:

'It has been 10 days since you placed your order and you haven't yet heard from the company. What would you do?'

Say you want to check the visibility of your order-tracking feature on the homepage; the question posed as a situation will give you a much more accurate result.

For more tips on achieving accurate results from usability testing, you should read this article. If you’re considering remote usability testing, UserTesting.com is a good choice.

b. Customer Surveys

When you're looking for scope of improvement to find your next hypothesis, put on your armor and get ready to take criticism head-on, like Shaun Hildner of 37signals (now Basecamp):

Worried that your little ego might get bruised? Okay, take a softer approach and conduct on-site surveys. You can ask questions to understand your customer segments, uncover customer intent, pain points, or concerns, identify glitches in the website, and so on. Here are some sample questions to use for on-site surveys:

On-site survey questions

A visit to the pricing page shows intent to buy. See how Qualaroo leverages targeted traffic on their pricing page to understand customer pain points:

Qualaroo pricing page survey

Ask these survey questions when a customer cancels a subscription:

Cancellation survey

To existing customers, you can ask:

Have you ever criticized/praised us to someone in the past few months? What did you say?
If you are made the owner of [insert your company name], what would you change?

For those who have just signed up with you, you can send an automated email asking:

Did you have any doubts or concerns about our product/service before you signed up?
What made you overcome those doubts?
How did you hear about us?

Too many questions can be annoying, especially in on-site surveys. Make sure you respect your prospects’ choice if they choose not to answer your questions.

Qualaroo, WebEngage, and SurveyMonkey are some great survey tools you can choose from.

c. Heatmaps

The data in your testing tool can tell you what your visitors are doing, but not why they are doing it. Heatmaps can help you identify your prospects' areas of interest as well as the areas they choose to ignore. Sometimes this surfaces a great insight: an important page element going unnoticed because another element on the page is stealing its thunder.

The heatmap below shows how a non-clickable element in the image takes away all the attention from the call-to-action (CTA) on the page:

Dead weight spotted in a heatmap

Later when this element was removed, the heatmap shows a clear emphasis on the call-to-action of the page (as it should be):

Heatmap showing CTA getting the much deserved attention

Without their heatmap, TechWyse would never have been able to understand the reason for their dropping conversion numbers. You can read their complete case study here.

Heatmaps can also help you spot a page element or navigation item that may be taking up valuable on-page real estate while being completely ignored by users. Consider replacing it with something more relevant to your conversion goal, and run an A/B test to see how it performs.
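If your analytics setup logs raw click coordinates, you can get a crude version of this analysis yourself. Here's a minimal sketch (the function name, grid size, and coordinates are made up for illustration) that buckets clicks into a grid and surfaces the hottest areas:

```python
from collections import Counter

def click_hotspots(clicks, page_width, page_height, cell=100):
    """Bucket raw (x, y) click coordinates into cell x cell pixel bins and
    return the bins sorted hottest-first as ((col, row), click_count)."""
    bins = Counter(
        (x // cell, y // cell)
        for x, y in clicks
        if 0 <= x < page_width and 0 <= y < page_height  # drop out-of-page clicks
    )
    return bins.most_common()
```

Comparing the hottest bins against the position of your CTA tells you, roughly, whether attention is landing where you want it; dedicated heatmap tools render the same idea as a color overlay.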

Just so you know, you can generate heatmaps of your website in VWO at no extra cost. Not a VWO customer yet? Sign up for our free trial here.

d. Google Analytics

You might know this already, but exploring the data in your Google Analytics (GA) account can give you some great ideas to get started.

Checking for maximum drop-offs in your funnel to find problematic pages is conversion 101. You can also start with landing pages that receive good traffic but have a high bounce/exit rate.

Dig into your site search by following this path: Behavior > Site Search > Search Terms

This will give you many insights:

  • The most popular products on your website (optimize their individual product pages, and feature them higher in navigation and on category pages for easier visibility and more conversions)
  • Products people are looking for that might not be available on your website (consider adding them to your site)
  • The words people use to find products (use those same words on your product pages for better conversions)
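As a rough illustration of that last point, here's a minimal sketch (assuming you've exported your Search Terms report as a plain list of queries; the function name and sample data are made up) that normalizes raw queries and counts the most frequent terms:

```python
from collections import Counter

def top_search_terms(queries, n=10):
    """Normalize raw on-site search queries (case and extra whitespace)
    and return the n most frequent terms with their counts."""
    normalized = (" ".join(q.lower().split()) for q in queries)
    return Counter(normalized).most_common(n)
```

Normalizing before counting matters: 'Red Shoes' and 'red shoes' are the same intent, and only the merged count tells you how popular the product really is.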

Test with Confidence

The quality of your insights, and knowing how to use them, is what sets solid hypotheses apart from random testing.

When you hear data points like 'only 1 out of 8 A/B tests drives significant change' (from AppSumo), it's okay to feel skeptical. But when you've done your share of the grunt work, there's nothing to be afraid of. Document all your insights and how you can translate them into design, copywriting, or UX fixes. Psychological principles can also take you a long way.

After all this hard work, you might find you have multiple hypotheses to try. But instead of running in all directions, prioritize your tests and monitor them for both conversions and learnings.

Image Credit:
Conversion Rate Experts
KISSmetrics Slideshare
Craig Sullivan’s Slideshare
