(This is a guest post authored by Danny de Vries, Senior CRO Consultant with Traffic4U)

Every year, Conversion Optimizers around the world vie for the annual WhichTestWon Online Testing Awards, presented by an independent organization based in the USA. Anyone can enter the competition by submitting their A/B and multivariate test cases, which are then reviewed and judged on multiple factors. The most interesting and inspiring cases are eligible to win a Gold, Silver, or Bronze badge across a range of categories.

This year, twelve of the thirty test case winners of the 6th annual international WhichTestWon Online Testing Awards are Dutch. With one Gold award, two Silver awards, and an honorable mention, Traffic4u emerged as one of the strong pillars of Dutch optimization prowess. This post covers our three award-winning A/B test cases, starting with the Gold award winner.

De Hypothekers Associatie: Users Need Guidance

The test case of De Hypothekers Associatie, the biggest independent mortgage consultancy service in the Netherlands, received a Gold award in the category ‘Form Elements’. As a consultancy firm, they rely on advising clients about mortgages and related financial decisions. However, before contacting a consultancy, users typically want to understand for themselves what their financial possibilities are regarding mortgages and buying property. A user who has only just begun exploring options is therefore unlikely to contact De Hypothekers Associatie or book an appointment.

Case Situation

To empower users to research their mortgage options, De Hypothekers Associatie created several pages on which users could calculate their maximum mortgage loan, monthly mortgage payments, and so on. The experiment included the control page shown below, on which users could calculate their mortgage loan:

Translated version of the control page

Hypothesis

Previous A/B tests on the De Hypothekers Associatie website, in particular tests of call-to-action buttons, had clearly shown that users need guidance. For instance, a button that said ‘Next step’ significantly outperformed CTAs with copy like ‘Contact us’ and ‘Advise me’. This result implied two things:

  • Users want information in small digestible chunks
  • Users like to explore what lies ahead instead of being plainly told what the next step is

The follow-up action was to apply this insight to the calculation page, as the lack of guidance could potentially result in fewer mortgage appointments and paying clients.

The hypothesis was that users need to be guided through the process of calculating the maximum loan amount they could receive. The test variation of the loan calculation page included a clear step-by-step flow guiding users through the calculation, in stark contrast with the control, which had a much plainer flow. The assumption was that guiding users through the calculation process would lead to more calculations and hence more appointments for the consultancy. A screenshot of the variation is shown below.

De Hypothekers Associatie - variation for the A/B test

Results

Guiding customers through the loan-calculation process resulted in a statistically significant uplift of more than 18% in the number of loan calculations on that page. The number of mortgage appointments also increased by more than 18%.
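
For readers who want to see how a result like this is typically checked for statistical significance, here is a minimal sketch of a two-proportion z-test in Python. The visitor and conversion counts are hypothetical, since the post does not publish the underlying sample sizes; only the roughly 18% relative uplift mirrors the reported result.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (relative uplift of B over A, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-sided normal tail
    return (p_b - p_a) / p_a, p_value

# Hypothetical counts: 4.00% vs. 4.72% conversion to a completed calculation.
uplift, p = two_proportion_z_test(conv_a=400, n_a=10_000, conv_b=472, n_b=10_000)
print(f"uplift: {uplift:.1%}, p-value: {p:.4f}")   # uplift: 18.0%, p-value: 0.0127
```

A/B testing tools typically report significance for you; the sketch only shows what ‘statistically significant’ refers to in a result like the one above.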

Why Do Users Need Guidance?

It goes without saying that mortgages are boring and complex. But they become a necessity when you are (or want to be) a homeowner. Moreover, taking out a mortgage is a high-stakes financial decision that isn’t typically made in a day or without sufficient information. Because of this, people need advice on where to begin, what steps to take, what the possibilities are, and which options suit their situation best. The test results show that including clear guidance on the steps to follow can result in a statistically significant uplift in conversion.

Fietsvoordeelshop: Display Customer Savings Prominently

In the category ‘Copy Test’, the A/B test of Fietsvoordeelshop received a Silver Award. Fietsvoordeelshop is one of the leading bike web shops in the Netherlands, offering an assortment of bikes from top brands at discounted prices.

Case Situation

The website lacked a prominently visible indication of the actual discount users would get on the different products. Discounts were displayed in orange text right next to the big orange CTA button.

Control Image - for A/B Test

Hypothesis

It was hypothesized that Fietsvoordeelshop was losing potential sales by not showing customer savings effectively. We expected that making the savings prominently visible would increase the click-through rate to the shopping cart. The discount, which was shown in orange text as Uw voordeel: €550,00 (‘Your savings: €550.00’), was changed to a more visible green badge that contrasted with the orange CTA button (here’s more on the importance of contrast in design). See the variation below:

Variation Image - for A/B Test

Results

Results showed that the variation outperformed the control with a statistically significant 26.3% uplift in shopping cart entries. So it’s one thing to offer discounts on products, but unless the benefit clearly stands out, users are likely to miss it and never convert.

Follow-through and Stay Consistent

Although we found an increase in click-throughs to the shopping cart, we didn’t see this effect (or anything close to it) in the checkout steps following the shopping cart entry. The reason could be that the discount badge was only shown on the pages before ‘add to shopping cart’ and not on the subsequent checkout pages. To sustain the positive influence, it might be a good idea to retain the badge all the way through the checkout. However, it still has to be tested whether repeatedly showing the savings during the final steps of the checkout leads to an increase in actual sales.
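
As an illustration of what this kind of follow-through analysis looks like, here is a minimal sketch that compares how many users reach each funnel step in the control and in the variation. All counts are hypothetical; the post only reports the shopping-cart uplift.

```python
# Hypothetical funnel counts for control vs. variation (equal traffic split).
# Only the first step mirrors the reported 26.3% uplift; the rest is made up
# to illustrate an upstream effect dissipating further down the funnel.
funnel = {
    "product page -> shopping cart": (1_000, 1_263),
    "shopping cart -> checkout":     (  600,   640),
    "checkout -> completed sale":    (  300,   305),
}

for step, (control, variation) in funnel.items():
    uplift = (variation - control) / control
    print(f"{step:31} uplift: {uplift:+.1%}")
```

If the uplift shrinks step by step like this, the extra shopping-cart entries are not turning into extra sales, which is exactly the kind of signal that would justify testing the badge throughout the checkout.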

Omoda: Icons Perform Better (on mobile devices)

The second Silver Award-winning test case belongs to the Dutch shoe retailer Omoda. It took second place in the category ‘Images & Icons’. Omoda is one of the top shoe retailers in the Netherlands, offering a range of shoes from world-class brands for women, men and kids. The case serves to show how important it is to segment your test results. Read more about visitor segmentation and how it can help increase website conversions.

Case Situation

Each of the Omoda product pages features Omoda's unique selling points (USPs). While these were placed near the ‘Plaats in shopping bag’ (‘Add to shopping bag’) call-to-action and were definitely visible, we believed they weren’t visible enough. The reasons:

  • The USPs appeared in a bulleted list, but the list blended too well with the other text on the page and did not command attention.
  • The page also included a big black area for customer service elements. Because the page was largely white, the black areas would get more attention, distracting users from the primary goal of the page – viewing shoe details and adding the product to the shopping bag.

Below is an image of the control version:

Omoda Control for Multivariate Test

Hypothesis

The hypothesis was that addressing both of these issues would lead to an increase in sales. We set up a multivariate test, which allowed us to test both assumptions: the USPs aren’t visible enough, and the black area is too distracting. All variations are shown below:

Combination 2

Omoda - Combination 2 for Multivariate Test
Combination 2: changing the black color to a more neutral grey and moving the customer review rating to the top of the box

Combination 3

Omoda Combination 3 for Multivariate Test
Combination 3: using icons and black text instead of grey text to let the USPs stand out better

Combination 4

Omoda Combination 4 for Multivariate Test
Combination 4: using elements from Combination 2 and Combination 3
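
For readers less familiar with multivariate testing, the sketch below enumerates the same 2x2 full-factorial design in Python: two factors with two levels each produce the four combinations described above (including the control). Factor and level names are illustrative, not Omoda's actual copy.

```python
from itertools import product

# Two factors under test; level names are illustrative.
factors = {
    "usp_style":   ["grey text (control)", "icons with black text"],
    "service_box": ["black (control)", "neutral grey"],
}

# A full-factorial multivariate test assigns traffic to every combination,
# so the effect of each factor (and their interaction) can be estimated.
for i, levels in enumerate(product(*factors.values()), start=1):
    combo = ", ".join(f"{name}: {level}" for name, level in zip(factors, levels))
    print(f"Combination {i} -> {combo}")
```

With this ordering, Combination 1 is the control, Combinations 2 and 3 change one factor each, and Combination 4 changes both, matching the variations shown above.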

Results

The overall results for this test told us that the hypothesis should be rejected; there was no convincing proof that any combination would perform significantly better or worse than the control. However, through segmentation we found that the hypothesis did hold on mobile devices, where it resulted in a whopping 13.6% uplift in sales. The overall results had looked inconclusive because this mobile uplift was offset by a 5.2% drop in sales on desktop and tablet devices.
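
To make the segmentation point concrete, here is a minimal sketch with made-up visitor and sales counts, chosen only so that the per-segment uplifts roughly mirror the reported +13.6% and -5.2%. It shows how opposite effects on different devices can blend into an inconclusive overall number.

```python
# Hypothetical counts: (conversions, visitors) for control and variation per segment.
segments = {
    "mobile":         ((500, 20_000), (568, 20_000)),
    "desktop/tablet": ((900, 30_000), (853, 30_000)),
}

def uplift(control, variation):
    (c_conv, c_n), (v_conv, v_n) = control, variation
    return (v_conv / v_n - c_conv / c_n) / (c_conv / c_n)

# Per-segment view: a clear win on mobile, a drop on desktop/tablet.
for name, (control, variation) in segments.items():
    print(f"{name:15} uplift: {uplift(control, variation):+.1%}")

# Blended view: pooling all devices hides both effects.
ctrl_all = (sum(c for (c, _), _ in segments.values()),
            sum(n for (_, n), _ in segments.values()))
var_all  = (sum(c for _, (c, _) in segments.values()),
            sum(n for _, (_, n) in segments.values()))
print(f"{'all devices':15} uplift: {uplift(ctrl_all, var_all):+.1%}")
```

In this illustration the blended uplift is only about +1.5%, which is exactly why segmenting the results before drawing conclusions matters.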

Users Behave Differently on Different Devices

The results of this test show both the device-dependency of a hypothesis and the effectiveness of using icons to make USPs stand out better. On the basis of this test, we recommend that you always segment your test results to observe the effect of the hypothesis across different dimensions, rather than making decisions blindly.

In light of previous A/B tests, we believe the reason icons perform better on mobile is that desktop and tablet users are more likely to click on the prominent USPs (like terms of payment or delivery) in order to see more details. But since the USPs aren’t clickable, desktop users are not able to get any additional information. This could irk potential buyers and make them bounce away. On a mobile device, however, with less screen real estate and a device less suited to opening multiple tabs, users are less likely to search for additional information.

Understand What Drives Your Visitors And Keep Testing

The above cases have one thing in common. No, it’s not the awards. The commonality is that in each of these cases, we were able to successfully ‘assume’ what drove website visitors. Research using data and/or user feedback told us that a certain effect was occurring. We put this understanding into perspective (depending on the type of website and/or product, device, seasonality, user flow, etc.) and made certain assumptions about the possible causes of these effects. Then we used A/B and multivariate testing to check whether our assumptions were correct. Testing, in fact, is all about learning from your website visitors.
