
VisitNorway.com’s adventures in the land of A/B testing

This case study was first published by VisitNorway.com, one of our customers, to share their experience of running A/B tests on their website. The original blog post was in Norwegian; we have translated it into English and reproduce it below.

We are constantly striving to improve Visitnorway.com, and we have generally used traditional analytics tools, along with usability tests and surveys, to assess the needs of our visitors and customize content and functionality accordingly.

In January, we began using a new A/B testing tool [Visual Website Optimizer] to quickly and effectively measure the impact of changes in text, design and functionality.

When we decide to conduct a test (based on input from users, experts or our own hypotheses), we set up the variations we want to test against the original. The results then help us decide whether or not to implement a variation on the website permanently.
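
To make the mechanics concrete: a testing tool splits incoming visitors between the original and the variations, typically in a sticky way so that a returning visitor keeps seeing the same version. Here is a minimal illustrative sketch of that idea in Python (not VWO’s actual implementation; the visitor ID and variation names are made up):

```python
import hashlib

def assign_variation(visitor_id: str, variations: list) -> str:
    """Deterministically bucket a visitor: hashing the ID means the
    same visitor always lands in the same variation on repeat visits."""
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return variations[int(digest, 16) % len(variations)]

# Hypothetical headline test: original against two alternatives.
titles = ["Original title", "Variation 1", "Variation 2"]
print(assign_variation("visitor-42", titles))
```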

For example, it is very easy to test which title gets the most clicks on an article. We just click on the title and change it; see the screenshot below:

Results from some of the tests we have conducted

1) Clicks to destination company websites from landing pages

We wanted to find out whether a simple design change on landing pages could increase the number of clicks to the destinations’ websites. To start with, we simply increased the size of the link text and made it bold.

The test was done on the VisitOSLO pages of visitnorway.com. We got a 70 – 75% increase in click-through rate simply by enlarging the link text and making it bold; see the screenshot below.

We therefore kept this change on the landing pages, and visits to the destination companies’ websites increased.

2) What should the “Booking” item in the top menu be called?

We tested different names for the “Booking” menu item in English, Norwegian and Spanish, hoping to lead even more of the visitors interested in booking a holiday into the booking section.

For English, the text “Booking” won with a 39% improvement over the original “Book Online”, while “Online booking” showed a 14% improvement over the original.

The Norwegian word for “Order” won against the original “Book travel” with an improvement of 114% (!). See the screenshot below for an overview of the other variants.
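
As a side note, the “improvement” percentages in these tests are relative lifts in conversion rate. Using the raw counts shared later in the comments for this Norwegian test (14 conversions out of 561 visitors for the winner versus 6 out of 515 for the original), the arithmetic works out like this:

```python
control_conv, control_n = 6, 515    # original "Book travel"
variant_conv, variant_n = 14, 561   # winning Norwegian "Order"

cr_control = control_conv / control_n   # ~1.17% conversion rate
cr_variant = variant_conv / variant_n   # ~2.50% conversion rate

# Relative lift: how much better the variation converts than the original.
lift = (cr_variant - cr_control) / cr_control
print(f"Relative improvement: {lift:.0%}")   # -> Relative improvement: 114%
```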

In the Spanish menu, we wanted to see whether it was best to write “Booking” in Spanish or in English. Since Book Norway does not yet have a Spanish version, we were also curious about the bounce rate.

In terms of how many people went to the booking section, the English “Booking” (Variation 1) saw a 48% decrease relative to “Reservas” (Control). See the screenshot below.

Bounce rate was similar for both of them, so we eventually kept “Reservas” in the Spanish menu.

Based on these results, we changed the text in the menu to the “winners” of these three tests.

3) Moving the “Order” link in the main menu for the Norwegian edition

We wanted to see how much the number of clicks to Book Norway changed when we moved the “Order” link from the second position to the second-last and last positions in the main menu.

We saw a 72 – 73% decline in the number of clicks; see the screenshot below.

So, as the numbers speak for themselves, we did not move the “Order” link.

Conclusion and Lessons

A/B testing is an important method for us to determine whether it is wise to move forward with new concepts or changes before we spend a lot of time and money on design and development. We will use it actively to get real data for decision-making.

Our findings show that one must be careful about what “booking” is called in the various languages: words are tremendously powerful.

Finally, we chose Visual Website Optimizer because it was easy to use and had enough functionality to let us carry out the tests we wanted.

Happy testing!


Comments (8)

  1. – You had fewer than 100 conversions per treatment.
    – You had only a few visitors taking part in the test.
    – Your confidence level was below 95%.
    – You probably did not run the tests long enough, given the traffic.
    I would normally declare the tests you showed us as not statistically valid, i.e. failed.

  2. Agree with Jan here. The sample size is far too small to declare any of these tests statistically valid.

VisitNorway – it would be great if you could switch the tests back on, gather more data, and report back on the results!

    1. Right, the tests are not statistically significant, so the results cannot be trusted. Still, we found it worthwhile to share what kinds of tests VisitNorway.com conducted.

  3. Thank you for all your comments and insights.

    We are new to using VWO and similar tools for A/B testing, so our routines and criteria for evaluating the results are still not settled. We are trying to see what works for our site, and for different types of tests. We also use Google Analytics and Webtrends to analyze the changes over the long term, because even if you get a valid test result, the market, trends and user behaviour may change over time.

    So for the booking text changes we performed in different languages, we also measured bounce rate, time spent on site, pages per visit and transactions. And I agree we don’t have enough conversions in the tests to verify this, but we still monitor it in our other analytics and are constantly re-evaluating its effect.

    We currently have these test settings:
    Declare winner: 85% chance of beating original
    Declare loser: 5% chance of beating original
    Minimum visitors before declaring winner: 100

    And as far as I know, we don’t have the possibility to set a minimum number of conversions before declaring the winner, which would have been a better target.

    Jan:
    Do you mean 100 conversions per variation or a total of minimum 100 conversions for the whole test?

    Is it too few visitors for all of the tests? For example, the test for the change of “Order” in Norwegian had 3,260 visitors, and the best variation had 14/561 against 6/515 with a 95% chance of beating the original. So what combination of conversions/visitors should this test have had before you would trust the results? (See the sketch after this comment for a quick check of these numbers.)

    I will clone and enable the test for the link to our destination companies’ websites, since we have not implemented this yet. Then we can check how the result changes when it runs for a longer period of time.
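
For reference, Kjetil’s numbers above can be checked directly with a standard two-proportion z-test. A minimal sketch in plain Python, using the normal CDF via math.erf (this is the textbook test, not necessarily the exact statistics VWO computes):

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """One-sided z-test: is B's conversion rate higher than A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))  # 1 - standard normal CDF
    return z, p_value

# Control "Book travel": 6/515; winning "Order" variation: 14/561.
z, p = two_proportion_z_test(6, 515, 14, 561)
print(f"z = {z:.2f}, one-sided p = {p:.3f}")   # roughly z = 1.61, p = 0.053
```

That lands almost exactly on the one-sided 95% threshold, consistent with the “95% chance of beating the original” reported above; but with only 20 conversions in total, a handful of conversions either way would flip the conclusion, which is exactly Jan’s concern.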

  4. Hi Kjetil,

    – at least 100 conversions per variation
    – and at least 2,000 – 3,000 visits per variation (this is not a golden rule; it’s just how we run our tests whenever possible on low-traffic pages)
    – a minimum 95% confidence level
    Otherwise, if you make a decision on a low volume of visitors and conversions, you could end up making a bad decision that costs your company money. (A rough sample-size calculation for these thresholds is sketched after this comment.)
    I use VWO as well. If you are interested, then contact me at jan @ proimpact7 .com and I can show you via web meeting how we run the tests and the things you need to be aware of.
    NOTE: as we speak, you won’t be able to access our site, as it was attacked by hackers who apparently planted some ‘malicious software’ in our blog and consequently Google blocked our site, so you get a warning message when entering the site at the moment. We are fixing it, and hopefully it will be solved in the next 48 hours.
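
Jan’s visit targets can be sanity-checked with the standard sample-size formula for comparing two proportions. A rough sketch follows; the 95%-confidence and 80%-power z-values are conventional defaults, and the baseline rate and lift below are illustrative assumptions (the baseline roughly matches the Norwegian test), not figures from the post:

```python
from math import ceil, sqrt

def visitors_per_variation(base_rate, lift, z_alpha=1.96, z_power=0.84):
    """Approximate visitors needed per variation to detect a relative
    `lift` over `base_rate` at 95% confidence and 80% power."""
    p1, p2 = base_rate, base_rate * (1 + lift)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return ceil(n)

# Illustrative: ~1.2% baseline conversion rate and a hoped-for 100% lift.
print(visitors_per_variation(0.012, 1.0))   # on the order of 2,000 visitors
```

Even a dramatic doubling of a ~1% conversion rate needs roughly 2,000 visitors per variation, which lines up with Jan’s 2,000 – 3,000 figure; the more modest lifts most tests chase require far more traffic.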

  5. I recommend having a look at this blog post, which deals with the consequences of ending a test too soon. Basically, I describe the typical problems with tests where, after a few days, your testing tool tells you that your tested variation won and the test is statistically valid.
    In this real test example you will see that, if we had ended the test as soon as the testing tool declared the winning variation the winner, we would have reported a 731.74% greater increase in conversion than the actual result: goo.gl/qhD1A.
    The tool I used was Visual Website Optimizer. (A small simulation of this early-stopping effect is sketched after this comment.)
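
The early-stopping effect Jan describes can be reproduced in a small simulation: run an A/A test (two identical variations), “peek” at the z-score after every batch of visitors, and stop as soon as it looks significant. A minimal sketch, where the conversion rate, batch size and run count are arbitrary assumptions:

```python
import random
from math import sqrt

def aa_test_with_peeking(rate=0.02, batch=100, max_n=5000):
    """Return True if peeking after every batch ever declares a
    'significant' winner, even though A and B are identical."""
    conv_a = conv_b = n = 0
    while n < max_n:
        conv_a += sum(random.random() < rate for _ in range(batch))
        conv_b += sum(random.random() < rate for _ in range(batch))
        n += batch
        p_a, p_b = conv_a / n, conv_b / n
        pool = (conv_a + conv_b) / (2 * n)
        se = sqrt(2 * pool * (1 - pool) / n) if pool > 0 else 0.0
        if se and abs(p_b - p_a) / se > 1.96:   # nominal 95% threshold
            return True   # a peeking tester would stop and ship this "winner"
    return False

random.seed(1)
runs = 200
false_wins = sum(aa_test_with_peeking() for _ in range(runs))
print(f"False 'winners': {false_wins}/{runs}")  # expect well above the nominal 5%
```

Because the stopped tests are exactly the ones caught at a random high point, the lift measured at the moment of stopping is also systematically exaggerated, which is the kind of overstatement Jan reports.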

  6. Thank you for sharing, Jan. I agree that it is important to test over time to get data from all days and time zones (countries). We try to achieve this by sending a set percentage of our traffic into our tests.

