This case study was first published by VisitNorway.org, one of our customers, who have shared their experience of running A/B tests on their website. The original blog post was in Norwegian; we have translated it into English and reproduce it below.
We are constantly striving to improve Visitnorway.com. We have generally used traditional analytics tools, along with usability tests and surveys, to assess the needs of our visitors and customize content and functionality accordingly.
In January, we began using a new A/B testing tool [Visual Website Optimizer] to quickly and effectively measure the impact of changes in text, design and functionality.
When we decided to conduct a test, based on input from users, experts or our own hypotheses, we set up the variations we wanted to test against the original. The results then helped us decide whether or not to implement a variation on the website permanently.
For example, it is very easy to test which title gets the most clicks in an article. We just click on the title and change it; see the screenshot below:
Results from some of the tests we have conducted
1) Click to destination company websites from landing pages
We wanted to find out whether a simple design change on landing pages could increase the number of clicks to the destinations’ websites. At first, we only increased the size of the link text and made it bold.
The test was done on the VisitOSLO pages of visitnorway.com. We got an increase of 70–75% in click-through rate simply by making the link text larger and stronger; see the screenshot below.
We therefore rolled out this change on the landing page, increasing visits to the destination company’s website.
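The percentage improvements quoted throughout this post are relative lifts in click-through rate. As a rough illustration of that arithmetic (using made-up traffic numbers, not figures from the actual test), the calculation looks like this:

```python
def ctr(clicks, visitors):
    """Click-through rate: fraction of visitors who clicked."""
    return clicks / visitors

def relative_lift(control_rate, variation_rate):
    """Relative improvement of the variation over the control, as a percentage."""
    return (variation_rate - control_rate) / control_rate * 100

# Hypothetical numbers for illustration only (not from the actual test):
control = ctr(40, 1000)     # 4.0% CTR with the original link text
variation = ctr(70, 1000)   # 7.0% CTR with larger, bold link text

print(f"{relative_lift(control, variation):.0f}% lift")  # prints "75% lift"
```

Note that the lift is relative to the control's own rate, which is why a three-percentage-point change in CTR can show up as a 75% improvement.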
2) What should the “Booking” item in the top menu be called?
We tested different names for the menu item “Booking” in English, Norwegian and Spanish, hoping to lead even more of those interested in booking a holiday into the booking section.
For English, the text “Booking” won with an improvement of 39% over the original “Book Online”. “Online booking” showed a 14% improvement over the original.
The Norwegian word for “Order” won with an improvement of 114% (!) over the original “Book travel”. See the screenshot below for an overview of the other variants.
In the Spanish menu, we wanted to see whether it was best to write “Booking” in Spanish or English. Since Book Norway does not yet have a Spanish version, we were curious about the bounce rate.
In terms of how many people went to the booking section, the English “Booking” (Variation 1) saw a decrease of 48% relative to “Reservas” (Control). See the screenshot below.
Bounce rate was similar for both, so we kept “Reservas” in the Spanish menu.
Based on these results, we changed the text in the menu to the “winners” of these three tests.
3) Moving the “Order” link in the main menu for the Norwegian edition
We wanted to see how much the number of clicks to Book Norway changed when we moved the “Order” link from the second position to the second-last and last positions in the main menu.
We saw a decline of 72–73% in the number of clicks; see the screenshot below.
The numbers spoke for themselves, so we did not move the “Order” link.
Conclusion and Lessons
A/B testing is an important method for us to determine whether it is wise to move forward with new concepts or changes before we spend a lot of time and money on design and development. We will continue to use it actively to get real decision-making data.
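The post does not describe how the team judged whether a result was trustworthy (Visual Website Optimizer reports this in its dashboard), but a common generic check that a difference in click-through rates is not just noise is a two-proportion z-test. A minimal sketch, again with hypothetical traffic numbers:

```python
from math import sqrt, erf

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-sided two-proportion z-test on click-through rates.

    Returns the z statistic and the p-value for the null hypothesis
    that both versions have the same underlying click rate.
    """
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    # Pooled rate under the null hypothesis of no difference
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers for illustration only:
z, p = two_proportion_z(40, 1000, 70, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If the p-value falls below a chosen threshold (commonly 0.05), the observed difference is unlikely to be chance alone; otherwise the test needs more traffic before a variation is declared the winner.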
Our findings show that one must be careful about what “booking” is called in the various languages: words are tremendously powerful.
Finally, we chose Visual Website Optimizer because it was easy to use and had enough functionality to let us carry out the tests we wanted.