Adding Customer Reviews Increased Revenue by 7.5%
Disclaimer: The winner of this case study hadn’t been revealed when the post below was first published. Please watch for an update in this space on October 16, 2014.
This post has been updated. Please scroll down to see which version won.
Czc.cz is a leading computers and electronics online store in the Czech Republic. They carry a wide range of products, from mobile phones, laptops, and gaming devices to other electronics and IT specialties.
To encourage first-time visitors to buy from their website, they decided to test adding ratings from Heureka, one of the Czech Republic’s most popular price comparison sites. This site-wide test was run on all product pages.
The good part about adding the Heureka widget is that it runs a script that shows real-time customer reviews, ratings, and statistics to visitors. Because of initial hesitation about sharing customer details with another site, they decided to test the badge on only 50% of their traffic.
Here’s how the original product page looked:
Besides testing the impact of the Heureka badge on the product pages, Czc also wanted to place the badge strategically on the page for maximum impact. So Tomas from Czc created four versions of the product page and tested them against the original.
Here are the four variations that were tested:
Version 1: Heureka badge along with ratings just below the add to cart button
Version 2: Only the badge just below the add to cart button
Version 3: Slide-in ratings on the right side
Version 4: Slide-in ratings on the left side
If you hover over the ratings sidebar, it expands to show details of ratings and reviews like this:
More than 90,000 visitors became part of the test, with revenue tracking as the primary goal.
Variation 4, with slide-in ratings on the left, emerged as the winner, recording a 7.5% increase in revenue at 95% statistical significance.
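For readers curious about what “95% statistical significance” means in practice, here is a minimal sketch of the standard two-proportion z-test often used to compare an A/B test variation against its control. The visitor and conversion counts below are hypothetical, chosen only for illustration — czc.cz did not publish its raw numbers, and its primary metric was revenue rather than conversion rate.

```python
# Hypothetical two-sample z-test for an A/B test's conversion rates.
# The counts below are made up for illustration; the actual czc.cz
# figures were not published in this case study.
from math import sqrt
from statistics import NormalDist

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Return the two-sided p-value for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))           # two-sided p-value

# Control vs. one variation, ~18,000 visitors each (90,000 split 5 ways)
p = ab_significance(conv_a=540, n_a=18000, conv_b=610, n_b=18000)
print(f"p-value: {p:.4f}")  # significant at the 95% level if p < 0.05
```

A p-value below 0.05 corresponds to the 95% confidence threshold mentioned above; testing platforms typically run this kind of calculation (or a revenue-per-visitor t-test) automatically.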
Why Did Variation 4 Win?
We got a lot of responses on different platforms, and most of them favored Version 1 as the winner. Let’s look at each of the versions again, in the order in which they increased conversions during the test.
Because of the F-shaped reading pattern on the web, it is quite likely that the slide-in ratings on the right side were easily missed in visitors’ eye paths. The data confirmed this: there was no improvement in conversion rate for Version 3.
This version had only the Heureka badge, with no reviews or ratings. Chances are, visitors unfamiliar with the badge would click on it just to find out more. And once they were on a new website, it would have been easy to get distracted by competitors’ reviews and start considering other purchase options, which would have significantly decreased prospects’ motivation.
Many vouched for Version 1 as the winning variation. The first time we looked at the test data internally, all of us unanimously voted for it too. But even Version 1 wasn’t the winner of this test. Now that I think about it, a few reasons probably explain why this variant also turned out to be a loser:
1) Czc.cz recorded the number of times people clicked on the Heureka badge in each variation. A considerable number of people clicked on the circular badge in both Versions 1 and 2, while no one clicked on it in Versions 3 and 4, where the badge was displayed within the slide-in ratings.
Thus, the badge might have acted as a distraction and caused people to click through to the Heureka site.
2) While making a purchase, people tend to look at many options within their budget, which means they view a lot of product pages before actually buying. Ratings and reviews placed right next to the “add to cart” button on every product page therefore increase the likelihood of the badge being clicked to verify its claims.
Variation 4 emerged as the winner because the prominent yellow review bar on the left-hand side did only the job it was required to do: let people know the website could be trusted. Hovering over it displayed several ratings and numbers that gave a great boost to the site’s credibility.
Plus, the placement of the widget on the left didn’t interfere with the customer experience, yet it remained very noticeable by falling within visitors’ F-shaped reading pattern of the page.
I am really thankful to all those who guessed the winning version and shared some great insights.
This test was an interesting one for many reasons. Had the company tested only version 3 — with slide-in ratings on the right — against the control, they would have wrongly concluded that reviews don’t work on their website. And had they tested any one of variations 1, 2 or 4 against the control, they would still have got a winner.
Hence, this case study shows that the scope for testing is virtually unlimited, and the benefits you can reap from a single test can be astonishing. I wonder whether they could test yet another variation, a No. 5 maybe, that would lead to an even bigger increase in revenue.
Let me know if you have any ideas. I will pass them on to czc.cz! 🙂