How WriteWork.com increased sales by 50% & doubled conversions by A/B testing a radical new design
Sometimes it takes a radical redesign to get big results. ClickLab is a Brazilian agency specializing in Conversion Rate Optimization, and they used Visual Website Optimizer to test a radical redesign of the primary entry page on WriteWork.com, a popular essay website for students.
They chose to focus on the page that receives the most entrance traffic. That traffic is almost exclusively organic, and the page had been struggling with high bounce rates for a while. The objective was to increase engagement and move more people further down the funnel.
After crunching user surveys and gathering feedback from tools such as UserTesting.com, Fivesecondtest.com, Gazehawk.com, and ConceptFeedback.com, a radical redesign was developed and recently tested using Visual Website Optimizer.
Targeting A/B test to a specific visitor segment
One challenge was how to go about testing the new design: pretty much everything was different, from header to footer. This meant a visitor might land on the homepage, which had one look, and then click through to the redesigned page, which looked entirely different.
Luckily, this is easily solved in Visual Website Optimizer. Since the primary objective was to increase engagement on the landing page, the following segmentation trick was used:
By restricting the test to only the visitors landing on the page, engagement could be measured without having to worry about the design not being consistent throughout the site.
The original page didn’t communicate the benefits of the service. Here is how the original design looked:
After reviewing over 1,000 survey responses, it became clear what the real benefits were, and the variation communicated those benefits much more directly. In addition, social proof and various credibility indicators were introduced. Here is how the variation looked:
A/B test results
The result of the test was that engagement more than doubled. One important thing to note is that the engagement goal in Visual Website Optimizer (VWO) measures only clicks, not form submissions such as searches (for this reason, searches were measured separately).
The great thing about VWO is that you can measure multiple goals. This made it possible to see exactly where the engagement was happening: the biggest increase was indeed in clicks on the call-to-action (CTA). The four top buttons accounted for less than 1% of the engagement increase.
The secondary objective was to get people to click on the primary CTA, and here clicks increased by 144%.
This meant that more than twice as many people continued on to the payment page. More payment-page views don’t automatically mean more payments, but a follow-up test showed that payments went up by over 50%.
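As a quick sanity check on figures like these, lift is simply the relative change between two conversion rates. Here is a minimal sketch in Python using hypothetical visitor and click counts (the case study does not publish the actual WriteWork numbers, so these are illustrative only):

```python
def lift(control_conversions, control_visitors, variation_conversions, variation_visitors):
    """Relative improvement of the variation's conversion rate over the control's."""
    control_rate = control_conversions / control_visitors
    variation_rate = variation_conversions / variation_visitors
    return (variation_rate - control_rate) / control_rate

# Hypothetical counts: 50 of 1,000 control visitors clicked the CTA,
# versus 122 of 1,000 visitors on the variation.
print(f"{lift(50, 1000, 122, 1000):.0%}")  # prints "144%"
```

A 144% lift therefore means the variation converted at roughly 2.44 times the control’s rate, which is why more than twice as many visitors reached the payment page.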
Actually, although the results were fantastic, the team wasn’t that surprised. After going over the user surveys, it was clear that the original page did a terrible job of selling the service.
Lessons from the A/B test
What made these fantastic results possible was starting with the user in mind and doing everything they could to understand their wants, their worries, and their needs. They also looked very closely at the language used on the website. For example, WriteWork had always used the expression “overcome writer’s block,” but no users used those words. Instead, users wanted to hear “get started” and “get inspiration,” so that is the language now used on the website.
Role of Visual Website Optimizer
Jens Schriver from ClickLab gave Visual Website Optimizer a nice, short testimonial:
It was a breeze to set up this A/B test and segment it. We’ve used Google Website Optimizer many times in the past, but – if we can avoid it – we are not going back 🙂
Hope you liked this case study. Our online case study library has many more A/B split testing case studies. And if you’d like to run a similar A/B test on your website, we have a free 30-day (no-obligation) trial of Visual Website Optimizer.