93.71% Increase in Clickthroughs by A/B Testing Pricing Page Redesign

Posted in Case Studies

The pricing page is one of the highest-impact pages on any website. For a SaaS business, it is particularly important because this is where serious buyers spend much of their time.

Lyyti.com, a Finland-based event management software company, found that their pricing page wasn’t clearly communicating the product’s features, so they decided to redesign it completely. But before hardcoding the changes, they ran an A/B test.

The Problem

After running several small A/B tests and having discussions with the Lyyti sales team, Sampsa Vainio, a conversion optimization expert, found that the original pricing page was not very user-friendly.

The heatmap and clickmap reports indicated that users were frequently moving between the pricing and the features pages. The short descriptions provided on the pricing page weren’t making much sense to visitors. The sales team gave the same feedback: prospects were not clear about which features were offered in the various plans.

Here’s the original page:

[Screenshot: original Lyyti pricing page (control)]

The Hypothesis

The hypothesis was that prominently displaying the features of all plans and having multiple CTA buttons would increase visits to the free trial signup page and, consequently, increase signups.

Here’s what the variation looked like:

[Screenshot: redesigned Lyyti pricing page (variation)]

The Test

Two completely different variations of the pricing page were created and served evenly to visitors. The test ran for over five months.
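(A quick aside: the traffic split itself was handled by the testing tool, but if you are curious how an even, consistent 50/50 assignment is typically implemented, here is a minimal sketch in Python. The function name and visitor ID below are hypothetical and are not part of Lyyti’s or VWO’s actual setup.)

```python
import hashlib

VARIANTS = ("control", "variation")

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor: traffic splits roughly 50/50
    overall, and the same visitor always sees the same version."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# Example with a hypothetical visitor cookie value:
print(assign_variant("visitor-42"))  # same ID -> same variant on every visit
```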

The Result

The neat new look won and increased visits to the lead generation page by 93.71% with a statistical significance of 96%.
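If you want to check the math yourself, here is a short sketch (ours, not VWO’s internal calculation) that reproduces both figures from the raw counts Sampsa shares in the comments below: 10 conversions out of 212 visitors on the original versus 18 out of 197 on the variation, using a standard one-tailed two-proportion z-test.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """One-tailed z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 0.5 * math.erfc(z / math.sqrt(2))         # one-tailed p-value
    return p_a, p_b, p_value

# Counts reported in the comments: control 10/212, variation 18/197
p_a, p_b, p_value = two_proportion_ztest(10, 212, 18, 197)
lift = (p_b - p_a) / p_a

print(f"Control rate:   {p_a:.2%}")          # ~4.72%
print(f"Variation rate: {p_b:.2%}")          # ~9.14%
print(f"Relative lift:  {lift:.2%}")         # ~93.71%
print(f"Significance:   {1 - p_value:.0%}")  # ~96%
```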

Here’s a quick comparison of the original and the challenger:

[Image: side-by-side comparison of the original and the challenger]

Why Did the Challenger Perform Better?

The variation had many elements of success that were missing in the original. Here’s what worked for them.

1) Features clearly tabulated in the pricing plan

In the new design, the features of each plan were clearly listed. And since the features appeared right alongside the pricing, it was easier for prospects to make a decision. The variation also clearly showed the additional features of the higher plans, something that was missing in the original design. With only short descriptions of each plan, the old design made it hard to differentiate between the plans.

2) Multiple CTAs saying the same thing

Compared to the original, the variation had five more CTA buttons asking visitors to sign up for a free trial.

In the original design, the message, “Try all the features and experience all the benefits of Lyyti for free,” was buried in the plan description and not immediately noticeable. In the variation, on the other hand, each plan had a free trial CTA button above and below it. Thus, the message that visitors could try all the features during the free trial period became far more prominent.

I would like to see how removing the upper CTAs affects conversions, as I feel they create clutter.

3) Shifting the focus of the page to one purpose

In the old design, there were four CTA buttons of the same size giving two different messages. In the challenger, there were six CTA buttons, all delivering a single message. The other CTA, “Request a Quote,” was made into a link and moved below the main CTA buttons.

Now the page has just one purpose — to ask visitors to sign up for a free trial.

In Sampsa’s words,

“We soon realized that asking for a quote might be asking too much since our company is not that known outside of Finland (which is our main market at the moment). That’s why we shifted the focus of the main goal of the page -> sign up for a free trial.”

Key Takeaways

1) One page, one purpose. There should be absolutely no element of distraction or point of conflict between choices for visitors. Having multiple goals will confuse visitors and can lead to a drop in conversions.

2) Learn about your visitors through every medium possible. Studying their behavior in your analytics tool, talking to your sales and support teams, or even speaking to visitors directly will give you great insights.

3) Don’t shy away from making tough changes, even if that means redesigning a page. (See how CrazyEgg redesigned their homepage to improve conversion rate by 363%)

4) Absolutely never, ever, forget to test! (If you need any more motivation to begin A/B testing, we have a 30-day free trial, no commitments, no credit card!)

A Parting Note

Optimizing your pricing page can give an immediate lift to your revenue. As Sampsa explains,

“The LTV (Life Time Value) of a single user who converts into a paid customer is pretty high for us, normally in the thousands of dollars. Also, the value is high since users who purchase a license usually stay with us for several years. All in all, the 94% increase will definitely be seen in our sales and we’ll do our best in converting those free users into paying customers.”

Have you completely overhauled any of your pages? How did it affect your conversions? Let’s take it forward in the comments.

Comments (8)

  1. Thanks for posting this. In addition to testing the layout and design of the pricing page, actual price testing can have a dramatic impact on the business.

  2. I hope that other people reading this will see the huge mistake that was made here!

    How the hell did this test run for 5 months!!! With such a high increase 90%+ this test should have been over in days not months!

  3. @David Shaw

#1 Would you like to elaborate on the “huge mistake”? =)

#2 Low number of visitors; we had to run it for a long time in order to get a statistically significant result.

  4. David Shaw: “How the hell did this test run for 5 months!!! With such a high increase 90%+ this test should have been over in days not months!”

According to the test writeup, the new design increased visits to the lead generation page by 94% with 96% statistical confidence. If conversion rates increased by 94% with only 96% confidence, the number of visitors in the test was probably fairly low. I don’t have any conversion rates to work from, so this is just a guess, but if the conversion rate for the control was 5% and the treatment increased that by roughly 94% to about 9.7%, we might only be looking at 350 total visitors (175 visitors -> 9 successes for the control and 175 visitors -> 18 successes for the treatment). If my visitor estimates are even remotely close, I believe they were probably correct to wait 3+ months rather than pull the trigger in months 1-2.

  5. I would be interested to see the visitor numbers for this test.

    If as the previous comment has pointed out the numbers are as low as we believe then this test has no validity at all.

    Are the visit and conversion numbers available for this test?

  6. Brian McKenzie (BM) says:

    “If as the previous comment has pointed out the numbers are as low as we believe then this test has no validity at all.” I would not bet my life that the test version would continue to win if the test ran for another year, but I feel that the test is more valid than making a coin flip (i.e. a true guess). That said, the results are not airtight enough for this test to be really exciting. Also, the dependent variable measured in this test is the percentage of viewers that go to the conversion page, NOT the percentage of viewers that actually convert. I think that’s understandable, given that the sample size on the conversion page would probably be VERY small over these 3 months (i.e. 25-50 visitors from this page, I’m guessing). In my experience, I’ve had quite a lot of A/B tests where we’ve been able to increase traffic to a conversion page WITHOUT increasing overall conversions — e.g. increasing traffic from page 1 to page 3 may actually *decrease* overall conversions if page 2 plays a critical role in convincing users to want to convert.

My Analytics suggestion would be to build a segment comparing three groups: the people that converted, the people that reached a conversion page WITHOUT converting, and the people that never made it to a conversion page. Generally, I find that successful converts view the most pages, followed by people that made it to the conversion page but did not pull the trigger, followed by people that left without getting as far as the conversion form. If this is the case for you as well, I’d recommend looking into which assisting pages correlate the highest with increased conversion rates (e.g. on various clients our key assisting pages have been promotion pages and city-specific pages). If you have any pages that stand out there, I’d recommend doing an A/B test encouraging traffic to flow towards those pages which have correlated with success.

    Best of luck with your work, everyone!

  7. That’s a great observation Brian, and we find similar cases on the VWO website as well. Based on the behavior of certain visitors, we can predict interest in the service and likelihood to sign up for a free trial (we’re still working on figuring out those most likely to convert to paid customers though).

  8. Thanks for the comments! Here are the visitor numbers and conversions:

    Conv/visitors

    Control: 10 / 212
    Variation 1: 18 / 197

I checked Google Analytics while providing information for this case study, and we actually saw a 90%+ increase even in signups for visitors coming from the variation.

