What works better: a video or an image slider? A/B test finds 31% increase in signups
Video often has an interesting effect on a website's conversion rate or sales. In many cases, video helps explain the service or the product, and hence increases sales. In other cases, it may actually reduce conversions (perhaps because it is a poor fit for that website, or because the video is poorly made or presented). We have previously featured several interesting case studies about videos. One was about A/B testing a video on a landing page, which increased conversions by 46%. Another was about testing ‘Watch a Video’ v/s ‘Get Instant Access’.
The only real way to know what a video will do for your website (or landing page) is to actually conduct an A/B test. And that’s what one of our customers, Device Magic, did on their homepage. They tested the existing version of their homepage (which had a video) against a new version with a jQuery-based image slider instead.
A/B test (video v/s image slider)
Device Magic Mobile Forms allows organizations to quickly and easily build robust data collection systems. Using their API, developers can build rich integrations featuring push technologies, e.g. for work orders, proof of delivery, maintenance reports, retail surveys, etc. They wanted to see whether a video or a series of rotating jQuery slides would work better for driving people to sign up, so they conducted a simple A/B test using Visual Website Optimizer.
In fact, they weren’t sure the video was of sufficient quality, and they suspected it was too technical. So they thought slides might present their offering more concisely.
Can you guess which version worked better?
Results and Lessons
They measured two goals: the % conversion from the home page to the signup page, and the number of signups completed. Initially their control (with the video) was beating the variation (with the image slider), and they couldn’t understand why. But they let the test run a while longer until they had much more data, and to their surprise the result inverted: the variation outperformed the control, which is what they had originally expected. And this result was statistically significant.
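Statistical significance for a conversion test like this is typically checked with a two-proportion z-test. The sketch below is illustrative only: the case study does not publish raw visitor or conversion counts, so the numbers here are made up to show how an early small sample can look inconclusive while a larger one is decisive.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test.

    conv_a/n_a: conversions and visitors for control,
    conv_b/n_b: conversions and visitors for variation.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical figures: 5,000 visitors per variation, a 10%
# control conversion rate, and a 35% relative lift (13.5%).
z, p = two_proportion_z(500, 5000, 675, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With a large enough sample, a lift of that size yields p well below the conventional 0.05 threshold; with only a few hundred visitors per arm, the same relative difference often would not.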
The image slider variation increased conversions from the homepage to the signup page by 35%, and total signups increased by 31%. So for Device Magic, the image slider indeed worked much better than the video.
A key learning from this test is to collect enough data that you are absolutely sure of the result, and not to be hasty, because some detail may be skewing the result or leading you in the wrong direction. Patience in A/B testing is the key takeaway here. (We have an online A/B test duration calculator to tell you how long you should run a test before giving up on it.)
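Behind any such duration calculator sits a standard sample-size formula. As a rough sketch (assuming a two-sided test at 95% confidence and 80% power, with an illustrative 10% baseline conversion rate; none of these figures come from the case study), the visitors needed per variation to reliably detect a given relative lift can be estimated like this:

```python
from math import ceil

def sample_size_per_variation(baseline, relative_lift,
                              z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variation for a two-proportion
    test. Defaults: 95% confidence (z_alpha) and 80% power (z_beta).
    """
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    # sum of per-arm Bernoulli variances (simple approximation)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    delta = p2 - p1
    return ceil((z_alpha + z_beta) ** 2 * variance / delta ** 2)

# e.g. a 10% baseline rate, hoping to detect a 35% relative lift
print(sample_size_per_variation(0.10, 0.35))
```

The formula makes the patience lesson concrete: the smaller the lift you want to detect, the quadratically more traffic you need, which is why an early peek at a test so often points the wrong way.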
If you have had a similar (or contrasting) experience testing videos v/s image sliders, do let us know in the comments below.