5 Advanced Email A/B Testing Best Practices
We know email marketing is effective. According to Copyblogger, email marketing yields an average ROI of 4,300% and is nearly 40X more effective at new customer acquisition than Facebook or Twitter. With 85% of American adults checking their email at least once per day, it’s a channel that can’t be ignored.
That said, you aren’t going to see big numbers like that if you aren’t actively testing the performance of your email campaigns. A/B testing is a great tool to help improve your email marketing performance – but only if you know what you’re doing.
Email A/B Testing Basics
A/B testing, as you may already know, involves presenting users with two options in order to see which alternative performs better. In the case of email A/B testing, that might mean sending half of your list one version of an email and the other half a different version, while you watch for changes in your open rate, click-through rate, or another KPI.
The best practices described below represent the foundation of an effective A/B testing program. If you’re already familiar with the general structure of A/B testing campaigns, feel free to skip to the next section. Otherwise, make sure you’ve mastered these basics before increasing the complexity of your program.
- Set a control version against which tests can be run. Don’t just pit two random emails against each other, then start fresh with two new ones. Always have a control version (often, the winner of previous tests) so that you’re working off of baseline performance values.
- Test a single variable at one time. If you change five variables in each email version you send out, you won’t know which of your changes actually contributed to any performance improvements you see.
- Make sure you’ve reached statistical significance before declaring a winner. Statistical significance helps you determine how likely it is that any lift you’re seeing is the result of the changes you’ve made, rather than random chance. Use a significance calculator to make sure your results are legit.
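If you’d rather check significance yourself than rely on an online calculator, the standard approach for comparing two conversion rates is a two-proportion z-test. Here is a minimal Python sketch using only the standard library; the recipient and click counts are made-up illustration numbers, not benchmarks:

```python
from math import sqrt, erf

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-tailed p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Convert the z-score to a two-tailed p-value via the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical test: 5,000 recipients per variant, 210 vs. 270 clicks
p = two_proportion_p_value(210, 5000, 270, 5000)
print(f"p-value: {p:.4f}")  # below the common 0.05 threshold, so the lift is significant
```

A p-value under 0.05 is the conventional cutoff, but the exact threshold is a business decision: the lower you set it, the more recipients each test needs before you can call a winner.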
Your email marketing solution should offer you A/B testing functionality, but even if it doesn’t, you can create your own testing protocols by manually segmenting lists and creating separate campaigns for each.
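If you do need to segment manually, the key detail is that the split must be random; sorting by signup date or alphabetically can bias one group. A minimal Python sketch of a reproducible random split (the example addresses are placeholders):

```python
import random

def split_for_ab_test(recipients, seed=42):
    """Randomly divide a recipient list into two equal-sized test segments."""
    shuffled = recipients[:]                 # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)    # seeded so the split is reproducible
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

emails = [f"user{i}@example.com" for i in range(1000)]
group_a, group_b = split_for_ab_test(emails)
print(len(group_a), len(group_b))  # 500 500
```

Export each segment as its own list in your email platform, send one variation to each, and compare the resulting metrics just as you would with built-in A/B testing.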
Advanced Email A/B Testing
Once you’ve mastered the basics, you’re ready to expand on your campaign’s fundamental elements. Review the following best practices for opportunities to improve your email A/B testing campaigns.
Tip #1: Start with a hypothesis and a desired outcome
If you make changes to an email and find that one variation performs better than another, that’s a start. But if you don’t know what you’re testing for, you can’t know if you have a winner.
Instead, start every campaign by defining what you hope to improve and why you think the changes you’re testing will contribute positively to your desired outcome.
Tip #2: Test high-impact elements
Sure, you might be able to prove that a blue button in your email newsletter gets more clicks than a red one. But does that really matter to your business’s overall performance?
If you’re going through the trouble of setting up an A/B test for your email message, make sure that you’re testing elements – such as the wording of your CTA or the specific offer you make – that have the potential to provide a significant uplift to your business.
Tip #3: Test more than your subject line and body copy
Although these elements represent natural starting points, don’t stop there. Once you feel you’ve gone as far as you can with tests on your subject lines and body copy variations, expand your testing program to encompass the timing of your email automation flows, the actions you use as triggers, or the way you segment your recipients.
Tip #4: Test broadcast, segmented, automated and transactional messages
According to Litmus’ 2018 State of Email Survey, “Nearly 39% of brands never or rarely A/B test their broadcast and segmented emails. More than 65% of brands never or rarely A/B test their automated emails, and 76% never or rarely A/B test their transactional emails.”
That’s a big deal – and a huge amount of money left on the table. Assuming you’ve mastered the basics of testing your broadcast and segmented messages, extend both the practice of A/B testing and the habit of recording what you learn to the other types of emails you send.
Tip #5: Consider the potential impact of timing on email performance
Email Monks contributor Kevin George makes an important point: “Email marketing metrics are subjected to volatility based on time period. Comparing your results of the post-holiday slump i.e. January with the results of the pre-holiday rush won’t give you substantial result.”
No matter how excited you are to kick off a new email A/B testing program, be cautious if that means starting around a period of irregular seasonal or industry-specific activity. Reaching incorrect conclusions from abnormal spikes of activity won’t do your future testing any good.
Getting Started with Email A/B Testing
You may already be carrying out A/B tests on your website. If so, it should be an easy transition to start building out testing workflows on your email campaigns.
If you’re totally new to A/B testing, don’t let the more advanced tips above scare you off. Email A/B testing is a necessary part of maximizing the performance of your email marketing campaigns. Get started today, and remember that you can always increase the complexity and sophistication of your programs as you start seeing results.
What other advanced email A/B testing tips would you add to this list? Leave a note below with your suggestions.