Email marketing is a powerful way to reach both new and returning customers on a regular basis, while keeping them excited about your products and services. However, an email is pointless if it doesn’t entice people to click through to your website and make a purchase. If your click-through rates (CTRs) are low or stagnant, that’s a good sign that you need to run one or more A/B tests so you can effectively update your email marketing campaign.
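Mechanically, an A/B test starts by splitting your subscriber list into random groups, each of which receives one version of the email. Here is a minimal sketch of a simple 50/50 split, using made-up subscriber addresses and only the Python standard library:

```python
# A minimal sketch of splitting a subscriber list into two random groups
# for an A/B test. The subscriber addresses are hypothetical.
import random

def split_ab(subscribers, seed=42):
    """Shuffle the list and split it in half into groups A and B."""
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # seeded so the split is reproducible
    mid = len(pool) // 2
    return pool[:mid], pool[mid:]

subscribers = [f"user{i}@example.com" for i in range(10)]
group_a, group_b = split_ab(subscribers)
print(len(group_a), len(group_b))  # 5 5
```

Seeding the shuffle makes the split reproducible, which is handy if you later want to audit which subscribers received which version.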
What can you A/B-test for? The possibilities are limited only by the extent of your creativity. But here are several common tests you should start with:
#1. Content
Even though it’s not the first thing people see, the content of the email is arguably the most important element. Without compelling, easy-to-understand content, people won’t click.
Related to content, you could test for type (e.g., images vs text), layout, or order. For instance, a food blogger could test whether leading with the barbeque chicken dip, or leading with the red, white, and blue yogurt parfait would result in a higher CTR to the “Summer Eats” post.
Content also includes the different elements of an email. The goal of any A/B test is to get more clicks, so why not test the element that users are actually supposed to click on? For example, you might assume that buttons are automatically better than links because they look cleaner or are more distinct. But a button’s success depends on the company and type of content.
- In fact, sometimes links work much better, especially for email newsletters. This is particularly true for Sitepoint, a company that provides HTML and other coding education services.
- But other times, high-contrast buttons are more powerful. Campaign Monitor saw CTR increase a whopping 127% after one email tweak: replacing their text links with buttons.
Content is perhaps the most versatile category you can test for. Be bold and think outside the box when designing A/B tests for content.
#2. Subject Line
As the first element a potential customer sees, subject lines are important. As reported by Convince and Convert, 35% of email recipients decide whether to discard an email based exclusively on the subject line.
Here are some aspects of subject lines you can A/B-test for:
- Length. A recent study by Return Path showed that the optimal length of an email subject line is 61-70 characters. Your customers might respond to subjects that are slightly longer or shorter. The only way to know is to test.
- Word Choice. Words matter. An A/B test can help you pick between two similar options. For example, “cash back” has been more effective wording than “discount” for some companies. Will the word “offer” or “sale” speak more to your customer base? Test to find out.
- Word Order. You need to pay attention to word order as well, especially if your company runs sales often. You can test whether stating the product first, or stating the discount first, elicits a better CTR. For example, “Ruby red slippers, 40% off” might result in a CTR significantly higher or lower than “40% off ruby red slippers.”
- Subject Line Content. When your email contains various pieces of content (like different articles), the subject line shouldn’t necessarily list all of them. Instead, you should run an A/B test to see which piece of content your subject line should highlight. Buzzfeed, for example, regularly fine-tunes their subject-line content to determine which produces a better CTR. A recent one involved deciding between “Here’s What Healthy People Eat for Breakfast” and “23 Easy Picnic Recipes that Everyone Will Love.”
#3. Copy
Copy refers to the text used in the email. It’s not hard to understand why there are so many different types of A/B tests for copy. Here are a few important ones:
- Word Choice. If you’re wondering how your audience is reacting to a specific word, phrase, or sentence, change it up and find out which version results in a better CTR.
- Length. Copy length can be tested alongside images to determine whether subscribers react better to versions of emails with more or less text before images. Because so many subscribers read email on mobile devices, emails with less copy sometimes get a better response. The cosmetics company La Mer, for instance, got a positive response with less copy.
- Organization. The order in which you say things will either entice or bore your readers. Regularly test the order of your copy to see what your customers respond to more. AwayFind increased trial sign-ups by 33% after A/B-testing their copy and its layout. Their winning version had a shorter headline and a subheading with key features bolded.
#4. Images
Some businesses see higher CTRs when they use image-driven content instead of copy. But of course, the only way to be sure for your company is by testing. As a general tip, keep campaign imagery consistent across all platforms.
Here are a few examples of image-related A/B tests:
- Image-driven vs. copy-driven content
- Image vs image
- Number of images used
- Content of images
- Color of images
- Size of images
It’s important to remember that adding or changing an image isn’t always the solution; sometimes, an image just needs to be taken away.
In another example from Sitepoint, CTRs decreased when they added an image to their newsletter; the image seems to have distracted many readers from clicking through.
#5. Other Important Elements
Numerous other elements of email campaigns have been shown to increase CTRs. Here are some that you may want to test:
- Personalization. Overall, personalization of elements like subject lines can increase your CTR by as much as 14%. You can also personalize copy and the content of your emails.
- Calls to action. Calls to action are important, as they urge users to click. So where you put them in an email, the words you use, and your tone are all elements worth testing.
- Font styles. Is your company edgy? Inspiring? Direct? What types of products do you sell? All of this may affect the font style your audience responds to best.
Evaluating Your A/B-Test Results
An A/B test is considered successful if it’s statistically significant. In a significance test, the “p” value is the probability that random chance alone explains your results. As a general rule, if your “p” value is five percent or less (p ≤ 0.05), your results are considered statistically significant.
As for sample size, the larger the better. If, out of 10,000 users, 500 would have to click on one email over the other for the results to be statistically significant, you’d need to test at least 1,000 people (10%), though a sample size closer to 4,000 (40%) is recommended. Keep in mind, however, that if you test too many people, you’re potentially wasting the test by sending too many users the version of the email with the lower CTR.
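If you do want to check the math yourself, the comparison above boils down to a two-proportion z-test. The following is a minimal sketch with made-up send and click counts, using only the Python standard library, not a substitute for your email platform’s built-in reporting:

```python
# A minimal sketch of checking whether an A/B test result is statistically
# significant, via a two-proportion z-test. All counts are hypothetical.
import math

def ab_test_p_value(clicks_a, sent_a, clicks_b, sent_b):
    """Two-sided p-value for the difference between two CTRs."""
    p_a = clicks_a / sent_a
    p_b = clicks_b / sent_b
    # Pooled click rate under the null hypothesis (no real difference).
    p_pool = (clicks_a + clicks_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Convert the z-score to a two-sided p-value on the normal distribution.
    return math.erfc(abs(z) / math.sqrt(2))

# Version A: 2,000 emails sent, 180 clicks (9.0% CTR)
# Version B: 2,000 emails sent, 130 clicks (6.5% CTR)
p = ab_test_p_value(180, 2000, 130, 2000)
print(p < 0.05)  # True: the difference is statistically significant
```

With identical click rates the p-value approaches 1; the smaller it gets below 0.05, the less plausible it is that the difference between the two versions is just noise.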
If you’re not a math person, don’t worry. Here’s a list of the best email marketing services of 2018. Many email marketing services do the math for you. Popular ones like MailChimp and Active Campaign have features that will help you set up and perform A/B tests, which should make the process far less overwhelming.
When (and if) the email marketing service gathers statistically significant proof that one version is performing better than another, it will automatically begin sending the “winning” version to the rest of your subscribers.
Any increase in your CTR is a success. But to grow and stay competitive, you also need to know how your CTRs compare with your competitors’. Constant Contact, for example, published a chart comparing the CTRs of numerous business types against the overall industry average of 7.77%. Here are some examples:
- Automotive Services: 8.83%
- Health & Social Services: 8.27%
- Insurance: 8.21%
- Legal Services: 7.05%
- Real Estate: 5.76%
- Retail: 7.75%
- Technology: 5.84%
- Travel and Tourism: 7.73%
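As a quick illustration, you could hard-code those benchmarks and compute the gap between your own CTR and your industry’s average. The 4.2% CTR below is a made-up example value:

```python
# A small sketch of comparing a campaign's CTR against the Constant Contact
# benchmarks listed above. The 4.2% input CTR is hypothetical.
INDUSTRY_CTR = {
    "Automotive Services": 8.83,
    "Health & Social Services": 8.27,
    "Insurance": 8.21,
    "Legal Services": 7.05,
    "Real Estate": 5.76,
    "Retail": 7.75,
    "Technology": 5.84,
    "Travel and Tourism": 7.73,
}
OVERALL_AVERAGE = 7.77

def compare_ctr(ctr, industry):
    """How far a CTR sits above (+) or below (-) its industry benchmark."""
    return round(ctr - INDUSTRY_CTR[industry], 2)

gap = compare_ctr(4.2, "Retail")
print(gap)  # -3.55: this retail campaign trails its benchmark
```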
Make A/B Tests Work for You
Content, subject lines, copy, images, and the various other elements discussed here should make for a great starting point for your A/B tests. Begin with a hypothesis and test for the element you suspect is responsible for your email campaign’s low CTR. Remember, there is no limit to the email elements you can A/B-test for. You know your company’s needs best. So even if you think of a test you’ve never seen done, do it anyway!
Aussie Mortgage Broking: https://unbounce.com/a-b-testing/shocking-results/
Convince and Convert: https://www.convinceandconvert.com/convince-convert/15-email-statistics-that-are-shaping-the-future/
Return Path: https://www.campaignmonitor.com/blog/email-marketing/2015/12/best-email-subject-line-length/
Constant Contact industry CTRs: https://knowledgebase.constantcontact.com/articles/KnowledgeBase/5409-average-industry-rates?lang=en_US
A/B test process diagram: https://www.campaignmonitor.com/resources/guides/ab-test-email-marketing-campaigns/