
What is an Email Split Test?

Bethenny Carl, Email Marketing Expert
If you’ve ever found yourself spending tons of time looking at different versions of an email to “figure out” which “you think” will do better with your audience, you’re in serious need of a crash course in email split tests. Split testing not only keeps you from stressing out and wasting valuable time, but it also takes you out of the realm of “feel-marketing” that characterizes so many small companies. Instead, you’ll be among the big leagues, where decision-making based on empirical science is king.

An email split test compares versions (“variants”) of a single email and measures which variant is more effective at inducing those who receive it to take a specific action (“conversion”). Empirically valid split tests use variants that are identical except for a single change to a single element of the email. To be effective at increasing ROI, tests must change email elements known to impact the percentage of recipients that convert (“conversion rate”).
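
To make the arithmetic concrete, here is a minimal sketch in Python (all counts are hypothetical, purely for illustration) of how a conversion rate is calculated and compared across two variants:

```python
# Minimal sketch of the core split-test metric: conversion rate.
# The counts below are hypothetical, purely for illustration.

def conversion_rate(conversions, recipients):
    """Percentage of recipients who took the desired action."""
    return 100.0 * conversions / recipients

variant_a = {"recipients": 5000, "conversions": 150}  # original email
variant_b = {"recipients": 5000, "conversions": 190}  # identical except for one changed element

rate_a = conversion_rate(variant_a["conversions"], variant_a["recipients"])
rate_b = conversion_rate(variant_b["conversions"], variant_b["recipients"])

print(f"Variant A: {rate_a:.1f}%  |  Variant B: {rate_b:.1f}%")
# Because only one element differs, the gap (3.0% vs. 3.8%) can be
# attributed to that single change.
```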

Those are the basics. But the techniques and best practices you can use to get the most out of your split testing go much deeper.

Define Your Audience Before Creating Content Variants

Effective split tests provide actionable data to inform the content and design of the marketing emails you send to your subscriber list. Focusing on the email content variants is intuitive, but a control variable in the content is worthless if you aren’t sending the variants to the same audience. Split-testing best practices actually start with standardizing your testing audience via proper email market segmentation strategies.

There are two reasons for this. First, standardizing your audience allows you to attribute discrepancies in conversion rates to the difference in content. Second, knowing your target audience lets you create variants that test for messaging and design choices relevant to generating ROI.

A basic example of this principle is an electronics dealer looking to sell more mobile phones. Because different age groups use mobile phones differently, the conversion rates on email variants highlighting any given feature will vary widely based on the recipient’s age. Effective testing accounts for this by first segmenting subscribers by age and then using those segments to make sure every email variant contains demographically relevant content.
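
As a rough sketch of that workflow, the snippet below (field names, age brackets, and addresses are all hypothetical) segments a subscriber list by age and then splits each segment evenly and randomly between the two variants:

```python
import random

# Hypothetical subscriber records; "email" and "age" are assumed field names.
subscribers = [
    {"email": "a@example.com", "age": 22},
    {"email": "b@example.com", "age": 47},
    {"email": "c@example.com", "age": 63},
    {"email": "d@example.com", "age": 31},
]

def segment_by_age(subs):
    """Group subscribers into coarse age brackets before any variant is assigned."""
    segments = {"18-29": [], "30-49": [], "50+": []}
    for s in subs:
        if s["age"] < 30:
            segments["18-29"].append(s)
        elif s["age"] < 50:
            segments["30-49"].append(s)
        else:
            segments["50+"].append(s)
    return segments

def split_in_two(segment, seed=42):
    """Randomly split one segment into two equal test groups (A and B)."""
    shuffled = segment[:]
    random.Random(seed).shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Each age segment gets its own A/B split, so every variant is measured
# against the same kind of audience.
for name, segment in segment_by_age(subscribers).items():
    group_a, group_b = split_in_two(segment)
    print(name, len(group_a), len(group_b))
```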

How Many Variants Should You Test?

Split-test effectiveness is a balance between the specificity of what you test and the clarity of the results you get back. There are two kinds of split tests:
  • A/B testing
  • Multivariate testing

A/B Testing: Direct But Limited

An A/B test compares two variants of an email. A good way to conceptualize this is by thinking of it as answering a specific “yes or no” question. For example, you might ask, “Does adding this photo increase conversions?” So you run an A/B test by sending out one email with the image, and one without.

[Image: the two test emails, one with the photo and one without]

A/B tests like this provide very clear data. The drawback is that the data is sometimes less specific than it could be. For example, the image test above doesn’t take into account where in the email the image is placed — something that, according to research by Zapier, can also have a big effect on conversion rates.
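
Before acting on a result like this, it helps to check that the difference between the two variants is not just noise. One common approach, sketched below with Python’s standard library and hypothetical counts, is a two-proportion z-test on the conversion rates:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for the difference
    between two conversion rates (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: email without the image (A) vs. with the image (B).
z, p = two_proportion_z_test(conv_a=150, n_a=5000, conv_b=190, n_b=5000)
print(f"z = {z:.2f}, p = {p:.3f}")
# A small p-value (commonly < 0.05) suggests the image, not chance,
# drove the difference in conversions.
```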

Multivariate Testing: Thorough But Less Specific

Testing more than two different versions of an email is known as “multivariate testing.” Use it to understand which of several messaging or design options will give you the highest conversion rate. A number of sophisticated email marketing providers offer multivariate testing: you can compare as many as eight email variants using MailChimp, or as many as five using GetResponse. To learn more about either platform, check out our expert reviews of MailChimp and GetResponse.

The benefit of multivariate tests is that you can change multiple elements of an email at the same time. For example, if you want to test how a section of an email performs with completely different copy and design configurations — fonts, sizes, messaging, placement — run a multivariate test like this:

[Image: email variants combining different fonts, sizes, messaging, and placements]

Testing in this way can result in higher ROI than A/B testing because the increased number of combinations you can test gives you a better chance of finding the most effective overall variant. But the drawback is that you’ll know only how the ensemble performs; you won’t know which specific changes are producing what effect. The number of insights you can use for specific elements of other campaigns is limited.
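
To get a feel for how quickly multivariate combinations add up, here is a small sketch (the option lists and numbers are hypothetical) that enumerates every combination of a few design choices and picks the winner by measured conversion rate:

```python
from itertools import product

# Hypothetical design options for one section of the email.
fonts      = ["Arial", "Georgia"]
headlines  = ["Save 20% today", "Your upgrade is waiting"]
placements = ["above the fold", "below the fold"]

variants = list(product(fonts, headlines, placements))
print(f"{len(variants)} combinations to test")  # 2 x 2 x 2 = 8

# After the campaign, record a conversion rate per combination (made-up numbers).
results = {variant: 2.5 for variant in variants}
results[("Georgia", "Save 20% today", "above the fold")] = 4.1

best = max(results, key=results.get)
print("Best combination:", best)
# You learn which overall combination wins, but not how much each individual
# element (font vs. headline vs. placement) contributed to the win.
```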

The best practice is to run both A/B and multivariate tests on the same audience. Use multivariate tests to increase immediate ROI by finding the most effective combination of elements to increase conversion rates; use A/B tests to better understand the impact of each specific element change within the whole. You can add specific insights from A/B tests into future multivariate versions.

Small Changes Can Make a Huge and Unexpected Impact

Small changes to your emails can make a big impact on their conversion rates. Testing gives you hard data, drawn from real customer behavior, about which small changes actually increase conversions. Thorough split testing is important both because “intuiting” the impact of changes across variants is impossible, and because your audience can react to commonly cited “best practices” differently.

Although not an email test, RummyCircle’s search for the best Facebook Ad copy illustrates the nuance and power of split testing. A general “best practice” to decrease the cost per action on Facebook Ads is to prompt users to give their opinion in the comments (because it increases the potential for organic — meaning unpaid — reach). RummyCircle ran an A/B test whereby one variant simply stated what the product was, and the other added a single sentence asking users to share their thoughts about it in the comments.

The version that did not prompt users to comment (Version B below) actually won the test, with a reported 224.7% improvement in cost per action.

[Image: RummyCircle’s two ad variants, A (with the comment prompt) and B (without)]

There are two takeaways from this example. First, testing is important because it could prove “best practices” are less effective at increasing conversion rates with your audience. Second, even just a single sentence — a mere ten words — can have a huge impact on results. It’s impossible for you to “intuit” that such a small change might produce such a drastically different outcome; testing is required to understand the impact of such a nuanced change on your conversion rate.

Test for What’s Important, and Leave Out the Rest

Email offers an almost infinite number of variables to be tested. New email marketers risk getting overwhelmed with a desire to run countless tests and variants — an approach that leads to more complications and frustration than actionable results.

To avoid this trap, beginners should narrow down their focus to the elements of an email known to have the biggest impact. You can expand the scope of your testing later on as you gain experience and confidence.

Subject Line

The subject line is the text that appears in the inbox preview of your email. Because it’s the first thing your subscribers see, it determines whether they’ll open your email in the first place. Testing for the “open rate” — the percentage of recipients that open the email — of subject line variants is a must.

Subject line variables to test include:
  • Length
  • Tone (Funny/Direct/Casual/Formal)
  • Value proposition language (Free/Get/Buy/Limited time offer)
For example, research by Return Path showed that the number of characters in an email’s subject line can correlate to as much as an 8% change in open rates.
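
As a quick illustration, the sketch below (subject lines and counts are made up) tallies the open rate for each subject-line variant alongside its character count, so length, tone, and value language can all be compared against the same metric:

```python
# Hypothetical subject-line test results.
subject_tests = [
    {"subject": "Free shipping this week only", "sent": 2500, "opened": 530},
    {"subject": "Your cart misses you", "sent": 2500, "opened": 610},
]

for test in subject_tests:
    open_rate = 100.0 * test["opened"] / test["sent"]
    print(f"{len(test['subject']):>2} chars | {open_rate:.1f}% open rate | {test['subject']}")
# Open rate, not conversion rate, is the metric a subject-line test answers:
# it tells you which variant gets the email opened in the first place.
```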

Messaging Copy

Research from Microsoft shows that the average human with a cell phone now has a shorter attention span than a goldfish. Finding the right balance between getting your point across quickly and writing in actual English is not easy. In this environment, testing the length of your copy is required to maximize your conversion rate.

A best practice here is to formulate different ways to express the value contained in your email. For an oversimplified example, email copy explaining this article could be written several ways:
  1. Email split test best practices
  2. How to conduct an email split test
  3. Email split testing best practices to increase conversion rates
  4. Here’s how email split testing can help you increase the conversion rates of your emails
All these variants of the copy communicate the value of this article, but the focus of each is different:
  • Variant 1 gives a short and direct synopsis of the topic
  • Variant 2 communicates the actionable items contained in the article
  • Variant 3 states the business value of the information
  • Variant 4 is the same message as Variant 3, but begins with action language “Here’s how”
Each one of these approaches to the copy would produce a different conversion rate.
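
If you wanted to test all four copy variants against each other, a simple way to keep the comparison fair is to deal your audience out randomly and evenly across them. Here is a rough sketch with a hypothetical audience:

```python
import random

# The four copy variants listed above.
copy_variants = [
    "Email split test best practices",
    "How to conduct an email split test",
    "Email split testing best practices to increase conversion rates",
    "Here's how email split testing can help you increase the conversion rates of your emails",
]

def assign_variants(audience, variants, seed=7):
    """Shuffle the audience, then deal it out evenly across all variants."""
    shuffled = audience[:]
    random.Random(seed).shuffle(shuffled)
    assignment = {v: [] for v in variants}
    for i, address in enumerate(shuffled):
        assignment[variants[i % len(variants)]].append(address)
    return assignment

# Hypothetical audience: each variant receives roughly the same number of
# randomly chosen recipients, so the copy is the only thing that differs.
audience = [f"user{i}@example.com" for i in range(1000)]
groups = assign_variants(audience, copy_variants)
for variant, recipients in groups.items():
    print(len(recipients), "->", variant[:40])
```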

Calls to Action (CTAs)

Because CTAs are what actually induce readers to convert, both the wording of their copy and their visual presence in your email will significantly impact your conversion rate.

CTA Button Design

The way a CTA button appears visually on the screen impacts conversions. Even the smallest detail can have a big impact. Things to test include:
  • Color
  • Shape
  • Size
  • Font
  • Placement on Page

CTA Copy

Despite being only a few words, the copy displayed on your CTA also impacts conversion rates. You should test:
  • Length
  • Action verb (See/Test/Try/Use)
  • Value language (Demo/Test/Sample)
For example, referral program manager Friendbuy found that using a smaller CTA button design combined with shorter, action-verb-oriented copy led to a 211% increase in conversions.

Email Split Testing: A Powerful Tool That Must Be Used Correctly

Employing email split testing to guide your email marketing choices will influence your conversion rates. However, remember that great power brings great responsibility: used improperly, testing can lead you to inaccurate conclusions that ultimately damage your ROI.

One best practice you can use to avoid this is to never draw conclusions beyond what the test results specifically prove. For example, research by Litmus shows that subject lines impact conversions — but sometimes in an inverse manner to their effect on open rates. In other words, a subject line variant that maximizes open rates means just that; it does not mean the variant also increases conversion rates.
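
A simple habit that enforces this discipline is to report each metric separately for every variant, so a win on opens is never silently credited as a win on conversions. Here is a minimal sketch with hypothetical numbers:

```python
# Hypothetical per-variant results: track each metric on its own so a win on
# one (opens) is never treated as a win on the other (conversions).
results = {
    "Subject A": {"sent": 4000, "opened": 1200, "converted": 96},
    "Subject B": {"sent": 4000, "opened": 1480, "converted": 74},
}

for name, r in results.items():
    open_rate = 100.0 * r["opened"] / r["sent"]
    conv_rate = 100.0 * r["converted"] / r["sent"]
    print(f"{name}: open rate {open_rate:.1f}%, conversion rate {conv_rate:.2f}%")

# Subject B wins on opens (37.0% vs. 30.0%) but loses on conversions
# (1.85% vs. 2.40%), exactly the kind of inversion the Litmus research warns about.
```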

A second best practice is to resist the temptation to test everything at once. While it’s true that small changes anywhere in your email can affect conversions, running too many tests at the same time risks leaving you so busy making changes that there’s no consistency in how you communicate with your audience. Take your time and focus on the major elements that have the greatest impact. Once you settle on a strategy, use it consistently before worrying about smaller changes.

Keep the testing focused on finding how to consistently communicate the value contained in your emails to your audience in the most effective way possible. Then you’ll be well on your way to maximizing the ROI of every message you send.


Sources

https://zapier.com/learn/email-marketing/ab-testing-email-marketing/
https://returnpath.com/wp-content/uploads/2015/04/RP-Subject-Line-Report-FINAL.pdf
https://www.telegraph.co.uk/science/2016/03/12/humans-have-shorter-attention-span-than-goldfish-thanks-to-smart/
https://blog.hubspot.com/marketing/call-to-action-ab-testing-ht
https://litmus.com/blog/6-shocking-myths-about-subject-lines
https://www.designforfounders.com/ab-testing-examples/
https://unbounce.com/a-b-testing/shocking-results/
https://www.campaignmonitor.com/resources/guides/ab-test-email-marketing-campaigns/
https://www.noporkpies.com/blog/design/pros-and-cons-of-ab-testing/
