If you’ve ever crafted an engaging email and hit “send” with great expectations, only to get disappointing results, you may be unsure how to proceed. If you’re plagued by underperforming emails, it’s important to learn whether a specific aspect of the email is hurting engagement. This is where an A/B test comes in. A/B tests, or split tests, are a simple way of digging into what’s working and what’s not.

What is an A/B test?

Think of an A/B test this way: Piece A is the original version of an email you’ve sent, with all the data you have about how many people read it, how many clicked on a link, and how many engaged in some way. Piece B is that original version of the email with one thing changed. To test, you send the modified, B version of the email to see whether that one change creates more engagement with your subscribers. That’s basically what split testing, or A/B testing, is all about—testing one version of an email against another that you have changed in one way.

Explore your existing email (A).

The key with A/B testing is your willingness to consider making a change. You must be open to looking at that original email and asking yourself, “What might make this more engaging to a subscriber?” For example, you might look at your initial send and ask, “If I moved the graphic up to the top, would people be more likely to click?” Or, “If I changed the email’s subject line and made it a question, would people be more likely to open it?”

Consider your email as a collection of distinct pieces of content. It has many parts:

  • the subject line
  • your opening paragraph
  • the graphics or lack of graphics
  • the format of the email (one column or two)
  • the font

Any of these elements can be changed, so there are many opportunities to test.

Make one variation.

The key with A/B testing is to change only one thing, one element of the email. Changing one item between the original email and the variation is the only way you will have a true sense of whether your change was a success or failure.

If you change several elements in your email and test it, you will have no idea which changes affected the open rate or click-through rate. The results are likely a combination of a number of changes, and that won’t help you determine, for future sends, what made the difference.

For effective A/B testing, it’s essential to test just one variation at a time.

Things to consider varying, per HubSpot:

  • Offers: Experiment with the medium of the offer. You might test an eBook versus a whitepaper or video.
  • Copy: Experiment with the formatting or style of the content. You could test plain paragraphs versus bullet points or a longer block of text vs. a shorter block of text.
  • Email sender: Try sending the email from an employee’s email address instead of a generic department address.
  • Image: Try different photos to see whether they influence the conversion rate.
  • Subject line: Play around with the length of the subject line or add personalization.

You might also think about changing:

  • Send time: The time and day you sent your original email may not have been optimal. Your audience might prefer a different day of the week or time of day.
  • Email length: Make the email longer or shorter. To do this, consider providing snippets of information, so readers are required to click through to the article to read more.
  • More personalization: While you might have personalized the opening greeting, is there a way to add even more personalization to the email?

Patience is vital with A/B testing. While HubSpot notes that one testing option is to change the entire email, other email marketing companies advise against this approach. Bluecore, for example, stresses the importance of changing only one element at a time. Once you have a statistically significant result pointing to the variation that performed better, you can continue optimizing through further minor tweaks.
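If you want to check significance yourself rather than rely on your platform's dashboard, a standard approach is a two-proportion z-test on the open (or click) counts. Here is a minimal sketch in Python; the subscriber counts are hypothetical, and the 1.96 cutoff corresponds to the conventional 95% confidence level:

```python
import math

def two_proportion_z(successes_a, total_a, successes_b, total_b):
    """Two-proportion z-test: is B's rate significantly different from A's?"""
    p_a = successes_a / total_a
    p_b = successes_b / total_b
    # Pooled rate under the null hypothesis that A and B perform the same
    pooled = (successes_a + successes_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # |z| > 1.96 means the difference is significant at the 95% level
    return z, abs(z) > 1.96

# Hypothetical example: A got 120 opens out of 1,000 sends; B got 160
z, significant = two_proportion_z(120, 1000, 160, 1000)
print(f"z = {z:.2f}, significant: {significant}")  # z = 2.58, significant: True
```

A result that clears the cutoff with only a handful of opens on each side is still fragile; the test is most trustworthy once each version has gone to at least a few hundred subscribers.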

Experts advise that a wholesale change may work as the B version only when you are testing a fundamentally different approach, such as an informative, educational email against a more story-driven one.

Set up your duplicate email template (B).

These days, most organizations use an email marketing platform to send emails, especially as subscriber lists grow. It is the easiest way to send messages to a list or to subgroups of a large list. Most platforms let you set up an email template, which makes A/B testing much easier: simply copy the template you began with, save it as Version B, and make the single change you want.

You may even decide to create three different versions of your email. In Version B, you change the opening paragraph; in Version C, you change the graphics. You then send these versions, along with the original, on your next email send days.
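However many versions you run, each one should go to a random slice of your list so that the groups are comparable. A minimal sketch of that split in Python (the subscriber addresses are made up, and the fixed seed simply makes the split reproducible):

```python
import random

def split_list(subscribers, n_versions, seed=42):
    """Randomly divide a subscriber list into n equal-sized test groups."""
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # shuffle so groups aren't biased by list order
    # Deal shuffled subscribers round-robin into n groups
    return [pool[i::n_versions] for i in range(n_versions)]

# Hypothetical subscriber list split three ways (A, B, C)
subscribers = [f"reader{i}@example.com" for i in range(9)]
group_a, group_b, group_c = split_list(subscribers, 3)
```

Shuffling before splitting matters: taking the first third, middle third, and last third of a list sorted by signup date would quietly test your change against audiences of different tenure.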

Analyze your results.

Once your sends are complete, it’s time to analyze the results.

  • Did version B provide any more engagement than version A?
  • What engagement behavior changed?
  • How do open rates and the click-through rates compare?
  • Which version provided more conversions?
  • Did a version result in a higher-than-normal rate of unsubscribes?
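The questions above all reduce to comparing a few rates between the versions. A small sketch of that comparison, with hypothetical counts (note that click-through rate is computed here against emails sent; some platforms report it against opens instead):

```python
def engagement_summary(sent, opens, clicks, conversions, unsubscribes):
    """Turn raw counts from one send into the rates worth comparing."""
    return {
        "open_rate": opens / sent,
        "click_through_rate": clicks / sent,  # assumption: measured per send
        "conversion_rate": conversions / sent,
        "unsubscribe_rate": unsubscribes / sent,
    }

# Hypothetical results for Versions A and B
version_a = engagement_summary(sent=1000, opens=120, clicks=30,
                               conversions=5, unsubscribes=2)
version_b = engagement_summary(sent=1000, opens=160, clicks=45,
                               conversions=9, unsubscribes=2)

for metric in version_a:
    print(f"{metric}: A = {version_a[metric]:.1%}  B = {version_b[metric]:.1%}")
```

Laying the versions side by side like this makes it obvious not only which one won, but on which behavior (opening, clicking, converting) the change actually moved the needle.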

The key to A/B testing is understanding that many variables are at play at any given time. One test might not produce any significant shift in engagement, while the next might produce a dramatic one.

Be open to running numerous A/B tests throughout the year. Stick with a winning variation for a while, and if engagement starts to plateau, run another A/B or split test to change things up and lift your readers’ engagement again.