Maximise your email open rates with A/B testing, using Kit (formerly ConvertKit)



If you're not familiar with the term 'A/B testing' (sometimes called 'split testing'), it's a way of testing two different variations to determine which performs best.

In the email marketing world, it's the process of pitting two subject lines against each other, finding out which performs better, and using the winner to send the rest of your emails.

Here's a simple example: let's imagine you're sending an email to your list announcing an upcoming course that you're launching.

And to keep the numbers easy to calculate, we'll say you have 1,000 subscribers.

Subject line #1:

"New course launching - get beta access at a discounted price!"

Subject line #2:

"Want to be a beta tester for my new digital course and get insider knowledge?"

To use A/B testing on this, we'd want a small portion of our total list to receive version #1 and an equally sized portion to get version #2 - after a predefined period, we assess which had more opens and send the remaining emails using that subject line.

Example process:

  • We identify the list of 1,000 subscribers we want to email.
  • We send an email with subject line #1 to 150 subscribers (15% of the list)
  • We send an email with subject line #2 to 150 subscribers (15% of the list)
  • The remaining 700 subscribers don't get an email yet.
  • We wait for some time (e.g. 4 hours) and determine which subject line produced the higher number of email opens.
  • Example outcome:
    • Subject #1 gets 89 opens (59% open rate)
    • Subject #2 gets 108 opens (72% open rate)

In the example above, subject line #2 would win the A/B test and, therefore, be used for the remaining 700 emails.

Using version #2 should deliver a further 504 email opens (72% of the remaining 700), rather than the 413 (59%) we'd expect from version #1.
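If you'd like to sanity-check that arithmetic yourself, here's a minimal Python sketch of the same split and projection; the counts and open rates are just the example figures above, nothing Kit-specific:

```python
# A minimal sketch of the A/B split and the projected-opens arithmetic above.
# All numbers are the example figures from this post.

total_subscribers = 1_000
test_fraction = 0.15                                  # 15% of the list per variant

test_group = int(total_subscribers * test_fraction)   # 150 per variant
holdout = total_subscribers - 2 * test_group          # 700 remaining

opens_a, opens_b = 89, 108                            # example results after 4 hours
rate_a = opens_a / test_group                         # ~0.59 -> 59%
rate_b = opens_b / test_group                         # 0.72  -> 72%

winner_rate = max(rate_a, rate_b)
projected_opens = round(holdout * winner_rate)        # 0.72 * 700 = 504

print(f"A: {rate_a:.0%}, B: {rate_b:.0%}")
print(f"Sending the winner to the remaining {holdout} subscribers "
      f"should yield roughly {projected_opens} more opens.")
```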


How to configure and run A/B testing in Kit

Firstly, I should point out that (at the time of writing) A/B testing in Kit can only be done using 'Broadcast' emails, and NOT with email sequences (maybe this could change in future, who knows?).

So, start by creating a new broadcast email:

In the email editor, now click on the little beaker icon, shown in the image below (it's located just to the right of the subject line text box):

This will pop up the form shown below; enter your subject lines into their respective text fields:

Click continue to configure how long you want to run the A/B test for; the duration can range from 30 minutes to 4 hours (the default).

Of course, the longer you run the test, the more accurate it's likely to be, simply because more elapsed time usually means more emails get opened.

Use the slider to change the test duration, or type directly into the text box containing the time in minutes.

Once you're happy with the configuration, hit the save button and your email will be good to go.

Once the time specified for the test has elapsed, Kit determines the winner and applies a 'Winner' badge to make it clear which subject line won.

Note: It’s worth mentioning that on occasion, the ‘winning’ subject line can change after all emails have been sent; for example, using the test above, option ‘A’ may eventually overtake ‘B’ when calculating open rates.

Once sent, you can check the statistics of the broadcast from the Kit dashboard; below is a screenshot of what this example broadcast would look like:

Once you've sent your email, it is possible to cancel the A/B test (click the 3-dots in the top-right corner and choose 'Cancel A/B test'); however, just note that if you do, only the initial 30% of subscribers will have received the email.

If you subsequently decide you wish to send it to the remaining 70%, you'll need to temporarily tag the subscribers who already received it so they don't get it again.
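Kit lets you do that tagging from its UI, but if you'd rather work out the remainder locally, here's a minimal sketch assuming you've exported two CSVs, each with an 'email' column (the filenames and column name are hypothetical): the full list, and the initial test recipients.

```python
# A minimal sketch: exclude subscribers who already received the A/B test email.
# Assumes two hypothetical CSV exports, each with an 'email' header column.
import csv

def load_emails(path: str) -> set[str]:
    with open(path, newline="") as f:
        return {row["email"].strip().lower() for row in csv.DictReader(f)}

already_sent = load_emails("ab_test_recipients.csv")   # the initial 30%
full_list = load_emails("all_subscribers.csv")         # everyone

remaining = sorted(full_list - already_sent)
print(f"{len(remaining)} subscribers still to email")

# Write the remainder out, ready to tag/segment in Kit.
with open("remaining_subscribers.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["email"])
    writer.writerows([e] for e in remaining)
```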

You can also duplicate the test if you wish to carry out a similar A/B test without having to reconfigure the broadcast from scratch.


⚡️ Summary

If you want to optimise your open rates, use A/B testing on your subject lines and let Kit take care of it for you.

I wouldn't recommend A/B testing (or at least not reading too much into the analytics) if you're sending your email to fewer than 1,000 subscribers, as the results can easily be skewed by small sample sizes.

An exaggerated example would be doing A/B testing on just 100 subscribers; subject A could get 5 opens (i.e. 5 opens from 15 emails sent = 33%) and subject B could get 4 opens (i.e. 4 opens from 15 sent = 27%).

Even though that's a 6-percentage-point swing in the open rate, it amounts to just one additional email being opened.
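To put a number on how noisy that is, here's a small simulation sketch (my own illustration, not anything Kit does): both variants share exactly the same true open rate, yet with test groups of 15, chance alone produces a gap of 6+ points most of the time.

```python
# A quick simulation of why tiny test groups are noisy. Both "variants" share
# the SAME true open rate; we count how often random chance alone produces a
# gap of 6 or more percentage points between them.
import random

random.seed(42)
true_rate = 0.30          # identical for both variants (~the example's average)
group_size = 15           # 15 emails per variant, as in the example above
trials = 10_000

big_gaps = 0
for _ in range(trials):
    opens_a = sum(random.random() < true_rate for _ in range(group_size))
    opens_b = sum(random.random() < true_rate for _ in range(group_size))
    if abs(opens_a - opens_b) / group_size >= 0.06:
        big_gaps += 1

print(f"Gap of >=6 points from pure chance: {big_gaps / trials:.0%} of trials")
```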

Do you have a question?
🩵 I love answering them, hit me up!
