Split Testing: How to A/B Test Cold Sales Email Templates
There’s no reason why you shouldn’t be split testing every single cold email you send.
Cold email templates save you the time you’d otherwise spend writing out individual messages by hand. But all templates are not created equal.
How can you tell whether the messages you’re sending are as effective as they could be?
The answer is A/B testing: a process that pits two versions of the same message against each other to determine which variation drives more of your desired outcomes. In the case of cold sales email in particular, A/B testing can help you get:
- More opens
- More link click-throughs
- More conversions
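The mechanics of a fair A/B test start with a random split: each prospect is assigned to exactly one variant, so differences in results can be attributed to the message rather than to who received it. Here's a minimal sketch in Python; the function name and fixed seed are my own choices for illustration, not part of any particular email tool.

```python
import random

def assign_variants(recipients, seed=42):
    """Randomly split a recipient list into two near-equal groups,
    so each prospect sees exactly one version of the email."""
    rng = random.Random(seed)   # fixed seed makes the split reproducible
    shuffled = recipients[:]    # copy so the caller's list isn't mutated
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"A": shuffled[:half], "B": shuffled[half:]}
```

Randomizing (rather than, say, splitting alphabetically) matters: any systematic ordering in your list, such as by sign-up date or company name, would otherwise leak into the comparison.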
Sounds great, right? Yet, despite the clear benefits, nearly 70% of email marketers surveyed by Yesware don’t use A/B split testing in their campaigns.
Yesware respondents gave three main reasons for not using A/B testing:
- They don’t understand its benefits
- They don’t know what to test
- They don’t have the tools to test
Let’s break down each of these issues and explore the solutions that will help anyone who uses cold email in their sales process take advantage of this powerful technique.
The Benefits of A/B Split Testing
To be 100% clear, if you’re sending cold sales emails, you can benefit from A/B split testing. Just take a look at the following case studies:
- HubSpot used A/B testing to determine whether their audience responded better to emails that featured the company or a real person as the sender. Sending from a real person won, driving 0.53% more opens, 0.23% more clicks and 131 more leads.
- An A/B test for Money Dashboard found that focusing on their business (with the subject line “Please put us out of our misery”) — rather than on recipients themselves — resulted in a 104.5% increase in opens for inactive subscribers and a 103.3% increase in clicks for active subscribers.
- Wishpond used A/B testing to test the impact of adding “You” to their subject lines in emails intended to boost sign-ups to their VIP demo. Ultimately, they found the subject line “Social Media Stats You Need to Know for 2014” resulted in 11% more opens than “Social Media Stats for 2014.”
While these aren’t exclusively examples of A/B testing on cold sales emails, they don’t need to be. What these case studies (and the hundreds of others published online) demonstrate is that split testing your sales emails can drive performance gains, no matter what you’re selling or what industry you operate in.
What to Split Test
Having ruled out the argument that A/B split testing isn’t important, let’s look at the most common response to Yesware’s survey: “It’s valuable, but I don’t know what to test.”
Determining what to test in your cold emails can be challenging, but not necessarily for the reasons you’d expect. Far from having nothing to test, marketers face a seemingly endless number of options, and that abundance can make moving forward more paralyzing than having nothing to test at all.
That said, just because you can test everything doesn’t mean you should. Instead, I recommend starting with tests on three key areas:
- Subject line
- Opening line
- Your call-to-action (CTA)
Since your subject line is the first thing prospects see in their inbox, it makes sense to start here: improvements can translate directly into more opens (and then, potentially, more clicks, replies and conversions).
Review the best practices covered extensively online and choose one to test. For instance, an article from HubSpot suggests that, with “40% of emails being opened on mobile first, we recommend using subject lines with fewer than 50 characters to make sure the people scanning your emails read the entire subject line.”

Don’t overthink it. Test HubSpot’s advice for yourself by shortening the subject line of one of your existing templates and A/B split testing the two versions.
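If you want to screen your existing templates against that 50-character guideline before deciding which to test, a quick check is all it takes. This is a small sketch; the 50-character threshold comes from the HubSpot guidance quoted above, and the function name is my own.

```python
MAX_MOBILE_CHARS = 50  # threshold from the HubSpot guidance quoted above

def subject_fits_mobile(subject, limit=MAX_MOBILE_CHARS):
    """Return True if the subject line is short enough to avoid
    truncation in a typical mobile preview, per the 50-character
    rule of thumb."""
    return len(subject) <= limit
```

Remember this only flags candidates for testing: a shorter subject line is a hypothesis, and your own A/B results decide whether it wins.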
For more ideas to test, check out the following resources:
- This Is How To Write Better Email Subject Lines That Get More Opens
- Best Practices for Email Subject Lines
- Improve Your Open Rates With These 12 Subject Line Tweaks
Depending on how your recipients’ inboxes are configured, they may see your opening line as part of a preview before they ever open your message.
It’s up to you to make this opportunity count. Test different opening line variations, including:
- A question versus a statement
- A mention of a shared connection
- A personalized tip or recommendation
- One that mentions your company versus one that does not
- “Small talk” versus getting down to business
Finally, test the specific action you’re asking prospects to take. While we all know not to ask for the moon on first contact, you may still see a major difference in performance by asking to send a proposal versus asking for a call back.
Only after you’ve tested these three items should you move on to other A/B split tests, which could include measuring the impact of:
- Your use of personalization fields
- The inclusion of imagery in your messages
- Different fonts, font sizes or font colors
- HTML versus plain text formatting
- The length of your messages
- How frequently you email prospects
- Including a “PS” in your message
- When you schedule your messages to be delivered
That’s not to say these tests aren’t important, but focus on your big wins first, especially if the limited volume of messages you send makes it difficult to reach statistical significance.
Split Testing Tools
One of the beautiful things about A/B split testing your cold emails is that you don’t need a fancy monitoring system to determine whether or not the changes you’ve made are having an impact.
If your email marketing provider offers tools that measure statistical significance, that’s great. Use them.
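If your provider doesn’t offer such tools, the underlying math is straightforward enough to run yourself. A common approach (my choice here, not prescribed by any email platform) is a two-proportion z-test comparing the open rates of variants A and B, which needs nothing beyond Python’s standard library:

```python
from math import erf, sqrt

def open_rate_significance(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test on open rates.
    Returns (z, p) where p is the two-sided p-value; a p below
    0.05 is the conventional bar for statistical significance."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # pooled open rate under the null hypothesis (no difference)
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

For example, 40 opens out of 200 sends for variant A versus 60 out of 200 for variant B yields a p-value around 0.02, so that difference would clear the 0.05 bar; with only 20 sends per variant, the same open rates would not.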
But even if you don’t have access to these kinds of measurements, you can still benefit from A/B testing. According to Close.io’s Steli Efti:
“In the early stages of your startup, you will lack the data to make perfect decisions. Overcome this hump by gathering qualitative data and making intuitive decisions based on talking to customers.”
Efti recommends picking up the phone after sending cold sales emails to gather that kind of qualitative data. Asking recipients what in your message resonated with them (if anything) can give you as much valuable insight for improving your campaigns as you’d gain from running tests to statistical significance.
Ultimately, sending cold email messages to sales prospects is as much an art as it is a science. Combine the strategies described above with gut instinct and feedback from your prospects. Continually test new messages and new templates to find your winning combination.
What other tips do you have for split testing cold sales emails? Share your suggestions by leaving a comment below!