Cold Email A/B Testing: A Playbook To Increase Opens, Clicks And Replies

The internet is littered with conflicting advice on how to write the perfect cold email, leaving many of us unsure of the best way to draft one.

For example: which of these choices is likely to perform better?

Should you sign off your cold emails with a formal closing, or with something more casual?

Should you go with a short email subject line or a long one?

Should you send your cold emails at 10 am on a Tuesday or at 4 pm on a Thursday?

The reality is that there is no silver bullet. There is no universal answer that guarantees success in your cold email outreach.

A lot depends on your target persona and a host of other factors.

So who should you trust?


The good news is you don’t have to decide based on gut feeling anymore. Just run an A/B test (also known as a split test).

Bring science and metrics to iterate and learn from every cold email you send out.


How Should You Go About Implementing an A/B Test for Cold Emails?

When it comes to cold email prospecting, A/B testing or split testing is simply comparing two different versions of your cold email to see which one performs better.

But before you throw on your lab coat and safety goggles and start running experiments, here are a few guidelines to get you started.

What Are You Looking To Improve?

You will have to be clear on what you are looking to improve in your cold emails using A/B testing.

Is it the open rates that you want to focus on or is it the reply rates that you want to push higher? You will need to focus on improving a single metric using the A/B tests.

And now that you know the metric you have to focus on, the rest is testing the levers driving that metric.

1. Pick a Metric to A/B Test

When it comes to cold email prospecting, the primary metrics that matter are:

  1. Email Open Rates
  2. Email Click Rates
  3. Email Reply Rates

These metrics also form a natural hierarchy for optimizing cold email campaigns.

If open rates are low, then that needs to be fixed before you try to improve your replies.

On the other hand, if open rates are good, but there are very few replies, then the click rate is a good indicator of whether your email message is well crafted. 

Prospects may not have replied yet or booked a meeting with you. But if they clicked a link, visited your website, or checked out your social media profile, that is a good indicator that they are interested in knowing more about what you do.

If open rates and click rates are satisfactory then clearly you are able to grab attention and get your prospect’s interest. Now is when you test your Call To Action (CTA) to improve your response rates.
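To make this hierarchy concrete, here is a minimal sketch in Python of how you might decide which lever to test next. The 30% open-rate and 5% click-rate thresholds are illustrative assumptions, not benchmarks.

```python
def next_lever(open_rate: float, click_rate: float) -> str:
    """Suggest which part of a cold email to A/B test next,
    following the open -> click -> reply hierarchy.
    The 30% / 5% thresholds are illustrative assumptions."""
    if open_rate < 0.30:
        return "subject line / send time"   # fix opens first
    if click_rate < 0.05:
        return "email copy and links"       # then clicks
    return "call to action"                 # finally, replies

# e.g. 20% opens, 2% clicks: opens are the bottleneck
print(next_lever(0.20, 0.02))
```

Whatever thresholds you settle on, the point is the ordering: work on one metric at a time, starting from the top of the funnel.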


2. Identify the Levers That Drive the Metric

Now that you have picked a metric to work on, it gets easier to identify which part of your email you should work on – because there are only a handful of levers that affect each metric.

Looking to improve cold email open rates? The key levers that matter are:

  • The subject line
  • The time you send your cold email

Not sure if your cold email copy is engaging? The click rate is a good proxy for whether your content is interesting enough for your prospect to click and learn more about you. Levers that can improve click rates include:

  • The introduction of your service/product
  • Your value proposition/benefits statement
  • The type of link shared and the call to action on the link

But let’s say your prospects are opening and reading your emails, clicking through to your website or social media links, yet not responding. Then it is time to A/B test your Call to Action (CTA). This is the lever that influences your response rates the most.

Based on the metric you want to optimize, you are now ready to devise experiments that test the levers driving it.

This article covers various techniques you can use to A/B test the levers that drive a particular metric. Don’t have time to read the entire article? Jump straight to the metric you want to improve.

Part 1: A/B Tests to Improve Email Open Rates

  1. Test using a personalization variable in the subject line
  2. Test using a question in the subject line
  3. Test sending emails at various times across the day
  4. Test sending emails on various days of the week
  5. Test using lowercase only in the subject line
  6. Test using power words in the subject line
  7. Test using emojis in the subject line

Part 2: A/B Tests to Improve Email Click Rates

  1. Test the number of links that you provide in your cold email
  2. Test the link destination that you provide
  3. Test the anchor text of your link

Part 3: A/B Tests to Improve Reply Rates

  1. Test by providing a specific date and time vs asking what time works best
  2. Reiterating your value proposition in the CTA vs a simple/usual CTA
  3. Sharing something valuable vs seeking permission to share something valuable
  4. Confirmation question vs open-ended question

Part 4: Best Practices for Running a Cold Email A/B test

Part 5: Moneyball for your Cold Emails

Part 1: A/B Tests To Improve Cold Email Open Rates


Email open rate tells you how many prospects are reading your emails. This is the first hurdle you need to cross on the march towards cold email mastery. A higher open rate signifies that your subject line is grabbing your prospects’ attention and making them open your cold email.

A lower email open rate signifies that you will need to work on your subject line or email delivery time.

If you are looking to optimize your open rate, you can try various experiments around the subject line and the time of email delivery.

1. Test Using a Personalization Variable in the Subject Line

Personalizing your email subject line is touted as an effective way to grab your prospect’s attention.

An email subject line that screams, “yes, it’s about you” to the prospect will give them a strong incentive to open and read. Or so the theory goes. But don’t just take our word for it. Run a test.

Personalization does not necessarily mean just adding your prospect’s name to the subject line. Try adding their company name, their role/position, or a particular event/conference in their region.

Examples of A/B testing your subject lines with personalization variables: a generic “quick question” vs “quick question, {{FirstName}}”, or “your outbound process” vs “{{Company}}’s outbound process”.

With these different A/B tests, you should be able to identify which method of personalization works best for your prospects.

2. Test Using a Question in the Subject Line

Is your prospect more likely to open a subject line worded as a plain statement or as a question?

As against a simple statement, a question in the subject line can be a good attention seeking magnet that speaks directly to your prospect’s need or pain point.

When you word your subject line as a question, you are also subliminally positioning yourself as a solution or answer to the question you have posed in the subject line – which is a great way to start a relationship over email.

A sample A/B test with a question in the subject line: “Reducing your team’s ramp-up time” (statement) vs “Struggling with a long ramp-up time?” (question).

3. Test Sending Emails at Various Times Across the Day

Should you catch your prospect early in the morning, just as they are starting their day? Or later in the evening, when they are presumably more relaxed?

It is one of those questions where there are equally powerful arguments to be made for and against either approach.

Let’s look at the data: a study by MailChimp suggests that 10 am is the optimal time to send emails. On the other hand, a study by Campaign Monitor suggests that sending your emails later in the evening (after 6 pm) may not be a bad idea.

The best way to figure this out for your prospects is to A/B test by scheduling your emails to be sent at different times of the day.


With this A/B test, you should be able to find the optimal time to send your emails to different types of prospects, which should result in an increase in your open rates.

4. Test Sending Emails on Various Days of the Week

Just as there are conflicting narratives on the best time to send emails, there are multiple studies on the best days to send them.

Is it on a weekday or on a weekend?

If it’s a weekday, then is it on a Monday or on a Wednesday or on a Friday?

Our experience has been that with senior management/CXOs, open rates are great on weekends. Run a few A/B tests to see if you can find similar trends. You may find SDRs/managers opening emails more often on Mondays, or VPs of Sales/Directors of Sales opening emails mostly on Fridays.

One A/B testing idea: segment prospects by role and split-test weekday vs weekend sends for each segment.

With this A/B test, you should be able to find trends that will indicate the optimal day to send emails to prospects who belong to a particular category.

5. Test Using Lower Case Only in the Subject Line

Have you tried using only lowercase letters in your email subject line? It is a trend that people like Aaron Ross swear by, and it makes your email appear more casual and intimate.

When writing subject lines, you are looking for ways to stand out in the inbox and to look different from the other emails. Lowercase only subject lines can help you achieve the “pattern interrupt” effect required for grabbing attention.

A sample A/B test with an all-lowercase subject line: “Quick Question About Your Outbound Process” vs “quick question about your outbound process”.

6. Test Using Power Words in the Subject Line

Do email subject lines that establish the right emotional context get better open rates?

Yes, they do.

Use power words to write subject lines that can orchestrate a certain emotion you want to establish with the prospect – be it building trust or arousing curiosity or making them tingle with fear.

A/B test subject lines with power words that appeal to vanity.

Using the right power words, try projecting the prospect as the hero of your narrative so that you can appeal to their feeling of vanity.

Alternatively, A/B test power words that appeal to their sense of fear.

Using power words, remind your prospects of the risks of uncertainty, but in the same breath also offer them hope.

7. Test Using Emojis in the Subject Line

Emojis are widely popular with today’s chat-first millennials. But will they work in email communication?

While emojis in your subject line can be a great way to grab your prospect’s attention, does that translate into more email opens? And does it work across all audiences?

Run a few A/B tests to find the answers.

Sample subject lines with and without an emoji: “quick question about your pipeline” vs “🚀 quick question about your pipeline”.

Though emojis can help you add that bit of creativity to your email subject lines, their actual impact on open rates is something you will need to assess carefully using your A/B test results.

Check whether these subject lines perform better with a younger or an older audience, or whether prospects from a particular demographic open these emails more often.

Part 2: A/B Tests To Improve Email Click Rates


Email Click Rates will tell you how many of your prospects are showing interest in your emails. These are prospects who are clicking on the links in your emails to check your homepage or to download a valuable resource or to watch a demo video, etc.

The email click rate is a good proxy to evaluate whether your email copy is well crafted and relevant to your audience.

Higher email click rates indicate that your cold emails are able to generate enough interest in the prospects and motivate them to learn more about you.

1. Test the Number of Links That You Provide in Your Cold Email

Should you provide just one relevant link in your email that drives traffic to a specific landing page? Or should you litter your cold email with a number of links, including your blog, your Facebook page, and your LinkedIn page?

A single link is easy to spot and stands out. On the flip side, you will need to make sure it is something the prospect would be interested in: for example, a link to a valuable resource, or a landing page created specifically for that prospect.

On the other hand, with more choices to select from, prospects might be less likely to make any decision at all. So look out for such trends in your A/B test results.

Run A/B tests to find the optimal number of links in your email.

2. Test the Link Destination That You Provide

Which link destination do you think will be more inviting for your prospect to click?

As a marketer, you have a number of destinations you can drive your visitors towards: your landing page, your pricing page, a product comparison page, a special offers page, a resource page, your Facebook/LinkedIn page, or a blog post.

The actual destination you send your prospects to will depend on what stage of the buying journey your prospect is in, and on other factors like your unique strengths vis-à-vis the competition.

Experiment by providing various link destinations and see which one resonates best with each cohort of prospects.

3. Test the Anchor Text of Your Link

While the actual link destination matters, you can make your links more appealing by playing with the copy of your link text.

Here is a sample A/B test: one email offers a direct link to the valuable download, while the other nudges the prospect to visit your site and learn more about the resource.

With this A/B test, you should be able to learn whether linking directly to the download or nudging the prospect to visit your site makes your cold email copy more engaging.

Part 3: A/B Tests To Improve Email Reply Rates


Assuming prospects open and read your email, the reply rate of any cold email campaign is heavily influenced by the Call To Action (CTA). This is the part where you ask the prospect to take a desired action.

A higher email response rate indicates that the Call To Action (CTA) in your email is persuasive enough to get the prospect to take the desired action.

Run a few A/B tests to iterate on the perfect call to action.

1. Test by Providing a Specific Date and Time vs Asking What Time Works Best for the Prospect

For prospects who are convinced of the value that you provide, the ideal CTA should be to schedule a call with your prospect.

But which of these works better – you deciding on a time and date or letting the prospect decide?

Run A/B tests to check which one results in a higher response rate.

By providing a specific date and time in your CTA, you are reducing the amount of mental processing power needed for the prospect to reply to the email. All the prospect needs to do is reply with a simple yes/no/alternate date in this case.

By asking the prospect to provide a suitable date and time in the CTA, you are letting her know that you are flexible and you will be available anytime that the prospect decides.

2. Reiterating Your Value Proposition in the CTA vs a Simple CTA

By reiterating your value proposition in the CTA, can you improve your response rates?

Run an A/B test on your CTA to test which has more impact on your response rates.

A sample A/B test: a simple “Do you have 15 minutes for a call this week?” vs “Do you have 15 minutes this week to see how you could shorten your sales cycle?”, where the second restates the value proposition.

By reiterating your value proposition in the CTA, you are connecting the dots between various parts of your email and summarizing it in one simple and distilled benefit statement for the prospect. But run an A/B test to check whether this boosts the response rates.

3. Share Something Valuable vs Seeking Permission To Share Something Valuable

A great way to engage with your prospects is by sharing something valuable. But should you request permission to share something interesting, or jump the gun and share it right away?

Conventional wisdom says to share something valuable with the prospect directly instead of seeking permission. If there is any interest, they will check it out anyway. After all, you are only sharing something valuable without asking for any commitment.

On the other hand, if you can get a prospect to respond to a smaller request first, it improves the chances that they will agree to a second, larger request. By getting the prospect to respond to your request to share something, you have gotten your foot in the door.

4. Confirmation Question vs Open-Ended Question

Do prospects respond better to a simple confirmation question or to an open-ended question?

A confirmation question simply requires the prospect to respond with a yes/no, whereas an open-ended question requires your prospects to open up and share more about what is going on with their business.

Both of these types of CTAs should work well as conversation starters, but which one will generate more responses from your prospects?

Here is a sample A/B test for this: “Would it make sense to talk next week?” (confirmation) vs “How are you currently handling outbound prospecting?” (open-ended).

Best Practices for Running a Cold Email A/B Test

Pick Only a Single Variable To Run an A/B Test.

  • While running A/B tests for cold emails, you may be tempted to test many variables simultaneously. But to measure the effect of a particular change, single out one variable and test it alone. If you test more than one variable at a time, it becomes difficult to isolate which variable caused the change in performance.

Pick a Sample Size as Large as Possible.

  • How large should the sample size be? While there is no magic number, the larger the sample, the more reliable and statistically significant your results will be.
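For a rough estimate, the standard two-proportion power calculation gives a ballpark figure for prospects per variant. A minimal sketch in Python (the 20% baseline and 30% target open rates below are hypothetical):

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(p_base, p_target, alpha=0.05, power=0.80):
    """Approximate prospects needed per variant to detect a change
    from rate p_base to rate p_target with a two-sided
    two-proportion test at the given significance and power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_beta = NormalDist().inv_cdf(power)           # desired power
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p_target - p_base) ** 2)

# e.g. detecting a lift in open rate from 20% to 30%
print(sample_size_per_variant(0.20, 0.30))
```

With these hypothetical numbers, the formula lands at roughly 290 prospects per variant; smaller expected lifts require far larger lists.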

Run Your A/B Tests Simultaneously.

  • When running your A/B tests, start the tests simultaneously for both groups. This reduces the chance of your results being skewed by time-based factors.
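A fair simultaneous test also needs a fair split. A minimal sketch in Python: shuffle the list before dividing it, so neither variant is biased by how the list happens to be sorted (the prospect addresses are placeholders).

```python
import random

def split_test_groups(prospects, seed=42):
    """Randomly split a prospect list into two near-equal groups,
    so neither variant inherits a bias from the list's order
    (e.g. alphabetical, or sorted by company size)."""
    shuffled = list(prospects)  # copy; don't mutate the caller's list
    random.Random(seed).shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

prospects = [f"prospect{i}@example.com" for i in range(100)]
group_a, group_b = split_test_groups(prospects)
# send variant A to group_a and variant B to group_b at the same time
```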

Moneyball for Cold Email

Now that you have a playbook for running A/B tests, it is time to unleash the power of data and experimentation to improve your cold email game.

Are your open rates too low?

A/B test your subject lines. Pick a prospect group and run a split test with two different subject lines. See whether asking a relevant question grabs the prospect’s attention, or whether adding a personalization variable works better.

Use a statistical significance calculator to check if you have a clear winner.
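If you would rather compute it yourself, the two-proportion z-test behind most online significance calculators fits in a few lines. A sketch in Python (the open counts below are hypothetical):

```python
from math import sqrt
from statistics import NormalDist

def ab_p_value(success_a, total_a, success_b, total_b):
    """Two-sided two-proportion z-test: p-value for the difference
    between two rates (e.g. opens per email sent). A value below
    ~0.05 suggests a real winner rather than random noise."""
    p_a, p_b = success_a / total_a, success_b / total_b
    p_pool = (success_a + success_b) / (total_a + total_b)  # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# e.g. subject line A: 120/400 opens, subject line B: 90/400 opens
print(f"p-value: {ab_p_value(120, 400, 90, 400):.3f}")
```

Note the normal approximation assumes reasonably large groups; for very small sends, treat borderline p-values with caution.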

Cracked the code to improving email open rates but still need to improve click rates?

Split test your email copy. Test the number of links, their location, the destination of each link, and the underlying offer/value of each destination until your prospects show sufficient interest in engaging with your email copy and your product/offering.

Managed to hit stellar open and click rates? On to the final destination: response rates.

Split test your CTA. Try suggesting specific time slots, or a request to connect with the appropriate individual. Split test open-ended vs confirmation questions. Iterate until enough prospects hit reply and start a conversation.

Over the last few years, lead generation has transformed from art to science. Effective use of data and experimentation can eliminate a lot of the guesswork and provide you with clear and actionable insights to improve your conversion rates.
