How to Write the Perfect Email Signup Text (Using Data)

By Shaun Tinney
4 minute read | Last Updated: June 24, 2016

Just because something works for one brand doesn’t mean it will work for another.

Writing an effective call to action (CTA) on your site is a difficult task. It’s easy to find lists of what has worked well for someone else. It’s even easy to find test-backed, tried-and-true winners. But what you can’t Google is whether those winning headlines and CTAs will work for your brand. Searching for “the most effective email signup CTAs” is a fine way to find ideas, but the only way to determine the best copy for your brand is to test it.

Let’s look at the humble email newsletter signup box common to so many site footers. You know, the email box with a headline and a button:

Email Signup Box

We ran tests on the same combinations of elements across three brands, experimenting with:

  • The call-to-action headline
  • The copy on the submit button

The variations for the headline were:

  • (Original) — whatever the brand was already using
  • (Sign Up) — “Sign up for exclusive email offers”
  • (Get) — “Get exclusive email offers”

The variations for the button were:

  • (Original) — whatever the brand was already using
  • “Sign Up”
  • “Submit”

Take a look at the impact these changes had on each brand below. The solid color boxes indicate statistically conclusive results of 99% or greater.
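“Statistically conclusive at 99% or greater” means the observed difference in signup rate is very unlikely to be random noise. One common way to check this (a sketch, not necessarily the exact method used in these tests) is a two-proportion z-test; the signup counts below are hypothetical:

```python
from math import erf, sqrt

def conclusive_at_99(signups_a, visitors_a, signups_b, visitors_b):
    """Two-proportion z-test: True if the difference in signup
    rates is significant at the 99% level (two-sided)."""
    p_a = signups_a / visitors_a
    p_b = signups_b / visitors_b
    # Pooled rate under the null hypothesis of "no difference"
    p_pool = (signups_a + signups_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_value < 0.01

# Hypothetical numbers: control 120/6000 signups, variant 180/6000
print(conclusive_at_99(120, 6000, 180, 6000))  # True
```

With smaller samples or smaller differences, the same function returns False, which is why “inconclusive” boxes show up alongside the solid-color winners.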

Brand #1 Results

Almost every combination for the email signup box created some lift in signups. Only the “Sign up for exclusive email offers” headline with the “Sign Up” button dropped signups, by about 8%. The clear winner was the “Get exclusive email offers” headline paired with the “Submit” button.
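Lift figures like these are relative changes in conversion rate, not absolute percentage-point changes. A quick sketch with hypothetical rates:

```python
def lift(control_rate, variant_rate):
    """Relative change in signup rate, as a percentage of the control."""
    return (variant_rate - control_rate) / control_rate * 100

# Hypothetical: control converts at 2.0%, variant at 1.84%
print(round(lift(0.020, 0.0184), 1))  # -8.0 (an ~8% drop)
```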

Email Tests 3

Based on these very promising results, we set up the same email signup test for two additional brands of a similar size within the same industry as the first brand.

Brand #2 Results

The email signup tests for this brand showed declines across the board, with the singular and surprising exception of the “Get exclusive email offers” headline and “Submit” button combo, which boosted signups by 88%. The same headline with a “Sign Up” button decreased signups by 38%.

Compared to Brand #1, these results are unexpected, but actionable.

Email Tests 1

Brand #3 Results

After two very strong showings from the same set of headline and button combinations for the email signups, we were pretty confident that we’d see similar results from the same experiment with a third brand. You can imagine our surprise when we found that every variation performed worse than the original, three of them with 99.99% statistical significance. Ouch.

Email Tests 2

Results like these are not uncommon, but they highlight just how important it is to test site changes rather than just agreeing on them internally and never following up after launch.

One more variable that we didn’t include in these tests is button color. Search for the “best button color to increase conversions” and red comes up again and again as the popular choice. That doesn’t mean you should simply make your important buttons red. Just as each of the brands above saw wildly different signup results from headline and button copy changes alone, your results will vary.

With color changes, the idea is to test making a button more noticeable as a point of action. Sometimes that means simply making it a different color from everything else on the page. If your whole brand motif is centered on red, following the popular advice to use red buttons won’t really help you: if the button blends in, people are less likely to notice it.

A Better Way

If you’ve been relying on the results of someone else’s tests, or simply internal stakeholder consensus to make decisions on your site, we’d like to propose a new approach:

  1. Start with the research and best practices
  2. Customize everything for your brand
  3. Test each version with your customers
  4. Launch the winners that your customers choose
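For step 3, the essential mechanical piece is splitting visitors across variations consistently, so each person keeps seeing the same copy on repeat visits. A minimal sketch (the variant names are hypothetical; real A/B testing tools handle this for you):

```python
import hashlib

# Hypothetical variant labels for the headline test described above
VARIANTS = ["original", "sign-up-headline", "get-headline"]

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into one variant by
    hashing their id, so repeat visits always get the same copy."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

print(assign_variant("visitor-42"))
```

Hash-based bucketing avoids storing an assignment table: the same visitor id always maps to the same variant, and large traffic splits roughly evenly across the list.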

The most important lesson to take from the results of these tests is that there is no single “right” answer as to what will increase conversions on your site. Making site changes based on best practices is great, but ultimately the only one who can choose the best version of a site feature is your customer.

Don’t be afraid of an experiment “failing.” The possibility that something will perform better or worse than an existing element is built into the testing and iteration process. In fact, it’s the only way to guarantee better results in the future.
