How to refine and test your subject lines, pre-headers, personalization, and more
It can take a lot of trial and error to get your email marketing strategy delivering the results you want. The last thing you want is a scattershot approach: throwing out everything you can and seeing what sticks. If you treat each campaign separately or make dramatic changes every time, you'll confuse your audience and miss out on valuable data that could help you improve. So, how do you go about effectively testing your email campaigns?
This is where optimization comes in. In this blog series so far, we've covered copywriting, email design, compliance issues, and segmentation. However, these are the creative and logistical aspects of email marketing. Now, we're ready to talk about the strategy behind your emails, and how you can push your campaign results from average to extraordinary.
Email Campaign Best Practices
Before we get into your overall implementation and testing strategy, let's talk about the little tweaks that can improve your performance. These fall a bit outside the realm of email design and copywriting. Instead, they're tactics for improving each email's performance across various providers and devices.
Shorten your subject line
In Part II of this blog series, we talked about the importance of brevity in subject lines. We focused on the readability and conciseness of short subject lines, but there's another, more technical reason to keep them short and sweet: most email clients will only show a certain number of characters.
Desktop clients may show up to 78-80 characters of a subject line, but this depends on how wide the user has set their message pane, the size of their computer screen, etc.
Mobile clients typically show between 38 and 42 characters of a subject line. As you can see, that's about half of what you'd see on desktop!
So, you can't be sure how much of your subject line your recipients are seeing. If they're checking email on their phone, they could miss out on half your expertly written copy!
The solution, of course, is to keep subject lines as short as possible: aim for roughly 40 characters or fewer. One commonly cited optimum is about 41 characters, or seven words. This gives you enough space to entice opens without losing valuable information.
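To make the character budget concrete, here's a minimal sketch that flags subject lines at risk of being cut off on mobile. The 40-character limit is an approximation based on the ranges above; real clients vary, and the helper name is our own:

```python
# A rough pre-send check for subject-line length. MOBILE_LIMIT is an
# approximation of the ~38-42 character window typical mobile clients show.

MOBILE_LIMIT = 40

def check_subject(subject: str, limit: int = MOBILE_LIMIT) -> dict:
    """Return the length, whether it fits on mobile, and the visible preview."""
    return {
        "length": len(subject),
        "fits_mobile": len(subject) <= limit,
        "mobile_preview": subject[:limit],
    }

result = check_subject("Last chance: 20% off everything in our spring sale today only!")
print(result["length"], result["fits_mobile"])  # 62 False
print(result["mobile_preview"])  # what a mobile reader would actually see
```

Running a check like this on every draft makes truncation visible before your subscribers ever see it.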
Check your code when testing your email campaigns
Especially when you're first building your sender reputation, you can't afford any mistakes. Poorly designed emails are often flagged as spam, and if your emails are not user-friendly, your recipients may mark them as spam as well. The more reports you get, the more your sender reputation declines, which could lead to your domain being blacklisted by certain email providers.
Play it safe. Before you ever send your first campaign, ensure that your emails are "clean" from top to bottom. This means avoiding the spam trigger words we listed in Part II of this series, adding your GDPR- and CAN-SPAM-compliant unsubscribe link, and checking your email's code for errors.
Common coding and formatting problems that can trigger spam filters include:
- Too many different colors and fonts
- Subject lines with all uppercase letters
- Hyperlinks to questionable websites (especially as image sources)
- Cluttered code with HTML errors
Don't fret too much if you have a misplaced <div> tag; a minor error won't land you in the spam box. But when it's so easy to validate your code and check for any simple mistakes, there's no point in skipping these checks before you hit "send."
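A dedicated validator is the most thorough option, but even a small script can catch the kind of mismatched <div> mentioned above. Here's an illustrative sketch using Python's standard-library HTML parser; the helper names are our own:

```python
# A quick tag-balance check: walks the HTML and reports any tags that
# are closed out of order or never closed at all.
from html.parser import HTMLParser

# Void elements are self-closing by definition and never need an end tag.
VOID_TAGS = {"br", "img", "hr", "meta", "link", "input", "area",
             "base", "col", "embed", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []   # currently open tags
        self.errors = []  # mismatches found so far

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

def find_tag_errors(html: str) -> list:
    checker = TagBalanceChecker()
    checker.feed(html)
    checker.close()
    return checker.errors + [f"unclosed <{t}>" for t in checker.stack]

print(find_tag_errors("<div><p>Hello</p>"))  # ['unclosed <div>']
```

This won't catch every formatting sin a spam filter dislikes, but it flags the structural HTML errors cheaply before you send.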
Use emoji in your subject line, but not too many
Multiple studies have shown that most marketing emails do not contain emoji in their subject lines; only about five to seven percent do. This can make emoji seem like an easy way to grab your audience's attention. If most of your competition isn't using emoji, shouldn't you include them to stand out?
Not necessarily. First, it's important to note that emoji appear in different forms across various platforms. Some email clients, such as Gmail, will convert the emoji into a different image that doesn't look as good. There are significant differences between Mac/iOS emoji and their Android counterparts as well.
This means you should use the Unicode symbols whenever possible to ensure cross-platform compatibility. Unfortunately, these sometimes don't look as good as a platform-specific emoji.
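In practice, "using the Unicode symbol" just means inserting the character by its code point rather than pasting platform-specific artwork. A tiny illustrative example (the envelope emoji here is an arbitrary choice):

```python
# Specifying an emoji by Unicode code point keeps the source unambiguous;
# each client still renders its own artwork for that code point.

ENVELOPE = "\U0001F4E9"  # U+1F4E9 ENVELOPE WITH DOWNWARDS ARROW ABOVE
subject = f"{ENVELOPE} Your March newsletter is here"

print(subject)
print(f"U+{ord(ENVELOPE):04X}")  # U+1F4E9
```

The code point is what travels in the email; how it looks on arrival is up to the recipient's platform, which is exactly why testing across clients matters.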
The bigger problem is that emoji may give an unprofessional or spammy vibe, especially for marketing emails. Some research has found that using emoji actually hurt open rates and increased the number of abuse reports for legitimate emails. Others have noted that emoji may boost the open rate slightly, but only if the subject line is already compelling.
In other words, always make sure your copy is well written before adding an emoji. Avoid using an emoji to replace a word. And if emoji don't suit your campaign's purpose, it's better to skip them altogether. (Administrative emails, such as invoice notices, must seem professional and therefore aren't a good fit for emoji.)
Ultimately, it depends on your brand. We recommend:
- only using emoji in campaigns sent to well-established audiences or segments with a high level of engagement
- testing your emoji to ensure they appear similarly across providers and devices
- only using emoji to enhance subject lines, and avoiding them if they don't add anything to your copy
- choosing emoji that express emotion or symbolize a benefit, such as one of the many face emoji or the "strong arm," rather than generic or alarming symbols such as warning signs or exclamation points
When deciding whether or not to use emoji, or which emoji to choose, it's important to test variations. This brings us to the bread and butter of email strategy improvement: A/B testing.
Secrets of A/B Testing Your Email Campaigns
It's rare to get an email campaign perfect the first time you send it out. No matter how well you understand your audience or craft your creative, there will always be something to tweak. Factors as small as the color of your CTA button can impact your results.
This is where A/B testing comes in. For each test, choose one factor and create two versions of it. For example, you can test to see if a red or blue CTA button delivers a higher clickthrough rate. You can also test variations in the copy, personalization, and header image.
Before you start testing, note that you need an audience of at least 600 to yield statistically significant results. If you're not there yet, though, there's no harm in developing a testing methodology so you can get into the habit. Once you start testing regularly, you'll find that a simple automated test gives lots of valuable insights. By using the winning test factor, you can potentially boost your open and clickthrough rates by up to 10 percent!
Factors to include in A/B testing
Most marketers start by testing the subject line. After all, this is what convinces your recipients to open your email! It's a good idea to A/B test all your subject lines until you know which characteristics perform best.
Your marketing automation platform can A/B test for you. This typically involves sending your campaign to only part of your audience (usually 20%). Half this portion sees version A; the other half sees version B. The system measures which version yields a higher open rate (or whichever metric you choose), then sends the campaign with the winning version to the rest of your audience.
Again, you need at least 600 recipients to gain valuable insights from an A/B test, so be sure that the testing portion of your audience is large enough. If your list is only 1,200 people, for example, a 20% test group is just 240 recipients, well below the threshold. In that case, split your audience 50-50 and test the halves against each other.
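The split logic above can be sketched in a few lines. This is a hypothetical helper, not any specific platform's API; the 20% test portion and 600-recipient minimum come from the discussion above:

```python
# Plan an A/B split: test on 20% of the list, but fall back to a full
# 50-50 split when 20% would drop below the minimum useful test size.
import random

MIN_TEST_SIZE = 600  # minimum recipients for a meaningful test
TEST_FRACTION = 0.20

def plan_ab_split(recipients: list) -> dict:
    shuffled = recipients[:]        # don't mutate the caller's list
    random.shuffle(shuffled)        # randomize assignment
    test_size = int(len(shuffled) * TEST_FRACTION)
    if test_size < MIN_TEST_SIZE:
        test_size = len(shuffled)   # 50-50: the whole list is the test
    half = test_size // 2
    return {
        "version_a": shuffled[:half],
        "version_b": shuffled[half:test_size],
        "remainder": shuffled[test_size:],  # later gets the winning version
    }

plan = plan_ab_split([f"user{i}@example.com" for i in range(1200)])
print(len(plan["version_a"]), len(plan["version_b"]), len(plan["remainder"]))
# 600 600 0  -> the whole 1,200-person list is split 50-50
```

With a 10,000-person list, the same function would instead hold back 8,000 recipients for the winning version.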
Subject line characteristics you can A/B test:
- inclusion of emoji or not
- two different emoji
- personalization or not (learn more about personalization options and best practices in Part I of this series)
- mention of a number or price
- copywriting style (check out our tips for good subject line copywriting in Part II of this series):
  - question vs. statement
  - active vs. passive voice
  - urgent vs. not urgent
Email design characteristics you can A/B test:
- header image
- header copy
  - personalization or not
  - question vs. statement
- body copy
  - personalization or not
  - line spacing
- placement of CTA button
- color of CTA button
- single-column vs. grid
Developing your testing strategy
You can only A/B test one factor at a time, which means you're limited in how many times you can test a given campaign. You don't want to send duplicate campaigns to your list, which is why it's crucial to segment your list first. (Learn more about audience segmentation in Part III of this series.) Then, choose your testing factors and how each test will lead into the next.
To decide which initial factor to test, determine your campaign's primary goal. Do you want recipients to click through to a landing page? If so, you'll want to A/B test your CTA copy, color, and/or placement (but only one variable at a time). Are you sending a general engagement campaign and just want a good open rate? A/B test various aspects of your subject line (again, one at a time).
Let's be clear: only one testing factor at a time!
Be sure that you are not introducing multiple variables into your two subject lines. For example, if you want to test whether or not emoji will boost the open rate, that should be the only difference between your A and B subject lines. If you also change the language in version B or add personalization, you've just introduced two more testing variables, and your results will be unclear.
Step 1: Make your hypothesis.
Once you've decided on your testing variable, develop a hypothesis that's easily proven or disproved. For example, you may predict that a red CTA button will get more clicks than a blue one. This is quite easy to test; either it does, or it doesn't. Or, the difference is insignificant (more on that in a moment).
Step 2: Set your goal.
Decide which results would prove or disprove your hypothesis. This goes hand in hand with a campaign goal (usually the value of a specific metric). Before setting your goals, establish your baseline. If you haven't started your email marketing strategy yet, you likely won't have this. That's okay; you can use the comparable benchmark for your industry.
Then, choose a metric to test against. Perhaps you'd like to boost your clickthrough rate by 10%; you're A/B testing to try to achieve that rate. Even if you don't meet that goal, it's worth noting as you continue to test, so you can see which factors make more of an impact on your desired results.
For example, let's say you predict that a red CTA button will perform better than a blue one, and you aim to boost your clickthrough rate by 10% compared to your baseline. Your A/B test reveals that your red CTA gets more clicks, but only 5% more than your baseline. Now you know that you should use a red CTA, but you also need to improve another factor to reach your goal.
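The arithmetic in that scenario is worth spelling out. The 4% baseline below is an illustrative figure, not from the article:

```python
# Worked numbers for the red-vs-blue CTA scenario above.
baseline_ctr = 0.040             # illustrative 4% baseline clickthrough rate
goal_ctr = baseline_ctr * 1.10   # target: 10% above baseline -> 4.4%
red_ctr = baseline_ctr * 1.05    # red CTA wins, but only 5% above -> 4.2%

print(f"goal {goal_ctr:.2%}, red button {red_ctr:.2%}")
print(red_ctr >= goal_ctr)  # False -> another factor must improve too
```

Note that "10% above baseline" is a relative lift (4.0% to 4.4%), not ten percentage points; confusing the two is a common reporting mistake.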
Step 3: Choose what you will do with your test results.
Whether your hypothesis was correct and your red CTA got more clicks, or your blue CTA actually performed better, you'll want to keep the winning color for all future campaigns. You can now test other factors, such as the placement of the button or the headline that appears above it. As long as you only test one factor at a time, you can continually test in a stepwise fashion.
If neither color performed better, you have two choices. You can A/B test another factor that influences your clickthrough rate, or you can test two other colors. Usually, though, you'll want to choose another factor that impacts clickthrough, such as the actual copy of the button or where it's placed in the email. Those factors may outweigh the impact of the button color. Don't get too caught up in testing a particular factor if you're not seeing significant variation between the A and B variables.
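How do you decide whether the variation between A and B is "significant"? The article doesn't prescribe a method, but a conventional two-proportion z-test is one way to sketch it, using only the standard library:

```python
# A rough significance check (two-proportion z-test, normal approximation):
# did version A's click rate really differ from version B's, or is the
# gap plausibly just noise? The 0.05 threshold is conventional.
import math

def ab_significant(clicks_a, n_a, clicks_b, n_b, alpha=0.05):
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)      # rate if no difference
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_value < alpha

# 90/600 clicks vs 60/600 clicks: a clear difference
print(ab_significant(90, 600, 60, 600))  # True
```

If a call like this returns False, treat the test as a tie and move on to a different factor, as described above.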
Step 4: Test and document regularly.
Throughout your testing scheme, document your results, including the winning factor and its effect on your metrics. Over time, you'll see a pattern emerge, e.g. red CTAs always perform better, but you only meet your clickthrough goal if they have concise button copy as well.
Keep detailed notes on the hypothesis, goal, and results of each test. After several months of testing, you'll have a good idea of what works for your audience. For example, you may determine that you get the best clickthrough rates when you use a red CTA button with very short copy at the top of your email design.
That said, testing never truly ends (sorry!). As your audience shifts and expands, so will their preferences. You may also see different trends in various types of campaigns or for varying audience segments. That's why it's so important to document everything so you can make informed decisions for each campaign.
Wrapping Up
Email marketing success is hardly cut-and-dried. What works well for one brand may not work well for you. As you develop your strategy, you'll find the right combination of copy, design, structure, and testing that works for you. The key is to have a system that incorporates the best practices we've discussed: consistent copywriting and design, full compliance with customer privacy laws, and a robust testing methodology. Whew! Don't worry, it's doable.
A marketing automation platform such as SharpSpring makes it easier to manage and test your email strategy. Learn more about our features or book a demo today.