Using Split A/B Tests & CRM to Increase Email Clicks & Open Rates
Marketing automation providers often talk about how their solution takes the guesswork out of email marketing when it's connected with CRM.
A great example of this in action is Split A/B testing.
A Split Test allows you to send two or more variations (Version A and Version B) of your email to a sample of your CRM marketing list.
These recipients' interactions with the email, either opening the message or clicking a link, determine the ‘winning’ version, which is then used as the basis for the remaining emails sent to this list.
Individuals react in different ways to the same message and what worked well with an audience for one marketing list might not necessarily perform successfully with another.
There are numerous factors behind the success of an email campaign, but three critical elements can be assessed for each campaign using a Split A/B test in CRM.
Rather than agonising over the format of your email subject line and then putting all your eggs in one basket with a single subject, you can mitigate the risk of a poor open rate by testing a couple of subject lines to evaluate which one works best.
Is it better to have your email sent from the company name, or from a recognisable person's name?
There’s no limit to what you can test with entirely different email templates!
This can include adjusting the layout, altering the tone of the message, applying dynamic content using CRM data, or varying the calls to action to understand which format results in the best click through rate.
I’ll take a closer look at each of these elements to demonstrate how they can be managed in Microsoft Dynamics CRM, starting with the email content...
We’ve created two versions of an email newsletter:
Version A is mainly text based while Version B has a different layout featuring more images.
In this scenario an organisation has previously sent emails that reflect the layout in Version A.
To improve email click-throughs they’ve created a new template design, Version B, which they believe might result in more email clicks, but they can’t be certain it'll perform better than Version A.
Split testing removes this uncertainty by allowing both templates to be used in the next campaign, showing which one really does deliver the best CTR with the target audience rather than relying on a guess.
For example, if we have a marketing list of 4000 contacts we can put 2000 (50%) into a split test.
With this test, 2000 random contacts from our CRM marketing list are placed into the split: 1000 of them will receive Version A of our template and the remaining half of the split test contacts will receive Version B.
Both email versions will be sent at the same time when the campaign is scheduled.
A split test can be evaluated over an hour or two, but to ensure a fully representative analysis I’ll run this one over 24 hours.
That means at the same time the next day, CRM and our marketing automation service will assess whether Version A or Version B achieved the better click-through rate, and the 'winning' format will be applied to send emails to the remaining 2000 contacts in this list.
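The allocation described above can be sketched in a few lines of Python. This is only an illustration of the mechanics, assuming a simple list of contact IDs; in practice Click Dimensions performs this sampling automatically, and the function and field names here are my own.

```python
import random

def allocate_split_test(contacts, test_fraction=0.5, seed=None):
    """Randomly place a fraction of the marketing list into the split test,
    dividing that sample evenly between Version A and Version B.
    The remainder later receives whichever version wins."""
    rng = random.Random(seed)
    shuffled = contacts[:]          # copy so the original list is untouched
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    half = test_size // 2
    return {
        "version_a": shuffled[:half],            # e.g. 1000 contacts
        "version_b": shuffled[half:test_size],   # e.g. 1000 contacts
        "remainder": shuffled[test_size:],       # e.g. 2000 await the winner
    }

groups = allocate_split_test(list(range(4000)), test_fraction=0.5)
print(len(groups["version_a"]), len(groups["version_b"]), len(groups["remainder"]))
# 1000 1000 2000
```

With a 4000-contact list and a 50% test, each version goes to 1000 recipients and the remaining 2000 wait for the winning version.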
A 50% split is a relatively large test. Depending on the number of contacts or leads in your marketing list, a smaller total test sample of 10-40% may be more appropriate.
It is usually recommended not to test too many elements in each email campaign, otherwise it's hard to identify which factor was most significant, insight that is useful when applying these results to plan future campaigns.
However, for the purposes of this example I’ll also test two variations of the email subject line.
Email subject tests can involve titles with:
• Different lengths, e.g. a short subject vs a long one
• Dynamic content using CRM data, for example prefix with a contact's first name or insert an expiry date
• Different calls to action
• Emails that ask a question
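The dynamic-content idea in the list above can be illustrated with a minimal Python sketch. Note this is an assumption for illustration only: real platforms use their own merge-field syntax, and the field names below are hypothetical.

```python
def personalised_subject(contact, template):
    """Fill CRM merge fields into a subject line template.
    Field names are illustrative, not actual Click Dimensions syntax."""
    return template.format(**contact)

# Hypothetical CRM contact record
contact = {"first_name": "Sam", "expiry_date": "31 March"}
print(personalised_subject(contact, "{first_name}, your offer ends {expiry_date}"))
# Sam, your offer ends 31 March
```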
Get some free email subject ideas from OptinMonster for each message type and industry.
Using Click Dimensions and Microsoft Dynamics CRM I've structured a split test with two variations. Version B will test a different subject title and an alternative newsletter design compared to Version A:
Once these variables have been confirmed the precise testing percentages can be defined.
Using the earlier example, I'll put 50% of our audience into this split test; half of this test group will receive emails for Strategy A and the other half Strategy B.
For this test I want to maximise the Click Through Rate, so this will be selected as the criterion; in another scenario the split test might evaluate the email Open Rate.
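The winner-selection step can be sketched as a simple comparison. This is a minimal illustration of the logic, not how Click Dimensions is implemented; the result figures below are invented for the example.

```python
def pick_winner(results, criterion="click_through_rate"):
    """Return the version with the higher value for the chosen
    criterion: 'click_through_rate' or 'open_rate'."""
    def rate(stats):
        if criterion == "click_through_rate":
            return stats["clicks"] / stats["delivered"]
        return stats["opens"] / stats["delivered"]
    return max(results, key=lambda version: rate(results[version]))

# Hypothetical 24-hour results for each 1000-contact test group
results = {
    "A": {"delivered": 1000, "opens": 320, "clicks": 41},
    "B": {"delivered": 1000, "opens": 310, "clicks": 67},
}
print(pick_winner(results))                         # B (higher CTR)
print(pick_winner(results, criterion="open_rate"))  # A (higher open rate)
```

Note that the two criteria can disagree, as here: Version A opens slightly better but Version B earns far more clicks, which is why the criterion must be chosen up front.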
The great thing about marketing automation integration with CRM is that this test is fully automated.
For this example, Click Dimensions will select a winner between Strategy A and B after 24 hours. If we schedule this email for 11am next Tuesday, the split test emails will be sent then. The split test will end at 11am on Wednesday, at which time the winning strategy will be confirmed and automatically applied to the remaining emails sent to this audience:
Once every email in this campaign has been sent, the Split Test results are confirmed, enabling marketing teams to understand what worked well so this insight can be replicated in future campaigns.
In this example, both versions resulted in good open rates but Version B proved significantly more successful in its click-through rate, so this strategy was used for the remaining emails sent:
Individual A/B Test actions are also written back to Microsoft Dynamics CRM records by Click Dimensions which means you can dig even deeper to understand how different segments of your audience reacted. For example, did one gender or age group react better to a subject title on one version?
As shown below the split strategy is highlighted in CRM for each email sent, as well as every email action resulting from an A/B Split Test:
As mentioned above, the email sender name can also be used as the basis for a split.
In the example below I've defined a split test with one version using a company name as the sender and the other message being sent from the business owner. This split test also uses different email content reflecting the different sender personalisation.
The latter is more personalised so it might result in a better open rate, but by running an A/B Test in CRM we can see which actually works best for a specific audience rather than committing all of our emails to one format in the hope we've made the best choice.
In this example I've focused on Click Dimensions which supports a maximum of two split test variations.
Dotmailer handles split tests in a similar manner but enables more than two split test strategies, so if required this would support multiple versions, for example testing three or more different email subject titles: