A/B Experimenting: Perform Tests That Can’t Be Ignored


By: Miles Gotcher

Posted: October 2, 2013 | Testing

If you want to convince people to change the way they do something, you’d better make a solid argument as to why. That’s a lesson I’ve learned from A/B testing our email marketing campaigns and landing pages here at Marketo. If you can’t defend your test’s validity, pinpoint your variables, and back your results with statistical analysis, nobody will adopt the changes you propose.

So what’s the best way to convince people? Treat your A/B tests like the experiments that they are. Outline your experiment, compile your data, and let the numbers speak for themselves. Need a refresher course in statistical significance? Check out our ultimate guide to test statistics, which will help you explain your results clearly and with confidence — even if your team members are allergic to math.

If your company has been shrugging off your results as coincidental or not applicable, here’s how to frame and present your email and landing page A/B tests:

1)  Isolate a single variable as your test variable, and identify your control

Your control is the unchanged version of your email or landing page – the version you’ve already been using. Choose one variable of the control to isolate and change; this should be whichever variable you think is affecting results. When it’s time to present your findings, it will be crucial to show that you left every other variable unchanged. By changing and testing only one variable per A/B test, you can more precisely attribute any difference in results.

I recently conducted a simple email marketing A/B test, in which I changed a single aspect of our “From” address. In this test, I used our usual “From” address (“Marketo Premium Content”) as my control, and tried a personal “From” address on half of my emails:

Control: the non-personal “From” address (“Marketo Premium Content”)

Test: a personal “From” address

With approximately 1,000 more opens and 500 more clicks, the personal address was much more successful. Our confidence level was 99%, meaning there was only a 1% probability that a difference this large would occur by chance. For a test like this, a confidence level above 95% is commonly considered “statistically significant.” And because I’d isolated only one factor, I could easily identify the reason for the higher number of clicks. Marketo now uses personal names in our email sends whenever we can.
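If you want to see for yourself what “due to chance” means, you can simulate it. The sketch below (in Python, with made-up counts – I’ve only shared the differences above, not our raw totals) pools both groups’ outcomes and re-splits them at random; if random re-splits almost never produce a gap as large as the one observed, the result is statistically significant.

```python
import random

# Made-up counts for illustration -- the post shares only the differences
# (roughly 1,000 more opens, 500 more clicks), not the raw totals.
control_sends, control_opens = 5000, 1000   # generic "From" address
test_sends, test_opens = 5000, 1200         # personal "From" address

observed_gap = test_opens / test_sends - control_opens / control_sends

# Pool every recipient's outcome (1 = opened, 0 = didn't), then
# repeatedly re-split the pool at random and measure the gap.
outcomes = [1] * (control_opens + test_opens)
outcomes += [0] * (control_sends + test_sends - len(outcomes))

trials, lucky = 1000, 0
for _ in range(trials):
    random.shuffle(outcomes)
    gap = (sum(outcomes[:test_sends]) - sum(outcomes[test_sends:])) / test_sends
    if gap >= observed_gap:
        lucky += 1

print(f"Observed gap in open rate: {observed_gap:.1%}")
print(f"Random re-splits that matched it: {lucky} of {trials}")  # ~0 of 1,000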

2)  Formulate a hypothesis

Before you test, predict which performance metrics you believe the change will affect, how, and why. It’s important to be open to any result – this will help you convince your team that your findings are scientific, rather than based on your intuition. For example, I recently ran an A/B test on two versions of a follow-up email – one with a banner image, and one without.


My hypothesis was that the email with a banner image would have a higher click-to-open rate: the banner makes the email more dynamic and exciting, and gives the reader a second “Download Now” button. In fact, the email without a banner showed a 12% increase in click-to-open rate (at a 99% confidence level). The results were surprising, but compelling.
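For clarity: click-to-open rate is simply unique clicks divided by unique opens, so the lift is easy to compute once you have both counts. A quick sketch, with made-up totals (the real ones aren’t shown here):

```python
def click_to_open_rate(unique_clicks, unique_opens):
    """Of the recipients who opened the email, the share who clicked."""
    return unique_clicks / unique_opens

# Illustrative counts -- the post reports the relative lift, not the totals.
with_banner = click_to_open_rate(400, 4000)      # 10.0%
without_banner = click_to_open_rate(448, 4000)   # 11.2%
lift = (without_banner - with_banner) / with_banner
print(f"Lift from removing the banner: {lift:.0%}")  # 12%
```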

3)  Use a large enough sample size

The larger the sample size, the more reliable your results will be. That said, you want to start using the winning version as soon as you can, and if you start too big, half of a huge group will receive the less effective email. The best practice is to start on the smaller side, keeping in mind that you can always repeat an email test if your company needs extra convincing, or leave a landing page test running a little longer to build up the sample size.

Unsure how large your sample needs to be? Find out how many versions to run, and how many days to run each one, with our Landing Page Split Calculator.
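If you’d rather estimate it yourself, the standard two-proportion approximation gives a rough minimum sample size. Here’s a minimal Python sketch – the function name and defaults are my own for illustration, not our calculator’s:

```python
from statistics import NormalDist

def sample_size_per_version(baseline_rate, min_lift, confidence=0.95, power=0.8):
    """Rough recipients needed per version to detect an absolute lift
    in a rate, using the standard two-proportion approximation."""
    z_conf = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # 1.96 for 95%
    z_power = NormalDist().inv_cdf(power)                    # 0.84 for 80% power
    p_avg = baseline_rate + min_lift / 2
    return int(2 * p_avg * (1 - p_avg) * (z_conf + z_power) ** 2 / min_lift ** 2) + 1

# To detect a 2-point lift over a 20% open rate at 95% confidence:
print(sample_size_per_version(0.20, 0.02))  # roughly 6,500 per version
```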

4)  Eliminate the possibility of confounding variables

Be sure to control for any extraneous factors that could affect your results – this takes Step 1 (isolating a single variable) into more depth. It isn’t enough to leave your control unchanged: you also need to send both versions of your email at the same time, run both versions of your landing page simultaneously, and make sure the test is randomized. For example, if you’re using Marketo, you can send the two versions of your email to a random 50/50 split of your designated mailing list, and have traffic to your landing page URL split equally between the two versions of the page.
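Marketo handles the random split for you, but the idea is simple enough to sketch. Here’s roughly what a randomized 50/50 assignment might look like if you scripted it yourself (a toy example, not Marketo’s implementation):

```python
import random

def random_split(recipients, seed=None):
    """Shuffle the list so assignment doesn't follow signup date,
    alphabetical order, or anything else that could bias the test."""
    rng = random.Random(seed)
    shuffled = list(recipients)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]   # (control half, test half)

control_group, test_group = random_split(["a@example.com", "b@example.com",
                                          "c@example.com", "d@example.com"])
```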

5)  Draw appropriate conclusions

Now it’s time to look at the results and see what you can learn. First, determine whether a statistically significant difference exists between your two versions. Instead of performing the significance tests yourself, you can simply search online for an “A/B testing significance calculator” or an “email testing significance calculator.” It’s never a bad idea to use two different online calculators to double-check your results.

For instance, I recently ran a test to determine the best CTA placement in an email. In this subtle but effective test, the left-aligned CTA button earned 90 more clicks than the center-aligned one. I used a calculator to determine that these results had a 95% confidence level, making the left-aligned CTA button a clear winner.
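Those online calculators are typically running a two-proportion z-test under the hood. Here’s a sketch of the same computation – the totals below are illustrative, chosen so a 90-click gap lands near a 95% confidence level; they aren’t my campaign’s actual numbers:

```python
from statistics import NormalDist

def confidence_level(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test: confidence that the two click rates differ."""
    p_a, p_b = clicks_a / sends_a, clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    std_err = (pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b)) ** 0.5
    z = abs(p_a - p_b) / std_err
    return 2 * NormalDist().cdf(z) - 1   # two-sided confidence

# Illustrative totals: a 90-click gap on 5,000 sends per version.
print(f"{confidence_level(1545, 5000, 1455, 5000):.1%}")  # ~95%
```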


A confidence level of 95% or more is a solid threshold for picking a winner. If your results weren’t significant, consider trying again with a larger sample size. If they were, think carefully about where else they apply: the results of a test on a tradeshow follow-up email, for example, may not apply to an email invitation for a webinar.

To Summarize:

Your goal is to make sure that your testing efforts aren’t in vain, and to improve the performance of your team’s emails and landing pages. When you’ve eliminated all confounding variables, backed your data with a large enough sample, and drawn appropriately scoped conclusions, you’ll be able to present your findings with confidence.

If you can say, “The test version of our webinar registration landing page showed a significantly higher conversion rate than the control, with a 99% confidence level,” people will pay attention to your recommendation. When it comes to A/B testing, it’s hard to argue with a scientific approach.


Miles Gotcher is a Marketing Programs Coordinator at Marketo, where he supports program efforts for the Demand Generation team while leading the A/B testing of marketing emails and landing pages for Marketo. Miles graduated from Washington and Lee University with a B.A. in Psychology.
