Small changes to your website can have a significant impact on your lead generation results, but relying on gut feeling, marketing smarts and past experience alone to direct optimization efforts doesn’t inspire confidence in any metrics-driven business. Enter A/B testing, a simple and fast way to learn what changes will make a measurable difference to your website traffic and lead conversion. This article gives a quick definition, tells the truth about who shouldn’t use it, and offers 10 guidelines so you can start your A/B testing at the head of the class.
What is an A/B test?
Also known as a “split test,” an A/B test is a way to determine which version of your marketing content (think: call-to-action button, ad headline, hero image, form, body copy) performs the best. The name “A/B” comes from the practice of pitting two versions of the same asset against each other to see which one wins. A big advantage of A/B testing is decision-making that’s based on actual customer behaviour, not on guesses (no matter how educated we may think they are). It can be an important way to improve lead generation.
Is A/B testing for everyone?
Ummm, no. As Peep Laja, founder of CXL (formerly ConversionXL), says, if you have fewer than 1000 conversions on whatever you want to test, A/B testing probably isn’t worth it. That’s because you won’t have the volume of data required to know whether the results actually mean anything. Let’s say you get 260 conversions on one version of a CTA and 240 on another. Yes, the first version got 8% more conversions. But that difference could just as easily be due to chance as to a true preference for that CTA, since both the sample size and the percentage difference are small.
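To see why, here’s a minimal back-of-the-envelope check in Python, assuming (hypothetically) that 5,000 visitors saw each version. A standard two-proportion z-test puts the p-value around 0.36, nowhere near the 0.05 threshold most testers use:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))      # two-sided p-value
    return z, p_value

# 260 vs. 240 conversions, assuming 5,000 visitors saw each version
z, p = two_proportion_z_test(260, 5000, 240, 5000)
print(f"z = {z:.2f}, p-value = {p:.2f}")            # p ≈ 0.36: far from significant
```

A p-value that high means a result like this would happen by pure chance more than a third of the time, which is exactly why low-volume tests so often mislead.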
10 guidelines for effective A/B testing
If you have the volume of conversions to make A/B testing worthwhile, these practices will help you get the most from your efforts.
#1 Conduct one test at a time. Start with one asset, such as a landing page or an ad or an email. If you change more than one asset at a time, you won’t know which change resulted in an increase or decrease in performance. Focus your efforts so you'll know where you’re making an impact.
#2 Test one variable at a time. Isolate one element for your test, make a notable change, measure the impact, and decide whether or not the change was successful based on the results of your test.
For example, if your asset is a landing page, don’t change the form and the CTA or the headline and the image during the same test. Choose one element—let’s say the number of fields on the form—and test that. Once you’ve acted on the results from that test, you can test a variation of the form headline or the colour of the submit button.
That said, the one element you test could be the entire page, pitting one complete variation against another. This kind of high-level A/B test can yield dramatic results.
#3 Test what matters. There are thousands of A/B tests you could perform to improve your lead generation, but most of them won’t make a meaningful difference to your bottom line.
The key to ROI when it comes to A/B testing is to be strategic about the assets you test. Use other data (Google Analytics, heat mapping tools, user surveys, even intel from your sales team) to determine which pages are the most important to your sales process and do tests on them first.
If you test one element (say, the headline of a priority landing page) and the impact on conversions is negligible at best, don’t just try another headline. Move on to A/B testing the form or the CTA to see if you can improve those results.
#4 Set up a control and treatment. Always test against the original element (called “the control”) so you know whether what you had was working. The option you expect to perform better is called “the treatment.” If you ignore your original and test two new treatments, you’ll never know if leaving things as they were was the best option.
#5 Set a time frame. Set a time frame that captures the variability of time of day, day of week and even time of month. Make the time period long enough to reach a statistically significant sample size, but not so long that external anomalies, such as holidays and seasons, will affect user behaviour and give you inconsistent results. Also avoid testing during important events, for example, Christmas or—for B2B sales—fiscal year-end. Two to four weeks is often long enough, provided you can be confident your sample size is large enough (see #9).
#6 Measure the impact on sales. Sure, your A/B test might have a positive impact on your landing page conversion rate, but how about your sales numbers? Look further into your sales funnel for evidence that option A or B is performing better. Consider metrics such as visits, click-through rates, leads, traffic-to-lead conversion rates, demo requests and sales. It could be that a landing page that got fewer conversions actually resulted in more sales, perhaps because the leads were better qualified, or because the highly clicked page over-promised and then disappointed.
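As a toy illustration, here’s how you might tally per-variant funnel rates from an event log. The stages and records below are hypothetical stand-ins for whatever your analytics platform exports:

```python
from collections import Counter

# Hypothetical event log: one (variant, stage) record per visitor per stage reached
events = [
    ("A", "visit"), ("A", "lead"), ("A", "sale"),
    ("B", "visit"), ("B", "lead"),
    # ...thousands more records in a real export
]

counts = Counter(events)
for variant in ("A", "B"):
    visits = counts[(variant, "visit")]
    leads = counts[(variant, "lead")]
    sales = counts[(variant, "sale")]
    print(f"{variant}: visit-to-lead {leads / visits:.0%}, leads-to-sales {sales}/{leads}")
```

Comparing the stages side by side is what surfaces the “fewer conversions, more sales” pattern described above.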
#7 Split your sample group randomly. To achieve conclusive results, you need to test with two or more audiences that are equivalent. If you’re testing a variable on a landing page but the traffic to your control page comes mostly from organic search while the traffic to your treatment page is almost all paid, you may draw conclusions about the performance of each that reflect who came to the page rather than their experience once they arrived.
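Most testing tools handle this split for you, but here’s a minimal sketch of one common approach: hash a stable visitor ID so every visitor lands in the same bucket on every visit, regardless of which channel brought them in. The visitor ID and experiment name here are hypothetical:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "cta-test") -> str:
    """Deterministically assign a visitor to variant A or B.

    Hashing the visitor ID together with the experiment name gives every
    visitor a stable 50/50 assignment that ignores how they reached the page.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100          # spreads visitors evenly across 0..99
    return "A" if bucket < 50 else "B"

print(assign_variant("visitor-42"))   # the same visitor always gets the same variant
```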
#8 Test at the same time. A/B testing requires you to run the two variations at the same time. Without simultaneous testing, you may be left second-guessing whether your results were due to some external factor that was in play during one testing time but not the other. Was it the time of year? The weather? An important local, national or global event? Control for these variations by testing at the same time.
#9 Define what constitutes a “significant” result. Interpreting the results of A/B testing is about math. (Sorry.) Before you begin, ask yourself if the asset you want to test will get enough views or clicks in a short enough period of time to even warrant testing (like Peep says, aim for 1000). If it will, use A/B testing statistics to make sure you’re interpreting the results correctly and can use them to make data-informed decisions. Of course, there are testing tools that will do the math for you. Use them properly so you don’t inadvertently claim (or ignore) a winner.
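If you’d like to see the math, here’s a minimal sketch of the standard sample-size estimate for comparing two conversion rates. The 5% baseline and 6% target below are hypothetical, and 95% confidence with 80% power are common defaults rather than magic numbers:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(p_base, p_target, alpha=0.05, power=0.80):
    """Rough visitors needed per variant for a two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ≈ 1.96 at 95% confidence
    z_beta = NormalDist().inv_cdf(power)            # ≈ 0.84 at 80% power
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p_base - p_target) ** 2)

# Hypothetical: 5% baseline conversion rate, hoping to detect a lift to 6%
print(sample_size_per_variant(0.05, 0.06))   # roughly 8,100+ visitors per variant
```

Divide that per-variant number by your typical daily traffic to sanity-check whether the two-to-four-week window from guideline #5 is realistic for your site.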
#10 Keep testing. Optimization is a never-ending exercise. In fact, the best digital marketing companies have an entire team dedicated to tweaking, testing and taking action based on results. When an A/B test is inconclusive—the results aren’t significant enough to take any action—you can test another variation to see if it makes a difference. When you’ve increased the performance of a CTA by changing one element, try changing another. When you’ve optimized the heck out of the CTA, move on to the headline.
Apply these 10 A/B testing guidelines when you want to boost website traffic and get more leads. You’ll have real numbers to help you make decisions, and intel that you can use for future ads, landing pages, offers, emails and more. Oh, and in case you were wondering, you’re actually part of an A/B test right now. This same blog post is being shown to some website visitors with a different call-to-action. Which one will get more clicks? Only time will tell!