
How A/B Testing Tools Work (A Guide for Marketers)

The ultimate goal of an A/B test is to build on the learnings from previous experiments and use those insights to improve the pages being tested. Here’s where to begin.

A/B testing tools can seem daunting, even though the concept is simple.

In an A/B test, you show visitors two versions of a web page (or two variations of a single element on that page). Half see one variant, half see the other.

You determine the effectiveness of each by observing visitor actions. The variant that most often prompts visitors to take the desired action (buy something, sign up for something, click a link, etc.) is the winner.

That process — testing one variation of a web page against another — can continue indefinitely. There’s always room for improvement and new items to test. A/B testing is a direct method of field-testing your ideas.

But how do you set up A/B testing on your ecommerce site? Do you need to subscribe to a service, or can you do it yourself? How complicated is the backend process?

That’s exactly the kind of information we’re going to cover here.

How do A/B testing tools work?

At the heart of every A/B testing tool is a snippet of code that decides which variation of a particular web page each visitor will see. In its simplest form, half see version A and half see version B.

The code can run either server-side or client-side, but most testing is done client-side via JavaScript, using a tool like VWO, Optimizely, or one of the dozens of other testing tools on the market today.
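
To make that concrete, here’s a minimal sketch of the bucketing logic such a client-side snippet performs. It isn’t any vendor’s actual code; the cookie name and function name are purely illustrative.

```javascript
// Minimal sketch of client-side A/B bucketing (illustrative, not any vendor's code).
// A real tool loads logic like this from a <script> tag placed in the page <head>.

function getVariant(experimentId) {
  // Reuse a prior assignment so a returning visitor sees a consistent variant.
  var match = document.cookie.match(new RegExp('ab_' + experimentId + '=([AB])'));
  if (match) return match[1];

  // First visit: flip a coin for a 50/50 split, then persist the choice.
  var variant = Math.random() < 0.5 ? 'A' : 'B';
  document.cookie = 'ab_' + experimentId + '=' + variant +
    '; path=/; max-age=' + 60 * 60 * 24 * 30; // remember for 30 days
  return variant;
}
```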

Both approaches produce the same result: some visitors see version A, the others see version B. Here’s what the process looks like:

  1. You add a snippet of JavaScript code to your website header (much like the Google Analytics tag) for the testing platform you’re using.
  2. You specify the URL you’ll be running the A/B test on. This tells the tool where to cookie the visitors being opted into your experiment.
  3. Once you’ve entered the URL, you build out the experiment variation. This is where you add, remove, or move elements from your original design (the control).
  4. After you’ve created the variant, you specify the goals you want to measure. Which metrics will tell you whether the new variant was a success?
  5. When you launch the campaign, the tool starts diverting a share of traffic to variant B (or to additional variants if you’re running an A/B/n or multivariate test).
  6. Depending on the variant a visitor is opted into, the testing tool’s JavaScript alters the page code before it renders in the visitor’s browser (a simplified sketch follows this list).
  7. The testing tool’s cookie tracks which variant each visitor was opted into and whether they take the specific actions counted toward the test goal.
  8. From this point on, you wait for the test to gather enough results to reach statistical significance and declare a winning variant.
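
For illustration, here’s a simplified sketch of steps 6 and 7: altering the page for visitors bucketed into variant B, then recording a goal when the tracked action occurs. The selector, experiment name, and tracking endpoint are all hypothetical; real tools handle this through their visual editors and their own tracking APIs.

```javascript
// Simplified sketch of steps 6 and 7 (hypothetical selector, experiment name, endpoint).
var variant = getVariant('homepage-cta'); // from the bucketing sketch above

// Step 6: apply the variant's changes as soon as the DOM is available.
// (Real tools run synchronously from the <head> to minimize visible "flicker".)
document.addEventListener('DOMContentLoaded', function () {
  if (variant === 'B') {
    var cta = document.querySelector('#cta-button');
    if (cta) cta.textContent = 'Request Free Sample';
  }
});

// Step 7: count a conversion against the test goal, tagged with the variant.
document.addEventListener('click', function (event) {
  if (event.target instanceof Element && event.target.closest('#cta-button')) {
    navigator.sendBeacon('/track', JSON.stringify({
      experiment: 'homepage-cta',
      variant: variant,
      goal: 'cta_click'
    }));
  }
});
```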

The ultimate goal behind these A/B tests is to run multiple subsequent tests built on the learnings from the previous experiment and use those insights to make incremental improvements in the pages being tested.

Whereas user testing measures how easily a user can achieve their own goals (find a product, select a color, check out and arrange shipping, etc.), A/B testing focuses on getting visitors to take actions you want them to take.

That’s a significant difference.

They are both useful for conversion rate optimization, but it’s important not to mistake one for the other. The effectiveness of your testing hinges on choosing the right method for the question you’re trying to answer.


Why is A/B testing important for conversion rate optimization?

It’s tempting for ecommerce managers to rely solely on personal experience and input from the marketing team when considering web page design choices. You call a meeting, consider the options, then choose the winner by either group consensus or by making a gut-level management decision.

That’s fine for getting started, but the real question is: will it work with real customers? What happens when the web page goes live? Will visitors to your site understand what to do next, and will they do it?

That’s where A/B testing comes in. You let the users show you which variation is more effective. They don’t tell you, they show you with their actions.

Here are three primary reasons why A/B testing tools are vital:

  1. A/B testing tools bring objectivity to your marketing decisions. What you think will be best may not be the best. A/B testing will uncover the truth.
  2. A/B testing tools allow you to make one small change after another, seeking better results with each variation. Before long, you may look back and find that those little incremental wins have added up to a big one.
  3. A/B testing tools make it possible for you to perform an ecommerce website redesign one small chunk at a time. This reduces (maybe even eliminates) the chance you’ll end up with something your users don’t like.

A/B split testing provides data directly from your visitors. Unlike surveys, where you might ask prospects about their preferences, A/B testing tools make it possible for you to see what they do when given two options.

Does one illustration work better than another?

Could your navigation be organized better?

Should that call-to-action button be big or small, red or blue, round or rectangular?

Do your visitors buy more when given one product description style over another?

A/B testing will tell you. A/B testing is for those who prefer objective data over theory, and sales over discussions about sales.


TreeRing increases revenue by 53 percent: a case study

To see the A/B testing process in action, let’s use a real-life example:

TreeRing is in the school yearbook business. The company’s unique social-first approach to capturing school memories gets TreeRing plenty of notice from students, parents, and school staff members. Despite all the attention, though, TreeRing’s conversion rate was not as high as they would have liked it to be. Consequently, revenue goals weren’t being met.

After trying on their own to turn sales around, TreeRing management called The Good for help.

Using a conversion audit to gain insight into where stuck points and bottlenecks existed along the path to sales, the conversion optimization team from The Good identified areas of opportunity, developed a detailed testing roadmap, and launched a robust regimen of A/B testing aimed at converting more of the existing visitors into customers.

In one A/B test series, for instance, the team experimented with the location of the “Request Free Sample” link. Those tests alone resulted in a 42 percent increase in visits to the Free Sample page and boosted conversions by 12 percent.

Following the initial round of A/B tests, just three months after beginning the process, year-over-year revenue for the quarter was up by 53 percent!

Here are the observations made and the steps taken to ignite TreeRing’s revenue:

  • The TreeRing ecommerce website was drawing plenty of traffic, but not enough sales
  • Management asked The Good to perform a conversion audit on the site
  • Once stuck points were identified and areas of significant growth opportunity mapped out, we added a small snippet of JavaScript to the backend code (to enable A/B testing)
  • The conversion optimization team from The Good designed two versions of the landing page judged to be most detrimental to sales
  • A/B testing served the “A version” to half the visitors, and the “B version” to the other half
  • After statistical significance was reached, indicating enough visitors had been evaluated to make the test results definitive, the winning variation became the “A version” for the next test (a brief sketch of such a significance check follows this list)
  • The team kept testing at each point until the gaps between variants tightened and clear winners were no longer so readily identified
  • Given the rather spectacular gains realized in the first round of experiments, TreeRing management decided to continue on the path of UX testing and conversion rate optimization work (including regular reliance on A/B testing tools)
  • Return on investment continues to grow and revenue goals are being met
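
As an aside, “statistical significance” here simply means the observed difference between variants is unlikely to be due to chance. One common check is a two-proportion z-test; the sketch below uses made-up numbers, not TreeRing’s actual data.

```javascript
// Two-proportion z-test for an A/B result (illustrative numbers, not TreeRing data).
function zScore(convA, visitorsA, convB, visitorsB) {
  var pA = convA / visitorsA;
  var pB = convB / visitorsB;
  var pooled = (convA + convB) / (visitorsA + visitorsB);
  var se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se; // |z| > 1.96 is roughly significant at the 95% level
}

// Example: 200 conversions from 5,000 visitors on A vs. 260 from 5,000 on B.
console.log(zScore(200, 5000, 260, 5000).toFixed(2)); // ≈ 2.86, so B's lift is significant
```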

You can read the entire case study here: TreeRing Conversion Growth.

But is the TreeRing case a fair representation of the gains an ecommerce website can expect to receive from a rigorous conversion optimization program utilizing A/B testing tools?

In our years of hands-on CRO work, we’ve seen results less dramatic than those TreeRing realized, and we’ve seen companies pull even greater gains. Much depends on the starting point and how much room for improvement exists at the outset.

One thing is constant: any company can benefit from the right application of an A/B testing protocol.


Where should you go from here?

A/B split testing isn’t complicated, but it does take some technical knowledge. You can have your backend developer install the JavaScript snippet, or you can subscribe to a service that will do it for you. The process need not be expensive.

Most crucial is that you identify areas where your path to sales is losing prospects, that you develop theories about how to free up those stuck points to keep your visitors moving along the road you’ve set out for them, and that you test those theories to prove or disprove them.

Use your best judgment, but don’t rely on it—test it.

Would it be helpful to have an experienced hand guide you?

To speak with a CRO team member from The Good, all you need to do is contact us. We’re always happy to take a look at your company’s situation and point you in the right direction.

To find out more about A/B testing tools in general, check these resources:


About the Author

Rudy Klobas

Rudy Klobas is a former Content Marketer at The Good, where he produced insightful, informative content and copywriting designed to help digital leaders improve the user experience.