
What is Rapid Testing and How is it Different from A/B Testing?

We’re sharing the true meaning of data-backed decision-making and how rapid testing can help you get there.

A/B testing has long been hailed as the cornerstone of data-driven decision-making.

But in our 15 years specializing in conversion rate optimization, we have watched the sentiment around A/B testing shift from curiosity to exuberance to, at some organizations, outright frustration.

There is a changing attitude towards the once-celebrated practice, and many businesses are entering an era of testing fatigue.

[Image: model showing how A/B validation works]

The “test everything” mantra is sung far and wide by experimentation experts, but that approach simply won’t work for most companies.

As companies strive to optimize user experiences, they are encountering pitfalls inherent to A/B testing. Instead of continuing deeper into the trough of disillusionment, I want to share some alternatives that promise faster, cheaper, and more satisfying outcomes.

The State of Data-Driven

The popularity of search terms like "data-driven" is at an all-time high.

[Image: graph showing search trends for "data-driven"]

If your media consumption looks anything like mine, you’ve probably read countless articles about all the ways in which tech leaders leverage data to make informed decisions.

With the advent of analytics platforms, tracking pixels, and attribution modeling, it seems like everyone is an analyst. Or at least they play one in meetings.

As Stanford’s Ramesh Johari notes, “100 years ago, data was expensive and slow and practitioners were trained statisticians. Today, data is cheap and real-time and everyone is a practitioner.”

With the increased access to data, leaders are antsy to put that data to work. We regularly speak with leaders who want to transform their company’s relationship to their data: they run heat mapping tools on their site, have numerous analytics tools at their disposal, and their teams have dabbled in A/B testing. But many still struggle to make sense of the abundance of data and choose the right path forward. They struggle to move from data-informed to data-backed.

Data-Informed vs. Data-Backed

[Image: graph showing similarities and differences between data-informed and data-backed]

Though the language is similar, we make a crucial distinction between data-informed and data-backed.

Many organizations aspire to be data-backed, but they often fall short by primarily relying on generative research methods.

Generative methods are great for understanding what’s happening on a website and forming hypotheses about what would work better.

Generative research methods include things like:

  • Heatmap analysis
  • Surveys
  • Data analysis
  • Observational analysis
  • Open card sorting
  • Reviews theming
  • Social listening

Alternatively, evaluative research can be used to substantiate ideas with evidence. Evaluative research methods include things like:

  • A/B testing
  • First-click testing
  • Comparison testing
  • Tree testing

When we ask so-called "data-driven" companies what data they use, they typically list a number of generative research methods. Only about 25% of them list any evaluative research methods, and they rarely go beyond A/B testing.

[Image: infographic showing other types of validation]

Generative research, the methods on the left of the infographic, tells you what's happening on a website and helps you form hypotheses about what might work better.

But in order to move forward with confidence that your solutions will actually work, you need the methods on the right. Evaluative research is how we move from data-informed to data-backed.

A/B Testing is Evaluative, But It’s Not Perfect

Not every data-informed decision is a good one. Those with experience in the industry tend to estimate that only about 10-20% of changes to a website move the needle on its KPIs. That means up to 90% of the work your devs do could be wasted on initiatives that don’t really matter (to your bosses or your users).

That’s why so many practitioners who understand the value of A/B testing will tell you to “test everything.” It’s one of the only ways we know, with confidence, what changes are actually making a difference.

While A/B testing is incredible in its fullest application, the reality can be disillusioning: organizations that use it often run into obstacles that impede their success.

The Challenges of A/B testing

These barriers include capabilities, time-to-value, velocity, and authority.

  • Capabilities — We hear all the time from teams with a skilled marketer leading optimization whose ambitions are cut short because they lack access to the many disciplines adequate testing requires: data analysis, development, design, and more.
  • Time-to-value — Depending on your traffic levels and goals, tests can take longer to complete than teams have the appetite for (a rough duration estimate is sketched after this list).
  • Velocity — Whether due to the complexity of your analysis or other barriers, even large teams struggle with velocity, completing only a few tests per quarter (or even per year).
  • Authority — We often hear from teams that struggle because they were never given full authority to test on their website. That gap leads to halted initiatives, false starts, and wasted effort whenever organizational politics upsets momentum.
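
To make the time-to-value barrier concrete, here is a rough sketch of the standard sample-size math behind a two-variant test. The traffic and conversion numbers below are hypothetical, and the z-values assume the common 95% confidence and 80% power setup; treat it as a back-of-the-envelope estimate, not a prescription.

```python
# Back-of-the-envelope A/B test duration, using the standard
# two-proportion sample-size approximation. All inputs are hypothetical.

def ab_test_duration_days(baseline_rate, relative_lift, daily_visitors,
                          z_alpha=1.96, z_power=0.84):
    """Days of traffic needed at 95% confidence (z=1.96) and 80% power (z=0.84)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    # Sample size per variant for detecting p1 vs. p2:
    n = ((z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
          + z_power * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p2 - p1) ** 2)
    return 2 * n / daily_visitors  # two variants on a 50/50 split

# A 3% baseline conversion rate, a hoped-for 10% relative lift,
# and 2,000 test visitors per day:
print(f"~{ab_test_duration_days(0.03, 0.10, 2_000):.0f} days to significance")
# -> roughly 53 days, i.e. nearly two months for a single answer
```

Shrink the baseline rate or the detectable lift and the runtime balloons quickly, which is exactly the appetite problem described above.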

What’s more, these challenges are not reserved for the small or just-starting-out brands. Even companies who test at scale struggle to achieve confidence at speed.

[Image: sample A/B test]

The Alternative to A/B Testing

We love A/B testing. We are huge advocates for having it in your toolkit, and we consider it the gold standard for data-backed decision-making. But the reality is that A/B testing should be just one tool among many, and depending on your team's culture and capabilities, the "test everything" approach may be just as much a hindrance to growth as testing nothing at all.

That’s why we propose another path forward for data-backed decision-making. We want to arm you with alternatives to A/B testing that are faster, leaner, and perhaps ultimately more satisfying.

The approach that is right for you will vary based on your resources, traffic levels, and company culture, but let’s take a look at some alternatives to the “A/B test everything” approach.

Rapid Testing: Efficient Evaluative Research Methods For Data-Backed Companies

Rapid Prototyping

What can you do instead? The practice of making small, iterative bets through rapid testing and prototyping presents a pragmatic supplement to A/B testing. I’d go so far as to say it is a stand-out avenue for ushering in this new era of optimization.

Rapid prototyping, borrowed from the industrial design discipline, involves creating low-fidelity models to test and gain early feedback. This approach allows for quicker decision-making and validation through all kinds of user input.

It embraces the philosophy of “test early, test often,” enabling organizations to pivot swiftly based on user insights.

You can validate your prototypes with a variety of tests that can be grouped into two categories: task completion analysis and sentiment analysis.

Task Completion Analysis

Task completion analysis allows us to quickly test new ideas to understand time-on-task and success rates. First-click testing and tree testing fall into this category of rapid testing.
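
As a concrete illustration, scoring a task-completion test usually boils down to two numbers per task. A minimal sketch, using entirely invented participant records:

```python
# Minimal scoring of a task-completion test: success rate plus
# median time-on-task. The participant records below are invented.
from statistics import median

attempts = [
    {"completed": True,  "seconds": 12.4},
    {"completed": True,  "seconds": 8.9},
    {"completed": False, "seconds": 31.0},  # gave up / wrong destination
    {"completed": True,  "seconds": 15.2},
]

success_rate = sum(a["completed"] for a in attempts) / len(attempts)
median_time = median(a["seconds"] for a in attempts if a["completed"])

print(f"Success rate: {success_rate:.0%}")         # 75%
print(f"Median time-on-task: {median_time:.1f}s")  # 12.4s (successful runs only)
```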

First-Click Testing

What it is: Test what a user would click on first in order to complete their task.

What it’s great for: This is great for analyzing if your site delivers a clear path to purchase.

[Image: first-click testing example]
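
Because rapid tests run on small panels, a quick confidence interval keeps you honest about what a first-click result can support. Here is a sketch with made-up numbers; the Wilson interval is one reasonable choice for small samples, not the only one:

```python
# A hypothetical first-click tally with a Wilson confidence interval,
# to show how few participants a directional read needs.
from math import sqrt

def wilson_interval(successes, n, z=1.96):  # z=1.96 -> 95% confidence
    """Wilson score interval for a proportion; behaves well at small n."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    spread = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - spread, center + spread

# 18 of 25 participants clicked the intended element first:
low, high = wilson_interval(18, 25)
print(f"First-click success: 72% (95% CI {low:.0%}-{high:.0%})")  # ~52%-86%
```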

Tree Testing

What it is: Tree testing presents users with accordions of your site’s navigation (without design) and asks them to look for something specific.

What it’s great for: It helps you understand if users can find key items or resources in your navigation.

[Image: tree testing example]
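
Under the hood, a tree test is just your navigation labels as a hierarchy plus the paths participants take through it. A minimal sketch of scoring one task; all labels and sessions below are invented:

```python
# A tree test boils down to: did participants' paths through the bare
# navigation reach the right destination?

nav_tree = {
    "Shop": {"Men": {}, "Women": {}, "Sale": {}},
    "Support": {"Returns": {}, "Shipping": {}, "Contact": {}},
}

def path_exists(tree, path):
    """True if every step of the clicked path exists in the tree."""
    node = tree
    for label in path:
        if label not in node:
            return False
        node = node[label]
    return True

# Task: "Find where to start a return." Correct path: Support > Returns.
target = ["Support", "Returns"]
sessions = [["Support", "Returns"], ["Shop", "Sale"], ["Support", "Returns"]]

found = sum(path == target for path in sessions if path_exists(nav_tree, path))
print(f"{found}/{len(sessions)} participants found the target")  # 2/3
```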

Sentiment Analysis

Sentiment analysis lets us preview how users might respond and react to a treatment. Preference tests, 5-second tests, and design surveys fall into this category of rapid testing.

Preference Testing

What it is: Preference testing presents multiple designs or pieces of copy to testers and asks, “Which one of these do you prefer?” or “Which one of these do you think is more effective?”

What it’s great for: This is ideal when you have multiple options and want some quick feedback on what to implement.

[Image: preference testing example]
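
One caution with preference tests: small splits are often just noise. A quick exact binomial check, one possible approach with invented counts, shows whether a result is distinguishable from a coin flip:

```python
# Is "34 of 50 preferred version B" a real preference or a coin flip?
# Exact two-sided binomial test against a 50/50 chance split.
from math import comb

def two_sided_p(successes, n):
    """P-value for a split at least this lopsided under pure chance."""
    k = max(successes, n - successes)
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

print(f"p = {two_sided_p(34, 50):.3f}")  # well under 0.05: likely a genuine preference
```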

5-Second Testing

What it is: The 5-second test shows the user an image for five seconds, takes it down, and then asks them what they remember seeing.

What it’s great for: This is best for testing aspects of a page layout for first impressions or what stands out visually.

[Image: 5-second testing example]

Design Surveys

What it is: Design surveys collect qualitative feedback on wireframes or mockups.

What it’s great for: Validating your designs before investing in development to implement them on your site.

[Image: design survey example]

A great experimentation program will be custom to your organization’s resources, users, goals, and needs. But I can tell you with certainty that including rapid testing in your strategy will keep you lean and move you closer to data-backed.

It could be the secret to leveling up against your competition: Don’t A/B test everything. Instead, validate your ideas with rapid testing.

A/B Testing Is Just One Tool in Your Optimization Toolkit

The difficulties of A/B testing highlight a need for a multifaceted approach to optimization.

Once regarded as the ultimate solution, A/B testing faces challenges in practice due to organizational barriers.

As companies strive to improve their decision-making processes, alternatives such as rapid validation and prototyping, purpose-driven decision-making, and customer insight analysis emerge as efficient, viable supplements.

By embracing these strategies, organizations can overcome the pitfalls of testing fatigue and usher in a new era of data-driven efficiency. If you want to talk through strategies for your company’s unique situation, contact us.


About the Author

Natalie Thomas

Natalie Thomas is the Director of Digital Experience & UX Strategy at The Good. She works alongside ecommerce and product marketing leaders every day to produce sustainable, long-term growth strategies.