
How Kalah Arsenault’s Team Stood Up An A/B Testing Program & Doubled Volume With A New Prioritization Model
Optimization isn’t a one-size-fits-all practice. Each organization has unique data, needs, and goals, on top of the always-evolving technology stack that supports experimentation.
So, as a leader, it’s important to adapt. Kalah Arsenault knows this well.
Over the course of her career, she’s been tasked with everything from turning data into actionable insights and advocating for data-driven analysis to building experimentation programs.
Currently, she leads the marketing optimization team at Autodesk, the global leader in 3D design, engineering, and entertainment software.
We had the chance to sit down with her and get the inside scoop on:
- Standing up an A/B testing program
- Making an impact with a simple prioritization model
- Measuring and circulating optimization learnings
Marketing optimization for a leading software company
As the marketing optimization team lead, Kalah digs into all the nooks and crannies of the company’s marketing efforts to make them more effective and efficient.
The marketing optimization team at Autodesk sits on the operations team at the intersection between marketing operations and technology. Partnering with marketing teams to improve campaign effectiveness, Kalah and the marketing optimization team bridge the gap between data, marketing know-how, and testing expertise.
When shakeups a few years ago halted all A/B testing on the Autodesk website, Kalah was eager to partner with the website team to re-enable experimentation. A self-proclaimed marketing, analytics & optimization enthusiast, Kalah brings a consistent data-backed ethos to her work. And her background teed her up for success. Kalah jump-started her professional life in advertising and ecommerce. The experience working in stakeholder-facing roles gave her a unique ability to turn data into stories and prove the value of iterating your way to success.
Standing up an A/B testing program
The challenge was clear. Without an experimentation program in place, the team was left without the data needed to fuel good decision-making.
“The data will tell you what is the right choice and it takes decision-making out of the process,” she said when asked how data plays a role in her decision-making. In her view, data doesn’t just inform the process; it is the process. “Experimentation and data can be the decision-making process.”
So, it was crucial to get the A/B testing program back on its feet in order to bring that clarity to the work she was doing day-to-day.
To start, Kalah and her team put their experience into practice, creating an A/B testing roadmap. This was a crucial step, requiring them to define goals, align with stakeholders, and assess the priorities and risks of optimization. A new organizational structure added another obstacle on top of the complexity of rebuilding the A/B testing program: working across different marketing teams.
The optimization and web teams worked together to establish clear parameters, agreements, and definitions of what could or could not be tested. There is now a huge, pre-approved sandbox to play in, allowing optimizers the chance to find iterations that improve UX and marketing KPIs.
Whether you’re a researcher, an analyst, a marketer, or an optimization specialist, a well-made roadmap connects you with the clear steps needed to begin experimenting.
For Kalah, this meant:
- Identifying objectives for the testing program
- Establishing marketing and website challenges
- Isolating testing opportunities
- Formulating testing hypotheses
- Prioritizing testing opportunities
With frameworks in place, they were ready to get back to work.
While other optimization leaders can follow a similar strategy of aligning with stakeholders and building a roadmap, standing up an A/B testing program is no small feat. So, if you don’t have the resources or a dedicated team like Autodesk, she has some advice.
“What I primarily suggest is hiring someone who specializes in the practice. I think the expertise to identify optimization opportunities, design the tests, see it through implementation, measure the results, and provide recommendations and next steps is incredibly impactful.”
And while there are some savvy marketers that can do this, she emphasizes that “it's a separate skill set and expertise.” So whether you hire for that as a full-time role or you look to agencies to bring that expertise, Kalah strongly recommends companies consider experts to lead the charge.
For example, “at a high level, a test may show one version outperforming another,” she says. “But digging deeper often reveals different results by segment, whether by job profile, country, or industry. We aim to look beyond primary KPIs to fully understand what’s driving the outcome.” That level of nuance is hard to find in a busy marketer, so it’s best to have dedicated optimizers around who can take the time to know and understand audiences.
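The segment-level digging Kalah describes can be illustrated with a small sketch. The data below is entirely made up for the example, but it shows the pattern she warns about: one variant can win overall while losing a key segment.

```python
# Hypothetical test results illustrating why segment-level analysis
# matters. Data is invented for the example: variant B wins overall,
# but variant A actually wins the SMB segment.
results = [
    # (variant, segment, visitors, conversions)
    ("A", "enterprise", 1000, 50),
    ("A", "smb", 4000, 120),
    ("B", "enterprise", 1000, 70),
    ("B", "smb", 4000, 110),
]

def rate(rows):
    """Conversion rate across a set of result rows."""
    visitors = sum(r[2] for r in rows)
    conversions = sum(r[3] for r in rows)
    return conversions / visitors

# Overall rates hide the segment split.
overall = {v: rate([r for r in results if r[0] == v]) for v in ("A", "B")}

# Broken out by segment, the "loser" wins SMB.
by_segment = {(r[0], r[1]): r[3] / r[2] for r in results}
```

Here `overall` shows B ahead (3.6% vs. 3.4%), while `by_segment` shows A converting SMB visitors better (3.0% vs. 2.75%), the kind of nuance a dedicated optimizer is positioned to catch.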
Enjoying this article?
Subscribe to our newsletter, Good Question, to get insights like this sent straight to your inbox every week.
A prioritization model to drive velocity
With the A/B testing program back up and running, Kalah and her team had their plates full finding efficiencies and improvements across Autodesk’s marketing efforts.
Not only were there opportunities identified in their research, but teams across the organization were submitting requests and ideas for their consideration.
The list was long. The optimization team wasn’t sure what the “most important first thing” to work on was, and marketing stakeholders didn’t understand why their projects weren’t top of the list for testing. There was an opportunity to clarify and get more done quickly.
The solution? A prioritization model aimed at:
- Increasing testing volume
- Aligning teams
- Saving time
While lots of testing folks would hear “prioritization model” and go straight to the mathematical elements, Kalah needed a model that was simple, easy to calculate, and transparent for all parties.
Kalah and her team built out an auto-calculated prioritization model as part of their optimization requests intake process. It involves three elements:
- Business impact: Measured by whether the request aligns with the marketing plan, which is agreed upon by everyone from the CMO to entry-level marketing team members.
- Level of effort: Internal criteria that identify a higher or lower level of effort.
- Urgency: Assessed with questions like: Does it need to be executed immediately? Does it impact a larger project immediately? Does this effort have backing from a senior leader in marketing?
The intake process asks questions tied to these criteria, and logic set up in Asana then auto-calculates the priority of the experiment or optimization. “This is what saves us time and energy,” she says. It eliminates looping conversations and the time spent manually prioritizing requests within the team.
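The auto-calculated model described above can be sketched in a few lines. The specific scales, weights, and thresholds here are illustrative assumptions, not Autodesk’s actual Asana logic, which lives in their intake form rules.

```python
# Illustrative sketch of an auto-calculated prioritization score.
# The 1-3 scales and the thresholds are assumptions for the example;
# the real model is implemented as logic in an Asana intake process.

def prioritize(business_impact: int, effort: int, urgency: int) -> str:
    """Score a request on three 1-3 scales and bucket the result.

    business_impact: 3 = aligns directly with the marketing plan
    effort:          3 = low effort (higher score = easier to run)
    urgency:         3 = immediate need or senior-leader backing
    """
    score = business_impact + effort + urgency  # ranges from 3 to 9
    if score >= 8:
        return "high"
    if score >= 5:
        return "medium"
    return "low"

# Hypothetical intake requests scored automatically on submission.
requests = [
    ("Homepage hero test", 3, 2, 3),
    ("Footer link reorder", 1, 3, 1),
]
ranked = {name: prioritize(i, e, u) for name, i, e, u in requests}
```

The point of keeping the math this simple is the one Kalah makes: the model is transparent to every stakeholder and costs nothing to run, so the team’s time goes to launching and analyzing tests rather than debating the queue.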
Kalah emphasizes the power of this setup. “We don't do mathematical calculations to assess the level of business impact or length of time to reach statistical significance. That's too resource-intensive, and we'd be spending all our time assessing and prioritizing. With our automated prioritization model, we can spend our time on launching and analyzing tests and making business impact.”
And it worked.
“We were able to double the amount of tests our team took on within one year. Compared to last year, we doubled the volume of testing with a new operating and prioritization model.”
Measuring marketing optimization success
The volume of tests is just one of the key metrics Kalah identified for measuring her team’s success.
It can be tough to find just one KPI to prove the value of optimization, given the nature of working across teams, products, and audiences. So, instead of focusing on 1:1 measurement, they look at a variety of metrics, including:
- Volume of tests
- Volume of analyses
- Customer satisfaction score
- How many marketers see and learn from the findings
In the end, her team’s goal is to look at how marketing campaigns are performing and then give advice on how to make them better. So, while each test or optimization has its own KPI related to growth, as a team they are measured more holistically.
With prioritization and test volume locked down, she is ready to move the needle on insights shared.
“I'd really love to put more energy towards amplifying the impact of each test and getting the findings out to as many marketing teams as possible. We've already seen this working through a newsletter that's sharing our testing results and analysis work. We've also been hosting quarterly brown bag style meetings with the most universally applicable test results that marketers could implement themselves.”
This year, Kalah is also hoping to find new ways to turn insights into action. “I am also hoping to dive into data visualizations and figuring out how to make our findings more snackable and basically getting to a place where people want to read them and it's easy and enjoyable.”
These goals directly align with her team’s measurements for success. Other optimization leaders can take a page from Kalah’s playbook here by letting individual tests focus on marketing metrics while measuring departmental success through insights, experiments, or other relevant measurements.
How can you replicate some of Kalah’s success?
Kalah’s advice to those new to optimization is simple yet impactful: start small, and stay curious. “Get to know your data, experiment with tools, and don’t be afraid to make tweaks,” she says. “You might be surprised at the impact even small changes can have.”
Her overarching message is one of optimism and opportunity. “Optimization is about evolving and improving—for your customers, your organization, and yourself,” she concludes.
Yet, good optimization leaders know that you can’t do it all alone. Internally, Kalah’s team employs a mix of full-time employees, contractors, and agency partners to meet the demands of scaling optimization efforts. “Contractors and agencies can help manage peaks in the workload,” she notes.
“I come from an agency background. I've always been a fan of working with full-time employees, but I realized as we're trying to scale and grow the amount of impact we're making as a team, it's really important to have contractors or agency partners to support higher demand and the peaks and valleys of work.”
By embracing a data-driven mindset, prioritizing strategically, and fostering cross-team collaboration, Kalah exemplifies what it means to lead impactful optimization efforts. If you need an expert partner to help manage a robust roadmap, get to know our Digital Experience Optimization Program™.

About the Author
Caroline Appert
Caroline Appert is the Director of Marketing at The Good. She has proven success in crafting marketing strategies and executing revenue-boosting campaigns for companies in a diverse set of industries.