Drive and Convert (Ep. 108): The Five Factors of Digital Success

In this episode, we discuss why traditional benchmarking doesn’t work and introduce the 5-Factors Scorecard™, a better way to measure digital success.

About This Episode:

Anyone who guarantees they can increase your conversion rates with on-site optimization alone is either lucky or lying.

Many factors influence conversion rates. In this episode, we discuss why benchmarking doesn’t work and introduce a better way to measure optimization efforts.

After conducting a study of hundreds of optimization experts to uncover the key factors that influence the success of digital products, Jon and his team at The Good identified five competencies that set high-performance teams apart.

In fact, teams that excel in these Five Factors are 60% more likely to meet their annual performance targets and twice as likely to rank “excellent” in customer satisfaction.

Listen to the full episode to learn:

  • Why traditional benchmarking is ineffective.
  • Why the 5-Factors Scorecard™ is a better way to measure success.
  • How the 5-Factors Scorecard™ can help you decide where to invest to improve your digital experience and determine what steps you need to take to grow your impact in your organization.

To get your scorecard, go to https://thegood.com/5factors

If you have questions, ideas, or feedback to share, connect with us on LinkedIn. We’re Jon MacDonald and Ryan Garrow.

Episode Transcript:

Announcer:
You’re listening to Drive and Convert, a podcast about helping online brands to build a better e-commerce growth engine, with Jon MacDonald and Ryan Garrow.

Ryan:
All right, Jon, I have known you a long time. For sure long enough to know that the uncomfortable truth about digital optimization is that anyone who can guarantee they’ll increase your conversion rate with onsite optimization alone is either lucky or lying, but probably just lying. They have some case studies somewhere where they did something dumb, they just focused on that, and now they’re just lying to us.

Jon:
Fair enough.

Ryan:
But I know that there are just tons of factors that influence conversion rates and some of those, as unfortunate as I think this is sometimes, some of those are within the control of the conversion rate optimizer and some of those are outside the control of CRO. And essentially, onsite factors are only really half of the equation. Like we’ve recently talked about all the psychology that goes into these things, but beyond that, you’ve got to be able to do more and see more than just changing some things on site. You proposed this topic, so I assume that you would agree with me.

Jon:
I would definitely agree with you.

Ryan:
Okay. I love it when I can get Jon to agree with me early on, because that makes everything else easier. Obviously, we pull benchmarks around the internet and see what is good, what is bad, and this is what you need to be trending, this is what you think is good. But are all comparisons just bad when you’re looking at onsite stuff?

Jon:
Well, in part, yes. That’s the short answer. But this doesn’t mean you can’t look at another company’s conversion rate to measure your own success, I just think that the effort is really futile. Are all comparisons bad? No, definitely not. But you really do need to look at your own specific users and compare yourself on multiple metrics, not just the one metric that you are going to compare, and you need to do that in a much more nuanced fashion. Ideally, you’re doing it against a recipe for high performance. What has worked for others? So, what’s that recipe? Well, at The Good, we figured it out by asking hundreds of high performing digital optimizers. That’s why I wanted to talk about that today. I want to talk briefly about why benchmarking doesn’t work, and then introduce a better way to measure the efficacy of your optimization effort.

Ryan:
I love it. Because it’s relatively easy using Google to find some benchmark to say, “Everybody’s at this, we need to go beyond that.” I believe you kind of have to start there, right?

Jon:
For sure.

Ryan:
If you’ve got a website, you’re like, we need to improve it, what’s good? You’re like, I don’t know, let’s go to Google. What’s a good conversion rate for somebody selling T-shirts? It’s probably a lie, but at least we have a number to start with.

Jon:
Well, fair enough.

Ryan:
Okay, so why is that ineffective?

Jon:
Yeah. Because look, you don’t know what your goals should be, so you’re looking at your competitors. And I think that’s the first issue. You should really try to understand what your goals should be, not what your competitors’ are. And I feel like it’s akin to looking at the crowd because you aren’t sure how to behave. You’re looking around the room like, “What’s everyone else doing? Should I do this?” And then you’re at that fancy dinner table and you’re like, “I’m going to sit upright, elbows off the table.” Now I’m all stiff, and I’m just like, “What do I do next?” Right?

Ryan:
Yeah. Which fork do I use?

Jon:
Yeah, exactly. You feel the need to anchor yourself to something, and so that’s why there’s this human need for benchmarks. But unfortunately, comparing your conversion rate to other companies or even your industry as a whole is just practically meaningless. We’ve done a whole episode on this. It really comes down to the fact that your competitor data can be unreliable. It can be inaccurate. It’s often simply made up. I hate to say it, but it’s true, I see it all the time. Even with really niched-down industry data, it still contains way too much noise. There are just so many factors that go into that data: your products, your market conditions, your pricing strategies, channel mix. Maybe even your customer groups are all just too different to control against even what you might think is a true competitor.
So really, there are a lot of unfortunate things to think about here, but the bottom line is that benchmarks are just too simplistic to be used. And if you’re looking at only one metric, it’s even worse. You’re just making it even more simplistic. And let’s be honest, what trumps all of this for me is the fact that even if you match that competitor’s rate, it’s not like you are going to just stop optimizing. You’re always going to want that number to improve. So, does it really matter what anyone else is doing? I don’t think it does. I think as long as you can show you’re improving, that’s what’s going to matter.

Ryan:
No, I agree with that, because it’s also the plateaus. Like, great, you got your conversion rate to go up, that allows you to push harder in marketing and spend more and get a lower conversion rate. Which means you drop back down and then you optimize again, you get back up.

Jon:
Yep.

Ryan:
Yes. Okay. You never leave us hanging, so I bet you’re going to tell us there’s a better way to measure success. Just hypothesizing here. You can’t look at your competitors, what the heck do we do? Where do we start then?

Jon:
Yeah, indeed I am. All right, look, instead of sticking to really traditional website benchmarks, I think you should explore a better way to score your performance. And I think it all comes down to five different factors. We’ve coined it the Five Factors Scorecard at The Good. Because there’s five factors, we’re not that creative. It’s called Five Factors, let’s just go with it. What we did is we conducted a study of hundreds of optimization experts, and our goal was to uncover the key factors that influence the success of digital products. Now, that’s e-commerce, it’s SaaS, it’s anything that would be considered a digital product.
What we’ve done is, based on their responses, we looked at all of this, and what we found was there were five competencies that set the high performing teams apart. That could lead you to think, well, if you can score highly in these five areas, you would excel as well. And in fact, what we found was that teams that excel in these five factors are 60% more likely to meet their annual performance targets. So, you’re going to have a pretty good head start. And you’re twice as likely to rank excellent in customer satisfaction. Not only will you look better to your boss, your board, or whoever you’re reporting to, because you’re going to have a 60% better chance than the next person if you hit these five areas, but your customers are likely to be twice as happy.

Ryan:
That’s impressive. All right. Give us these five points, Jon, quit holding out on us. Let’s go.

Jon:
Maybe I’ll sell them a little more first.

Ryan:
Yeah, yeah. Let’s [inaudible 00:07:22] play.

Jon:
All right, here’s the five factors then. Data foundations is one: that’s the goals, the ownership, and having good data to really form a backbone of all of this. It all starts with data. I’ve said that 100 times. User-centered approach is the second one. This is really just a comprehensive roadmap. And really what you want to do here is take a user-centered approach through your entire digital journey. The third is resourcing. These are the resources you need to support adequate capabilities and pace. You’ve got to keep this moving, it’s not a one-and-done type of optimization.
Toolkit is the fourth. That is a variety of tools. You’re going to be looking at planning, measurement, what protocols you use, et cetera. And then the fifth is impact and buy-in. These are tools and practices that will increase your perceived value within your organization. And also, do you have the support from the leadership? Really, you can get in your own way if you don’t. The goal with all five of these is to give you a few questions you can answer; we score you on those, and then you understand where you need to invest to improve.

Ryan:
Yeah, so each one is scored. Are they scored one to 100, and then if you’re 50 on one and everything else is 75, you invest in the 50 first?

Jon:
Right. Exactly. Wherever the low scores are is where you want to improve. And we try to make it as simple as possible. Once you take your Five Factor Scorecard, we’re going to guide you along that journey of what you should be doing next, and give you some idea there.
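As a rough sketch, the equal-weight scoring Jon describes could look like the following. The factor names come from the episode, but the 0–100 scale and the example numbers are assumptions for illustration, not The Good’s actual scorecard:

```python
# Hypothetical sketch of an equal-weight five-factor scorecard.
# Each factor counts the same; the lowest score is where to invest next.
scores = {
    "data_foundations": 75,
    "user_centered_approach": 75,
    "resourcing": 50,
    "toolkit": 75,
    "impact_and_buy_in": 75,
}

def next_investment(scores):
    """Return the overall (unweighted) average and the weakest factor."""
    overall = sum(scores.values()) / len(scores)  # all five weighted equally
    weakest = min(scores, key=scores.get)         # lowest score = next priority
    return overall, weakest

overall, weakest = next_investment(scores)
print(f"overall: {overall:.0f}, invest in: {weakest}")
# overall: 70, invest in: resourcing
```

The point of the equal weighting is that no single strong factor can mask a weak one: the average can look healthy while the minimum still tells you exactly where to spend.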

Ryan:
And it’s not like any one of the five factors is more important than the others. They’re all, what you would say is equally important, 20% of the score, essentially.

Jon:
What we found was that the teams that excel, excel in all five of these. Now, they might excel in some other areas as well, or they may be deficient in one or two of these areas and not ranked as highly. But what we found was the high ranking brands, all of them had above average scores in all five of these areas in the survey.

Ryan:
Okay, great. All right, let’s go a little more into each one of these. You listed data foundations first, even though that may not be the most important. But tell us a little more about, what are the options within data foundations?

Jon:
Right. Yeah. None of these are more important than the others. I do think data usually comes first when I talk about things like this because it is the foundation. No optimization team is complete without access to a solid layer of data. Yet, there are so many organizations operating in less than ideal conditions when it comes to data. What you want to be doing here is building a healthy data flow that you can trust, and it should be your first step to building a strong optimization program. Crap in, crap out, to put it PG. So, you’ve really got to have that solid foundation to align your strategies with actual user behaviors and needs.
What you would want to do as a company and what the consumers want to do, how do you bring those two together? The data’s going to guide the way there. Or at least will tell you, hey, you’re not in alignment. And then you need to reassess if you want to get back into alignment or you need to find new customers. I think the key here is to ensure that your data is accurate and accessible to everybody, and use it to make informed decisions.

Ryan:
There’s a lot that can go into just the word data. And so, is Google Analytics working properly going to be sufficient for most companies, or are there things they need to be adding on to control and have better data?

Jon:
I would say that this is a tool in a toolbox, and GA is one of those tools. User testing, consumer insights and surveys, customer sentiment. Are you running a score to tell you how much people like your brand? What’s the one question everybody always asks? Well, would you refer our brand? And I think that’s fine, but again, that’s only one question in the toolbox.

Ryan:
Got it. Just seeing web traffic alone would not be considered CRO data. That is just a small piece of what your… Let’s call it a data stack would look like to be able to execute some CRO.

Jon:
Correct. Correct.

Ryan:
I would think reviews could be a piece of that. What are your reviews looking like and how are you monitoring that? I think NPS scores are valuable no matter what for anybody.

Jon:
Right. They have value, but not alone. If you just try to optimize for that one question, I promise you there’s ways to hack that to get that question up. But it matters in context with everything else.

Ryan:
Would you say at this stage, let’s say it’s smaller businesses that are just starting to get going with conversion rate optimization. Would you say that there tends to be one piece of the data stack that is missing most often when you talk to these brands?

Jon:
Yeah, the fact that they’re not talking to consumers. So many brands talk to consumers before they start their company because they hear how important that is. Validate the idea, validate the idea. But once they get the idea launched, they seemingly just like, “I know everything. I’m going to stop talking to consumers.” That’s really the first big item that we find is often missing at that stage. And it’s this weird inverse curve. Because what happens is everybody talks to customers before they launch, then they go into this trough of not talking to them. Then they talk to them again because they’re like, oh, we want to scale, we need to know feedback. And they realize they should have been talking to customers all along. Then they grow and then they become this enterprise upper market brand, and again, they just get so much politics and process and everything in the way that they forget to talk to the customer that often. And then it drops down again. So really, there’s those two low points that I think digital journey optimization could really come in and help solve.

Announcer:
You’re listening to Drive and Convert, a podcast focused on e-commerce growth. Your hosts are Jon MacDonald, founder of The Good, a conversion rate optimization agency that works with e-commerce brands to help convert more of their visitors into buyers. And Ryan Garrow of Logical Position, a digital marketing agency offering pay-per-click management, search engine optimization, and website design services to brands of all sizes. If you find this podcast helpful, please help us out by leaving a review on Apple Podcasts and sharing it with a friend or colleague. Thank you.

Ryan:
All right, great. And that takes us really into the second point you’ve got listed here, which is user-centered approach. Which, if you’re communicating with your users, makes it easier to focus and center on them. But what goes into that piece of the Five Factors?

Jon:
What we’re really talking about here is that the successful digital leaders with optimization programs, they’re placing the needs, preferences, the abilities of users at the forefront of their digital journey. So they’re really working to understand user behaviors and the goals behind those digital experiences to increase engagement, and often conversions. Really, the goal here is to prioritize users to create a product or service that’s intuitive for them, and also enjoyable for them to use. So if you put customers at the front, you can’t go wrong, in most cases.

Ryan:
Got it. Hopefully you’ve created a business to solve a user problem. But then you have to really say, okay, there’s that problem, but are there better ways, or are your customers or users seeing issues within some of the solution? Is user focus more along the lines of, I polled them and got some feedback? What’s the next step in that user-centered approach?

Jon:
Right. I think that’s it, I think it’s a matter of talking to consumers and consuming that data to help you understand what they want. Whether or not you do it is up to the brand. The consumer’s not always right, but I do think that often the consumer is going to be very clear about what they want, and what they feel is wrong. Meaning, I can’t figure out how to use this. Or, this is just a really frustrating experience. Or, I really wanted to book my flight and move on with the day. But yesterday I tried to book using a companion fare, and I was reminded why I let my companion fare expire every year, because it’s a pain in the butt.

Ryan:
You have to call and it’s just…

Jon:
Yeah.

Ryan:
Limitations.

Jon:
The airlines have tried to make it easier to use online, but the fact is, if I want to research multiple flights, I have to go through this 10 click process to be able to do it, and then I can finally look. And heaven forbid I make an error at any point in that process because I got to start over. Again, that’s not a very user-friendly approach. Their approach was, oh, we’ll make this easy for consumers to use the companion certificate online, so they don’t have to call in. I’m sure that was their hypothesis. And they probably haven’t done a whole lot of research on it because it’s probably, honestly, not a high revenue driver for them. In fact, it costs them money to use that.

Ryan:
Well, I see this often where brands lose the focus of focusing on the users when things start getting rocky and they’re like, “Hey, we got to get more stability in this.” And subscriptions seemed to be what a lot of brands like, “Oh, I don’t know if this came up in some conference that everybody went to, but it was like the solution to all of our problems is get everybody to subscribe to a monthly thing and that’s all we need to do. And ram it down their throats.” And so you get on some of these sites, I’m like, I can’t buy your product. It feels like I can’t buy your product unless I get a subscription. And I’ve never bought from you, so what if I don’t want to subscribe yet? Let me buy one time. How do I know I like that flavor, but you’re making me subscribe to it monthly? What if I want to switch? It doesn’t seem like it’s going to be easy.
And that’s where it becomes a business saying, “We need subscribe because that’s guaranteed revenue every month so that we can survive.” Rather than, “How do we get these consumers that have never bought from us in the past to make it easier to get it for the first time, then send them into a subscription?”

Jon:
Well, I think that all came from, over COVID investors got involved in e-commerce a lot. There were these companies buying up e-commerce brands, that were aggregating the brands thinking, oh, we’ll take care of all the back office and the warehouse. We’ll share fulfillment, all these things, and it’ll be wonderful. Unfortunately, when the numbers dwindled, they got high pressure to make changes. And a lot of those changes were in the name of revenue, not in the name of consumers. So that’s partly why that happened, I think.

Ryan:
Obviously, make sure you’re putting the customer first. You can’t ignore brand revenue and profit, because that’s what allows you to focus on the customers. But again, make sure that customer’s not ignored in that, for sure. But then it comes to resourcing. How do you decide where to push and pull resources from and make that work?

Jon:
I think most people when they think about optimization, they’re thinking about onsite experimentation. Can’t tell you the number of times I’ve said optimization and someone goes, “You mean AB testing, right?” That’s the only thing people think of. But in reality it’s a much, much, much bigger tent. Successful optimization programs require multiple people with diverse experiences. And I think that brands forget this. And you’re moving through these phases of optimization, but that requires research, data analytics, design, engineering. I don’t know, I could go on, but there’s a lot of disciplines that are involved here. So you have to have the right resources, and often that resourcing is going to come down to assembling or outsourcing perhaps a team with these varied skillsets to address different aspects of the optimization process. Then adequate support will allow you to move through that process smoothly, leading to faster iterations and better outcomes.

Ryan:
Got it. So just running an AB test of positioning something on a page, great, that’s one resource. But analyzing that data is a separate resource that you probably don’t pay attention to. I probably wouldn’t know how to look through the report on an AB test to decide, I don’t know, this looks good. So making sure that you have the brain power coming from somewhere to look at that and tell you what’s working, then implementing that.

Jon:
Unfortunately, a lot of folks rely on the tools to tell them. Almost every testing tool, even Google Optimize did this when that was a thing, would tell you, here’s the percentage confidence we have in this result. Unfortunately, people would see it above 50% and they would say, “Oh, we have a winner.” It’s more than 50% sure. Well, when it’s less than 99%, that can fluctuate dramatically.

Ryan:
51% sure is not necessarily something I’m betting on.

Jon:
No, because in an hour it could be 30% sure. It just all depends on traffic volume, how many people are seeing it, and the conversions that are happening. But I think this is where the next thing comes in, which is building a toolkit. You have to have three critical tools, and when we’re talking about tools, we’re not talking just software, but that is a component. I think we’re talking prioritization, research, and experimentation. And software can help out with all three of these. But really, what you’re looking for here is having a prioritization framework to keep you focused on the highest impact initiatives.
If you get off course there and you start just letting the HiPPO (the highest paid person’s opinion) in the room say, “Oh, I think we should test this, I think we should test that.” Or you get one piece of client feedback and say, “I think our customers want X, Y, or Z,” you’re going to get off course. That doesn’t mean you shouldn’t listen to your customers, you definitely should. And maybe that insight the HiPPO in the room had was great. But you really want to make use of evaluative research and offsite experimentation techniques to validate things first. So AB testing: great for testing the minutiae if you have enough traffic. Not good for huge changes, and it’s not good for low-traffic sites. You really have to be a going concern that is fairly large to make it worthwhile.
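Jon’s point about sub-99% confidence fluctuating can be sketched with a quick two-proportion z-test. This is an illustrative calculation, not any testing tool’s actual method, and the traffic numbers are made up:

```python
# Why a "51% sure" A/B result isn't a winner: the same relative lift
# gives very different confidence depending on sample size.
from math import erf, sqrt

def confidence(conv_a, visits_a, conv_b, visits_b):
    """One-sided confidence that variant B's rate exceeds variant A's."""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    # Pooled proportion and standard error for a two-proportion z-test
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    if se == 0:
        return 0.5
    z = (p_b - p_a) / se
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF of z

# Early in a test, a 15% relative lift on 1,000 visitors per arm
# reads as only ~68% confidence; a coin flip away from "a winner".
early = confidence(20, 1000, 23, 1000)
# The same lift on 100,000 visitors per arm is essentially certain.
late = confidence(2000, 100000, 2300, 100000)
print(f"early: {early:.0%}, late: {late:.0%}")
```

This is why low-traffic sites rarely get value from A/B testing small changes: the confidence number bounces around until the sample is large enough, exactly the fluctuation Jon describes.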

Ryan:
Got it. So toolkits, it’s a large group of stuff, I would say. It’s not just, oh, analytics and customer surveys. That covers a lot of humans in the organization to get all that stuff in there.

Jon:
Yeah, it definitely can. Which really takes you to the next one, which is impact and buy-in. You’re talking about humans, so it’s a great segue, because most optimization teams that create the biggest impact, we found, have strong buy-in from their leadership. And this really does two things for the company. It delivers sufficient budget, because if you have an ally who’s going to help you and believes in this, they’re going to give you budget and people, which are two resources that you need to be able to have an impact. And then you’re going to create a culture that values experimentation and incremental changes. If you have a leader coming in who says, “We need to double overnight,” you’re going to do some drastic things that are probably going to push consumers out the door. Because you’re going to be doing whatever is needed to get revenue, but that’s going to have a cost later on. So strong leadership buy-in creates a culture that values optimization, number one. And it ensures sufficient resources are allocated to those optimization efforts, number two. So buy-in is the key ingredient to driving this meaningful impact.

Ryan:
A lot of the organizations we work with, one of the key buy-ins that we see for a lot of this ends up being the CFO, at a good organization with a strong CFO. If that CFO believes in it, man, magic can happen when a CFO believes in testing, measuring, improving processes, and investing in those. Because I think too often the CFOs we’ve been involved with, all they are is cut, cut, cut, control, control, control. And it has to be bottom line profit this month or we’re in trouble. And there are periods of time where that makes sense for an organization. But if a CFO is like, “Hey, we’re looking five, six years down the road, we need to improve now so that works,” man, that can be so powerful. That vision doesn’t have to come from the CFO, but if the whole leadership team is really thinking about where we’re going to be five years from now, investing in CRO becomes a no-brainer.

Jon:
And I think you said it right in the sense that, as a brand, you have to be in the right spot to have that mindset. If you are just struggling to make ends meet, then there’s probably other things you should worry about first. But once you’ve gotten through that initial gauntlet and you’re scaling, and you’re either making a choice to scale unprofitably or scale at a really low margin and maybe break even, that’s fine if you choose to do that, and that’s what everybody decides is best for you. But again, you have to really have buy-in from that leadership, and a CFO will help you do that, for sure.

Ryan:
Okay. Well, at the beginning of this, before we got down these rabbit holes, you mentioned the Five Factor Scorecard, and that’s the tool that you’ve created to help give us a vision of where we’re at and where to invest. Right?

Jon:
Exactly. There’s five rabbit holes, as you mentioned, and we’ve gone down all of them today. And you really want to make sure that you measure yourself in each of these. The first step I think is just understanding where you stand in each of these areas. Where are you at today? And I want to stress that you’re not looking to score perfectly in all five of these. Nobody has. We ran this against hundreds of brands, not one person scored perfectly, and that’s okay. That’s actually good, because everybody always has somewhere to improve. I don’t know anyone who has a perfect company, even the top performers.

Ryan:
If you do, you’re probably not listening to us anyway.

Jon:
Yeah, fair enough. You already drive enough traffic and convert all of it. So hey, it must be okay, right? Okay, here’s what the Five Factor Scorecard does: it automatically measures you against the highest performing teams to expose what stands between you and digital excellence. You’re going to answer these questions, and it’s going to measure you against that baseline average we put together from surveying hundreds of brands. And then with that Five Factor Scorecard, you’re going to know where to invest, how to improve your digital experience, and what steps you need to take to grow your impact in your organization and your digital journey.

Ryan:
Got it. Essentially, you’re going to give us a benchmark that we can go by without having to go Google something and pretend that we know what the competitors are doing.

Jon:
Right. And this is where a benchmark without context across all five of these areas is not helpful. That’s what we said when we started: if you’re looking at just one metric, you’re missing all the context. And that’s why these five play together. And we’ve said it a couple times that none of these five individual factors is weighted heavier than the others. I think it’s really important just to have a good understanding of, where are we at, where do we need to improve? And have that context across the entire organization as opposed to just looking at one or two metrics and trying to compare yourself to an individual competitor.

Ryan:
Got it. So you’re not going to get, from this benchmark, a conversion rate that you want to shoot for. Which doesn’t matter anyway, that’s a bad metric.

Jon:
If I ever have that tool come out, please slap me and wake me up, because I should never be doing that. I’ve stood on that soapbox for 15 years, and fortunately I’ll be on that soapbox until I die, because I believe it. You should not be comparing yourself to your competitors.

Ryan:
Nope. The best conversion rate’s always improving. You’ve beat that into my head. I know this.

Jon:
I love that.

Ryan:
I say it in my sleep. Where do we go to get this scorecard and start getting some actual benchmarks we can use?

Jon:
Yeah, thanks for asking. Okay, so go to thegood.com/5factors, so thegood.com/5, and then factors, F-A-C-T-O-R-S. Also, I think if you just go to thegood.com, click the blue button in the top right-hand corner, we’ll take you there as well.

Ryan:
Excited. All right, Jon, we’ve got five factors we know we need to be investing in from a CRO standpoint, and they’re all weighted equally. We’ve got to get our data foundation in place. We’ve got to focus on the users, make them first. We’ve got to be able to pull the right resources into this plan and purpose for what we’re doing. You’ve got to have the right toolkit, you can’t do this without the right tools; the wrong tools make it difficult. And then you’ve got to get your leadership to buy in and agree on what the impact is going to be.

Jon:
Perfect.

Ryan:
Anything else we’ve left out?

Jon:
No, but I am going to clip that out of this video and I am going to post that on the landing page because you said it extremely well. Thank you.

Ryan:
Thank you, Jon.

Jon:
Ryan Garrow, the new poster boy for Five Factors. All right. Thank you, Ryan. Appreciate it.

Ryan:
Thank you.

Announcer:
Thanks for listening to Drive and Convert with Jon MacDonald and Ryan Garrow. To keep up to date with new episodes, you can subscribe at driveandconvert.com.

About the Author

Jon MacDonald

Jon MacDonald is founder and President of The Good, a digital experience optimization firm that has achieved results for some of the largest companies including Adobe, Nike, Xerox, Verizon, Intel and more. Jon regularly contributes to publications like Entrepreneur and Inc.