Webinar

How to Balance Innovation and Optimization in your CRO Program

Duration - 45 minutes
Speaker

Sam Baker
Owner, Sam Baker Consulting

Key Takeaways

  • Constantly test your site traffic: This will help you gain insights from every user who visits your site, ensuring no potential data is lost.
  • Build a results repository: This can be referenced when new ideas are needed, providing data to support or refute the potential innovation.
  • Gain at least one customer insight from each test: This will help you understand your customers better and ensure you're not just viewing tests as wins or losses.
  • Develop a strategy for innovation: This includes aligning with business goals, creating a testing roadmap, analyzing and documenting results, and iterating on learnings. This will help you be prepared for any push for innovation and avoid panic-driven decisions.
  • Create a culture of constant innovation: By developing insight for every test and iterating on those test ideas, you can foster a culture of innovation that is ongoing and not driven by panic or sudden demands.

Summary of the session

The webinar, hosted by Romesh from VWO, featured Sam Baker, a CRO expert, discussing the challenges and strategies of balancing innovation and optimization in CRO programs. Sam started with a fictional story to illustrate the common conflict between innovators and CRO teams, highlighting the difficulties of testing multivariable, innovative ideas. She emphasized the importance of a consistent testing roadmap, conducting upfront user research, and breaking innovations into several tests.

Sam also addressed the tension between UX and CRO teams, and how to work together for innovation. She concluded with practical strategies to ensure innovative ideas have a sound analytics strategy, without bottlenecking the business.

Webinar Video

Webinar Deck

Top questions asked by the audience

  • My website isn't getting a lot of traffic. How can I use your third way, breaking innovation into several tests, without running long tests?

    Yeah. That's a great question and one that comes up quite often with the clients that I work with. I think the key here is to combine your ideas into bigger ones. So maybe you need to combine, for example, the Afterpay and the sticky add-to-cart button so that there's a more obvious change, and then test the other two later. The bigger the change, the less time you'll need to see some sort of result. And then, of course, remember, this isn't the end of the testing strategy. You are then running the bigger product detail page test after that. So these mini-tests, as we'll call them, are meant to help you determine whether or not that final test will be successful.
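As a rough sketch of the intuition behind this answer: under the standard two-proportion z-test approximation, required test duration shrinks roughly with the square of the expected lift, which is why combining changes into one bigger, more obvious variation shortens the test dramatically. This is a generic illustration, not part of the webinar; all numbers (2,000 daily visitors, a 3% baseline conversion rate, 5% vs. 15% relative lifts) are hypothetical.

```python
import math
from statistics import NormalDist

def days_needed(daily_visitors, baseline, relative_lift,
                confidence=0.95, power=0.80):
    """Rough test duration for a 50/50 A/B split, two-sided z-test."""
    p1, p2 = baseline, baseline * (1 + relative_lift)
    z_conf = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    z_power = NormalDist().inv_cdf(power)
    # Visitors needed per variant (normal approximation).
    n = ((z_conf + z_power) ** 2
         * (p1 * (1 - p1) + p2 * (1 - p2))
         / (p2 - p1) ** 2)
    # Both variants share the daily traffic.
    return math.ceil(2 * n / daily_visitors)

one_small_change = days_needed(2000, 0.03, 0.05)  # subtle tweak on its own
combined_change = days_needed(2000, 0.03, 0.15)   # several tweaks combined
```

With these made-up inputs, the subtle change would take months to reach significance while the combined change fits in a few weeks, which is the trade-off the answer describes. Real testing platforms use their own statistics engines, so treat this only as intuition.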

Transcription

Disclaimer- Please be aware that the content below is computer-generated, so kindly disregard any potential errors or shortcomings.

Romesh from VWO: Hi. I’m Romesh from VWO, and I work as a senior manager here. And I look after the US marketing. VWO regularly organizes webinars on various topics which can help marketers, product managers, and people from different backgrounds keep pace with the evolving marketing world. Today, we are here with another interesting session on how to balance innovation and optimization in your CRO program.

We have Sam Baker, who’s the owner of Sam Baker Consulting. She is here to share her experience and insights on this topic. Sam has been in the CRO space for over seven years and has helped multiple companies deal with conversion-related issues. And this is one particular topic that she finds is common across the board. With this, I’ll hand over the session to Sam. Welcome Sam and take it away.

 

Sam Baker:

Hi, everyone. Thank you so much for being here. I’m very passionate about how to balance innovation and optimization in your CRO program. Before we get started, just a few housekeeping reminders.

You’re all on mute. If you haven’t noticed yet, you’ll be placed on mute throughout the entire webinar. If you have any technical questions, or if something isn’t working right for you, please respond in the chat and someone from the VWO team will help you out. If you have any questions, please send them via the chat as well. And at the end, we’ll have a Q&A session where I’ll answer all those questions.

And then if we have too many questions I will respond via email to anything that I haven’t been able to address at the end of this webinar. We do wanna make sure that we’re keeping it under an hour so that everyone can get back to their day and hopefully walk away with some great insights. So I am Sam Baker. I’m the owner of Sam Baker Consulting, and I’ve been in the CRO space for about seven years now and the e-commerce space for the past ten. However, this webinar is not focused just on e-commerce.

I do have a breadth of experience with other industries as well. I’ve worked with Under Armour, Abercrombie and Fitch, IHG, and Fidelity to help them grow their CRO and A/B testing programs. And my goal is really to help companies build their revenue through conversion rate optimization. So at the end of this webinar, I want you to be able to strike a balance between innovation and optimization.

Know why having a consistent testing roadmap will help you get to that innovation and CRO balance, how upfront user research can support that innovative process, and what questions you should ask when you’re testing new innovative ideas to make sure you’re getting the most from the analysis. So why are we talking about innovation anyway? And this is a question I’ve been asking myself quite a bit in the last year.

And the reason for that is you know, we’ve seen so much change in every single industry in the past couple of years. And companies are being forced to innovate more than they ever have in the process. But I think that’s posed some problems for the CRO space. So I’ve been asking myself quite a bit how we balance that innovation and conversion rate optimization so that they can work together instead of against one another. Innovation is among the top five priorities for 96% of companies. However, 57% of companies say that they struggle to foster an internal culture of experimentation and innovation, and that’s why this topic is so important to me.

So I wanna start with a fictional story about innovation, but I am curious if anybody on the call has had any experience with something like this. There’s somebody in your organization who has a great innovative idea. So they start with this website page, and they’re like, I know how to make that better. So they change the main elements on the page and then they change the functionality of the sidebar. They completely get rid of the search bar, and then they start sending people to different parts of the website from that page that were never accessed before.

So now we have the original version and the new version, and I think we all know what happens now. An innovator comes to the CRO team and says, this is what I wanna do. How do we test it? And this is pretty much what the CRO team looks like. If you’ve ever been there, you can probably relate. That’s because there are three things you learn early as a conversion rate optimizer: that every experiment must only test a single variable,

that we must have enough site traffic to see significant changes within a reasonable amount of time, and that setting up a test must be technically possible. Three simple things. However, this is the original and this is the new. Now you’re faced with either helping that innovator test his idea or saying no. And I think there’s a big problem with the test-everything mentality here: innovative ideas tend to be multivariable, difficult to A/B test, and often lack a true control experience.
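The second rule, having enough traffic to reach significance in a reasonable time, can be made concrete with the standard two-proportion sample-size approximation. This sketch is not from the talk; the 3% baseline conversion rate and the lift figures are hypothetical, chosen only to show the scale involved.

```python
import math
from statistics import NormalDist

def sample_size_per_variant(baseline, relative_lift, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant for a two-sided
    two-proportion z-test (normal approximation)."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_beta = NormalDist().inv_cdf(power)           # desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# A subtle 10% relative lift on a 3% baseline needs tens of thousands of
# visitors per variant; a bold 30% lift needs an order of magnitude fewer.
subtle = sample_size_per_variant(0.03, 0.10)
bold = sample_size_per_variant(0.03, 0.30)
```

Testing tools such as VWO run their own statistics engines, so this is intuition-building only, but it shows why low-traffic sites struggle to detect small single-variable changes.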

So as optimizers, are we just saying, no, we aren’t testing this? Or is there a way to use our effort to inspire innovation and ultimately analyze those innovative ideas in a data-driven way? I just wanna reiterate that trying to find meaning from a multivariable, low-traffic, control-less test is extremely frustrating. That’s why I think we need to make sure that we’re expanding CRO so that we’re able to help our partners be innovative and set the parameters around those test ideas. This is why the UX and CRO teams sometimes find themselves at odds, because the CRO team is saying,

“Hey. We need to be diligent. We need to use statistics-driven principles to analyze these ideas.” And the UX teams are saying, “Hey, you’re boxing me into a corner. I wanna be innovative. I wanna help push this brand further forward in an innovative way; that is what this company wants to do. How do we work together?” The problem is that 82% of organizations run innovation in the same way they would go about achieving any incremental performance gain. I think this statistic was extremely telling for me, but not surprising. And the reason for everything that I just highlighted here with this fictional story is that the problems really arise because we’re trying to run our innovation tests in the same way we would run any optimization.

So how do we make sure that this guy’s innovative idea has a sound analytics strategy to support it? And I’m going to give you three ways that you can make sure that you as a conversion rate optimizer are also able to test innovative ideas and move the business forward without bottlenecking the business. 

And the first step here is committing to a testing road map. 

Innovation should start with insights from the CRO program. And testing allows us to get to know our customers on a deeper level through data.

But we need to have a consistent testing roadmap to do that. With consistent testing, we are building up our insights, so by the time we’re talking about innovating, it comes naturally to us because we have this backlog of insights that can help drive those ideas. So how do you do this? The first step is to test. We should be testing as much as possible.

I once heard somebody say that if you aren’t testing the traffic that’s on your site right now, you’re losing that traffic forever. So that’s a lost insight for you. So the goal here is to make sure that you’re constantly testing every user that’s coming to your site to gain additional insights.

The second is to build a result repository that can be referenced when new ideas are needed. 

So now when somebody is coming up with a new innovative idea, you have data to either support that idea or, give the reason that that idea likely won’t work for your customers.

And then commit to finding at least one customer insight from each test. And this is where I see really good testing programs become amazing testing programs. If you’re reading your test result as win versus loss, you are not getting all the value you can from every test. 

You should have at least one insight from every single test you run and learn at least one new thing about your customer beyond this feature working and that feature not working. 54% of innovating organizations have trouble bridging the gap between the innovation strategy and the larger business strategy.

And I wanna highlight this by telling you about how a services site that I worked with uses consistent testing to support innovation. So this company had a C-suite push for innovation. A pretty regular C-suite push for innovation. What tended to happen was conversion rates would drop, and then an executive would come to the digital team and say, “Hey. We need to innovate.”

Something’s happening. We’re not growing fast enough. We need innovative ideas right now. And this was their historic approach to that. They would push for that innovation.

Other projects would go on hold. Competitive research would be sped up, we’d rush all of the analysis to come up with these ideas, and ultimately launch with something that they weren’t sure would work or not. We’d be hitting that panic button and trying to launch something as quickly as possible. So instead, the next time this happened, we developed a strategy so that we would have a way to address that panic before it happened. And the way we did that was we developed quarterly alignment on business goals.

So the CRO team was in lockstep with the executive team on business goals. What that first allowed them to do was come up with a testing roadmap that fed up into those goals, which was great. So they were executing test ideas that supported business goals right from the get-go. And then they analyzed and developed insights based on those tests, documented the results, and iterated on the learnings. It was developing this process, constantly feeding new ideas that supported business goals into a testing roadmap and then into insights, that allowed the organization to constantly be either innovating or developing learnings that would help them innovate down the road.

And that’s where iterative testing comes in and can be very powerful. Because if you’re developing an insight for every test that you run and then you’re iterating on those test ideas, you’re creating a culture of innovation that is constant rather than that panic, that CEO coming in and saying, “Hey. We need to do something.”

We have a plan that’s constantly in place through testing. So what this looks like now: there’s a results repository filled with meaningful, actionable insights. And if you’ve ever had a results repository before, maybe it’s worked for you in the past and maybe it hasn’t. But what I found is that your repository doesn’t need to be fancy; say it’s in a spreadsheet that can be accessed by anyone in the organization. The key here is to send out that repository in reminders as much as possible. I’ve sent out that repository with test updates every single week for my clients so that they can see, hey, this is where all the results live.

And I can access them at any point; otherwise, I’ll forget. And it looks like a roadmap that already pushes innovative ideas forward. So when that executive comes down and says, hey, we need to be innovative, how are we doing this?

Then we can make sure that that executive knows exactly what is going on and that we have an innovative plan going forward. And there’s just a deep organizational understanding of that core customer.

The second thing that helps innovation is conducting upfront user research. And so this requires pretest data for innovative ideas, and pretest data can come in the form of surveys, focus groups, user testing, and just observation.

I’ve done user testing in a way that, you know, I’ll even take a few people I know from that target market and I will observe them and ask them questions. And, of course, we sometimes find that this step can be skipped because it’s not statistically valid. However, user research is used to try out ideas before committing time and money to a big project, asking questions that will help you uncover insights in ways that make that new product better, and then helping your optimization analysts know what to look for once the innovation is live. So don’t skip this step. I can’t reiterate this enough: this is your first step.

The testing is really what’s going to help you validate those ideas, but you can learn a lot from your user ahead of the actual test through user research. So I’m gonna talk now about how an insurance site used user research to lower innovation risk. 

If you know anything about the insurance industry, you know that there is a very short window when most customers can change their insurance plans, and that’s called open enrollment. So the majority of revenue comes from this one-month window at the end of the year, and it’s difficult to test due to low traffic throughout the year in preparation for that last month. We have these three quarters where we don’t have a lot of test traffic to access. And then the team wants to be innovative, but they felt bottlenecked by the short time frame and the inability to test.

So the team knew it was even more important to get things right because you can’t pivot quickly in that 1 month. You’re spending the entire year prepping for that month. And you don’t have a lot of options to say, “oh, this is how our test traffic is responding, and this is how we need to change during this time period.” So instead, they used the first half of the year to analyze and use user research to define their test ideas for the end of the year. This is what the calendar looked like.

In January, they would review the results from the year before. And then in February and March, they dedicated their time to user research. And then in April, they developed concepts. And then in May, they tested those concepts to see how their core customer was responding to them, so that by the time we got to June, dev work began, and we felt confident about the optimizations that we were launching. And so this created a consistent conversion rate increase year over year because we were using the data from the previous year, developing our insights, and then really leaning on user research to help us drive the process for innovations.

And so it was a really exciting outcome, for that team to be able to say,

“Every year, we don’t even have a lot of test traffic to do actual A/B tests at the beginning of the year, but by the time we launch these tests, we have a higher rate of getting our optimizations right because we’ve done so much user research upfront.”

And finally, the third way that we can help bridge that gap between innovation and CRO is to break innovations into several tests. Sometimes it is possible to test an innovation, and I think sometimes we forget this. To find out if there’s a way to take a really big project and break it into several small projects, ask these questions.

Are there any pieces of this innovation that don’t need to be tested? Are there any functional changes that can be tested on their own? Are there any pieces of the innovation that are particularly risky and can those pieces be tested on their own? And then can a multivariate test be used to lower risk? 

How an e-commerce store team tested into an innovative experience.

So this e-commerce team wanted to redesign an outdated product page. This is a particular client of mine, and you’ll notice that I’m not giving the exact innovation away here. You’ll see in the next slide that I’m using kind of a hidden example, but many of my clients have tried to redesign product detail pages. What typically happens is the client, or the innovation team, or the UX team will say,

“Hey. We have an outdated page.”

We want to update this page and make it better. And then they have all these different changes they want to implement. So the product team was responsible for creating this new product detail page, but the CRO team then flagged it as a high-risk change that would be difficult to test. And I see this, like I said, all the time. So the pants here are not the actual product detail page that we were testing, but this is an example of what I see often.

So we have this outdated product detail page and the UX team says, hey, I think we can make it better. So they add new image functionality. They add Afterpay so that users can pay in increments. They add icons to the buy grid and a sticky add-to-cart button. All of those items could be great for the business, but they also carry some risk.

Maybe the new image functionality isn’t as intuitive for the user. Maybe adding Afterpay distracts the user from making a purchase. Maybe the icons on the buy grid aren’t as explanatory as the wording alone. And maybe that sticky add-to-cart button gets in the way of someone’s buying experience. So to test this efficiently, we broke each of those items up into its own two-week test.

We lowered the confidence threshold to 80%, and we were comfortable using directional test results, knowing that eventually we would test the whole new feature on its own. What this allowed us to do was use A/B testing in a way that kind of broke the rules in order to get an understanding of how the user would respond, before launching the statistically significant product detail page redesign test. In this case, what tends to happen is you’ll find that one or two of those new features actually do hurt conversion, and that allows the team to quickly pivot, go back to the drawing board, and then redesign the page so that once the full page is tested, it sees a conversion rate increase. And that’s why the most important part of this strategy is to eventually test the full page on its own.
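To see why dropping the threshold from the classic 95% to 80% confidence makes those two-week mini-tests feasible, here is a back-of-the-envelope comparison using the usual two-proportion z-test approximation. The 3% baseline and 10% lift are made-up numbers for illustration, not figures from the case study:

```python
import math
from statistics import NormalDist

def sample_size(baseline, relative_lift, confidence=0.95, power=0.80):
    """Visitors per variant for a two-sided two-proportion z-test."""
    p1, p2 = baseline, baseline * (1 + relative_lift)
    z_conf = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    z_power = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_conf + z_power) ** 2 * variance / (p2 - p1) ** 2)

strict = sample_size(0.03, 0.10, confidence=0.95)       # classic threshold
directional = sample_size(0.03, 0.10, confidence=0.80)  # "directional" reads
# Relaxing to 80% confidence cuts the required sample by roughly 40%,
# at the cost of a higher false-positive risk.
```

The trade-off is exactly the one described above: faster, directional signals during the mini-test phase, backed by a full-confidence test of the complete page at the end.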

So we’ve covered three strategies here that will help you balance innovation and optimization. The three are committing to a testing roadmap, conducting upfront user research, and breaking new innovations into several tests. Following these three strategies will allow the CRO and UX teams to develop innovative strategies based on customer data and insights. So instead of scrambling when our C-suite asks for innovation, we already have data-driven innovation roadmaps ready to go. We’re launching innovative ideas that are user-tested so that the team will have confidence in the idea before it goes live.

That’s so important. I gave the example of the insurance company that had a roadmap every single year that led them to the end of the year. Maybe that’s not how your company functions. Maybe there just needs to be a smaller window where you can conduct that user research before the test so that you feel confident about the test idea before it goes live. And finally, building a test roadmap that allows the team to break innovation down into several small tests.

That was the PDP e-commerce example: instead of launching one risky big change on a page, a change to functionality, design, and a slew of other things, you’re breaking those ideas into three or four tests and running quick tests on their own before you launch the brand-new feature altogether. So now you know how to strike a balance between innovation and optimization, how having a consistent testing roadmap is the key to optimization, how upfront user research can support the innovation process, and the questions that you can ask when testing new innovative ideas.

So I’m going to turn this over to all of you, and I’d love to answer any questions that you have based on the presentation.

 

R:

Thank you, Sam, for walking us through today’s webinar. I think everyone had a lot to take away from what you shared. I see there’s one question in the chat box. My website isn’t getting a lot of traffic. They’d like to understand how they can use your third practice, breaking innovation into several tests, without running long tests.

 

SB:

Yeah. That’s a great question and one that comes up quite often with the clients that I work with. I think the key here is to combine your ideas into bigger ones. So maybe you need to combine, for example, the Afterpay and the sticky add-to-cart button so that there’s a more obvious change, and then test the other two later. The bigger the change, the less time you’ll need to see some sort of result. And then, of course, remember, this isn’t the end of the testing strategy.

You are then running the bigger product detail page test after that. So these mini-tests, as we’ll call them, are meant to help you determine whether or not that final test will be successful.

 

R:

Great. In case anyone else has any other questions, please feel free to use the chat window, and I think Sam will be happy to take those questions. We’ll wait for another couple of minutes. If not, I’d like to thank everyone who attended this webinar. We’ll be sending over the presentation and the recording of today’s webinar via email.

In case you have any doubts, any questions, you can reach out to me or Sam, and we’ll be happy to get in touch with you. Thank you.
