Webinar

Turn Visitors into Buyers with the UAM Method

Key Takeaways

  • Implement the UAM (Understand, Analyze, and Monitor) method to turn visitors into buyers. This method is data-backed and research-based, ensuring high-impact testing programs.
  • Conversion Rate Optimization (CRO) is not just about improving conversion rates and average order value, but also about optimizing a business towards profitability. Monitor other metrics like return rates, frequency of purchase, and revenue per session.
  • CRO is not just about A/B testing. While A/B testing is a key part of CRO, it's not the only thing involved. It's about optimizing the most important factors on a website to improve the on-site experience.
  • Keep in mind that while the primary targets are conversion rate and average order value, if the business isn't profitable or making money, these metrics don't matter.
  • Always consider the potential impact of tests on other aspects of the business, such as returns rate. A successful test is not just one that increases conversion rate, but also one that doesn't negatively impact other important metrics.

Summary of the session

The webinar, hosted by Divyansh, featured Will Laurenson, CEO of Customers Who Click, who shared valuable insights on enhancing customer experience and boosting conversion rates. 

Will emphasized the importance of understanding customers’ desired outcomes from a product and how a website can motivate them to make a purchase. He also discussed the strategic use of scarcity and urgency, cautioning that these tactics must be used carefully to avoid negative impacts. 

Will also addressed the importance of proper testing and avoiding premature conclusions based on initial results. He stressed the need for sustainable long-term improvements and customer satisfaction, suggesting the use of usability optimizations and tackling customer anxieties. 

The session concluded with a Q&A, where Will addressed questions about data discrepancies across different platforms.

Webinar Video

Webinar Deck

Top questions asked by the audience

  • Hey, Will. Which framework are you using for test prioritization?

    - by Chetna Agarwal
    It’s based on the CSL framework; we've just tweaked a couple of things that we think are more appropriate, but it's heavily based on that.
  • I’m a very small business owner. Is there a program someone like me could afford? What do you say is the base rate for basic help?

    - by Sarah Brendle
    It's a little difficult to answer that without knowing what the business is, because that would determine what the best fit would be. If it's a small B2B business, for example, it's going to be very, very different. We're performance-based, so we don't use base rates. So it's a little difficult to answer, but I guess you need to speak to some people, and just be wary of anyone who is charging too little and who promises the world, saying they get their clients 40% uplifts in conversion rate and big claims like that, because either they're lying to get your money, or possibly they're not testing properly. Generally, what will happen with a test is you'll see a big spike in conversion rate at one point. But by the time you finish the test after 2 or 3 weeks, you end up with that kind of 5% uplift. I remember one, actually, where we genuinely thought it was going to be a massive win, an unbelievable win. After about 10 days of the test, we were up 140% in conversion rate, I think it was, which was just insane. But because it was only about 10 days in, I was thinking, wow, this looks fantastic. We ran it for a bit longer to make sure we got enough data through it, and it ended at about 8 or 8.9%, which I think was the final result. That's still a fantastic uplift from one test, but it just shows that if you get a freelancer who ends that test early because it's up 80% or something, that's not actually what you're getting.
  • What strategies or techniques do you suggest to your clients to not only improve conversion rates but also ensure a sustainable, long-term improvement in overall user engagement and customer satisfaction?

    - by Imran Shaikh
    Yeah, I guess that is kind of the UAM methodology, really. If you're making usability optimizations, you should improve user engagement. If you're tackling people's anxieties properly, and then also focusing on that motivation piece, that's going to help you build that kind of sustainable long-term improvement and customer satisfaction. So base it on research, and speak to your customers to find out what they care about, right? An important part of that is the wording, the voice of the customer. You could do that research, read through it, and say, yeah, that is what we're saying on the website. But you're saying one thing and not using the exact words they use, and so they don't understand it. One example: I was looking at a website yesterday for an audit that we're doing, and it asked for the security code for your credit card, the 3-digit code (well, I think it's 4 digits on American Express). Most people know it as CVV, I think, but they had CSC in their little form. It's a small thing, but because they're using different letters compared to what people know, that might be something that just puts people off. That's a very specific example, but it could just be the wording you use to describe the material that your product is made of, right? You might actually be describing the same thing, but some people just know it by a slightly different term.
  • What are the 4 tips for doing on-site research for a better buy journey?

    - by Manny Flores
    So we start with Google Analytics, and we'll try to identify where in the eCommerce funnel the biggest drop-off is, where we think that opportunity is. To be honest, the vast majority of the time it's the product page. Right? People just aren't selling their products well enough. They're not dealing with the anxieties and the concerns; they're not motivating their customers enough. That's generally where we start. Then we move to behavioral stuff, so heatmaps and session recordings. What are people doing on the website? Where are they clicking? Where are they not clicking? Are there dead clicks or rage clicks? But, really, we're looking at whether they are seeing information that we think is valuable to the purchase. So if you've got a video that you are adamant is responsible for converting people and is super valuable, then we'll look at scroll maps, for example. And a lot of the time, because these videos are halfway down the page, we'll find that maybe 50% of traffic sees them; they're not actually contributing as much as the brand thinks. Then we'll move to some surveys. We normally do email surveys now. I used to do a lot of on-site pop-ups, but we found them not as effective anymore; with so much mobile traffic, the pop-ups are just a bit more disruptive, and they can be a bit damaging to conversion rates. So we tend to do email surveys now: an email out to existing customers, and an email out to leads, people who have given their email addresses but not purchased. For the leads, we're asking, what's the number one reason you haven't purchased from this brand? And then for both groups, we're going into: what would you care about with these products? What are you trying to achieve with it? Why are you buying it? Why did you buy it? All these questions around people's behaviors and people's motivations.
And then the final piece of that is customer interviews. So we'll also get people on a Zoom call or Google Meet, and we just dive into that a little bit more. The interview gives you that opportunity to ask follow-up questions. You'll ask that first question of what's the number one reason you haven't purchased yet. They give you an answer, and then you can say, well, what is important to you about the thing they've mentioned, right? Why is that important? And then you can go deeper and deeper into that and discover what their key motivation is. A good example would be price: when people give price as their objection, it's actually very rarely price that is the problem. The problem is they haven't been sold on the product properly. Right? They haven't had their questions answered, they haven't been motivated by it, they're not seeing the value of that product, and that's where you dig in and go into things in a bit more depth. That's what the interview allows you to do. I suppose I'm going to add a fifth one, actually: when you run tests, always make sure you're learning and iterating from them. So if you get a positive result or a negative result, you should still be asking why. Why did this happen? Why did we get this result, whether it was expected or not, and what do we do next? So if you run a bit of social proof on your website and you get a positive result, the next step should be: how do we get a bigger positive result from this? How can we make this a better experience? Because testing is a bit like deploying MVP, minimum viable product, stuff. The first version of that social proof might be to put a before-and-after review into your product gallery. That works really well.
A positive improvement, and an easy test, because you've just got to get a review from someone and put an image in place. Then you might think, well, let's do a video version of this. Let's see if we can get someone to take a video of themselves before, and a video of themselves after, using our product, right? That's obviously going to take a bit of time to sort out, but that doesn't matter. You get that video in, and then you might find you get a bigger increase. And then you say, well, actually, maybe the next step for this is a little tab on our product page that says "see what our customers think" or "see our customers' results," where you can have a mix of both image and video before-and-afters. So you've got to keep iterating on it. You find something that you think is impactful at first, and then you go deeper with it.
  • Do you use Bayesian statistics to decide whether or not to declare a variant a winner? How do you ensure proper data collection and what do you do if there are discrepancies between your testing tool and internal reporting?

- by Chris Holland
    Yeah, we do. We use Bayesian statistics. We do quite a thorough analysis of a test to make sure it's sufficiently powered, that we've run enough data through it. And then we do some analysis on segments, channels, devices, that sort of thing, to see where the impact has really been. We also do quite a detailed analytics and audit setup right at the start of working with someone. I can't go into too much detail on that because it's the team who actually executes it, but we make sure people are set up properly, and we run some practice tests first as well to make sure that the data looks correct. At the end of the day, though, the data is always going to be different between tools. Ideally, we want it to be a very, very small difference, but between the testing tool, Shopify, and Google Analytics, for example, there are likely to be differences in all three of them.
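Will mentions using Bayesian statistics to decide on a winner without going into the mechanics. For readers curious how that decision typically works, here is a minimal sketch of one common approach, a beta-binomial model sampled with Python's standard library. The function name, the uniform Beta(1, 1) priors, and the example numbers are our illustration, not anything Will or VWO specified:

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, samples=100_000, seed=0):
    """Monte Carlo estimate of P(rate_B > rate_A) under uniform Beta(1, 1) priors.

    conv_* = conversions observed, n_* = visitors in each variant.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(samples):
        # Each variant's posterior is Beta(conversions + 1, non-conversions + 1).
        rate_a = rng.betavariate(conv_a + 1, n_a - conv_a + 1)
        rate_b = rng.betavariate(conv_b + 1, n_b - conv_b + 1)
        if rate_b > rate_a:
            wins += 1
    return wins / samples

# e.g. prob_b_beats_a(100, 2000, 150, 2000): 5.0% vs 7.5% on 2,000
# visitors each gives a probability very close to 1 that B is better.
```

A testing tool would typically declare B the winner only once this probability crosses a threshold such as 95%, and, as Will stresses, only after the test is sufficiently powered, since the probability swings wildly on small samples.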
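Earlier in the Q&A, Will warns about freelancers who stop a test as soon as it spikes (his test that was up 140% at day 10 finished near 8–9%). That "peeking" effect is easy to reproduce in simulation. The sketch below is our illustration, not Will's code: it runs A/A tests, where the two variants are identical so any declared winner is false, and compares stopping at the first significant-looking peek against a single check at the end:

```python
import math
import random

def p_b_beats_a(conv_a, n_a, conv_b, n_b):
    """Normal approximation to P(rate_B > rate_A)."""
    pa, pb = conv_a / n_a, conv_b / n_b
    se = math.sqrt(pa * (1 - pa) / n_a + pb * (1 - pb) / n_b) or 1e-9
    z = (pb - pa) / se
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def peeking_false_positive_rates(trials=400, peeks=20, step=250,
                                 base_rate=0.05, threshold=0.95, seed=7):
    """Simulate A/A tests and compare two stopping rules:
    declare a winner at the FIRST peek past `threshold`, versus
    checking only ONCE after all data is in."""
    rng = random.Random(seed)
    early = final = 0
    for _ in range(trials):
        a = b = n = 0
        declared_early = False
        for _ in range(peeks):
            # Both variants share the same true rate: no real difference.
            for _ in range(step):
                a += rng.random() < base_rate
                b += rng.random() < base_rate
            n += step
            if p_b_beats_a(a, n, b, n) > threshold:
                declared_early = True
        if declared_early:
            early += 1
        if p_b_beats_a(a, n, b, n) > threshold:  # single end-of-test check
            final += 1
    return early / trials, final / trials
```

Because the early-stop rule includes the final look, its false-winner rate is always at least as high as the single-check rate, and with 20 peeks it is typically several times higher, which is exactly why a test that is "up 80%" partway through should keep running.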

Transcription

Disclaimer: Please be aware that the content below is computer-generated, so kindly disregard any potential errors or shortcomings.

Divyansh from VWO: Let me just formally welcome you to VWO webinars. Hello, everyone. Thank you so much for joining the VWO webinar where we always try to upgrade and inspire you with everything around experimentation and ...