Webinar

Your First Steps Into Personalization

Duration - 45 minutes

Key Takeaways

  • Use A/B tests to confirm or refute a behavioral hypothesis; this helps you understand what drives your digital users and what impacts your revenue and conversion rates (see the sketch after this list).
  • Document every detail of your A/B tests, including the name, reason, hypothesis, setup, screenshots of experiments, results on the main KPIs, learnings, and conclusions. This comprehensive report will help in future analysis and decision-making.
  • Distinguish between results and learnings. Results are a summary of your findings, while learnings answer key questions like whether the hypothesis was confirmed, what you learned about your customers' needs and motivations, and how you would change your approach based on the results.
  • Use evidence-based prioritization to get the most out of your meta-analysis. This will help you build on your learnings and successes, leading to more winners and strengthening your data analysis with more proof.
  • Share your insights and learnings from your experiments with other departments in your organization. This can benefit online and offline marketing, product innovation, and even higher management.
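To make the first takeaway concrete, here is a minimal sketch, in Python, of the kind of significance check that confirms or refutes an A/B hypothesis: a two-proportion z-test on conversion rates. The traffic and conversion numbers and the helper name two_proportion_z_test are hypothetical, purely for illustration; a testing platform will normally compute this for you.

    # Minimal sketch: does variant B's conversion rate differ from control A's?
    # All numbers here are made up for illustration.
    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        """Return z-score and two-sided p-value for conversion rates A vs. B."""
        p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error
        z = (conv_b / n_b - conv_a / n_a) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))             # two-sided
        return z, p_value

    # Hypothetical example: control converts 412/10,000; variant 480/10,000.
    z, p = two_proportion_z_test(412, 10_000, 480, 10_000)
    print(f"z = {z:.2f}, p = {p:.3f}")   # p below 0.05 would support the hypothesis

A low p-value supports (never proves) the behavioral hypothesis; either way, both the result and the learning belong in the test report described above.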

Summary of the session

The webinar, led by Johann from AWA Digital, focuses on the importance of learning from experimentation and A/B testing in business. Johann notes that typically only about 25% of experiments result in an improvement, but the key is to learn from all outcomes. He introduces the concept of a 100% learning rate, which reframes a 75% failure rate as a source of valuable insights. He also discusses the importance of understanding the customer journey across your website and digital products, and how that knowledge can be used to prioritize test ideas and experiments.

The webinar is interactive, with attendees encouraged to ask questions throughout. Johann also highlights the success of tech giants with an experimentation culture, showing how their growth outpaces that of other companies. The goal of the presentation is to give attendees easy tweaks to increase learning and reduce bias, using data analysis to identify which hypotheses to address.

Webinar Video

Top questions asked by the audience

  • Is it worth doing any personalization on a low-volume website?

    - by Shane
    Very good question, Shane. Of course, you have to define low volume, but I would take it a step further and ask whether low-volume websites are worth A/B testing at all. Going back to this spectrum over here (I hope you can still see my screen, Shane): with any website, even the huge retailers I work with that have masses of data, we start over here. This is where you're going to make the biggest impact. And with many of those big retailers, it's quite a while, a year or longer even, before we start stepping into some of these areas. It's not strictly sequential, it runs in parallel, but certainly, whether it's low traffic or high traffic, I would encourage you to exploit the larger audiences, these two on the left here, as much as possible before you start really investing in the smaller audiences. Don't be scared to venture into it, because you'll learn that way as well. But with a small audience, and again you have to define that, I'd be inclined to grow and optimize through A/B testing and acquisition first, and make the most of that audience before I start slicing it. In your specific case, you've really got to treat it on merit. Before I was with the agency, I worked with a client that was a relatively small eCommerce business, without massive traffic numbers. In that case, it made sense to personalize based on location, because it was clear in the numbers: there was such a distinction, such different behavior, between the two dominant locations. So that's the call we made there. I hope that helps, Shane.
  • Hi, Johann and Vipul. Thank you. Since we were talking about low-traffic websites, especially around personalization there: one technique that is quite useful in such a scenario, when we do CRO, or experimentation, or whatever you want to call it, A/B testing, is user research, where we actually talk to people and understand what their challenges are while completing a journey. How do you see that working in the case of personalization? When you're doing this, say, talking to 20 people, you cannot cover as many personas as you would like to for personalization, though it may work for A/B testing. So how do you see that working here?

    - by Rajneesh
    That's such a good question, and thanks for asking it, because this is quite an important topic on its own. The way I like to think of it is validation, validating ideas. With A/B testing, we're assuming you've got a big enough sample, a big enough audience, as you alluded to, to be able to run A/B tests. And of course, the smaller your audience, the smaller your website, the less accessible and maybe the less relevant that is to you. But that's not the only form of validation. It happens to be a very good form of validation, the best form, but there are other ways to validate concepts, theories, and hypotheses if (a) you don't have enough of an audience, or (b) even if you do have enough traffic, you want to validate the concept before you get to the execution. Let me think of a quick example, because we actually do a fair amount of this. The principle I like to use at any point, when you're talking about a particular idea or concept, is: what is the next thing I need to learn in order to move forward? And what is the best, cheapest, and quickest way to learn that thing? If the answer is an A/B test, great. But a few steps before that, maybe the answer is a couple of customer interviews. Maybe the answer is a survey, or some prototypes that I put in front of users or customers. Something I've done with a retail client, an omnichannel business with stores as well as a website and an app: we go into the store, approach shoppers, and give them a wireframe, often a hand-drawn wireframe, so it's very rough. The kind of thing we'd ask is: where would you click, and what do you expect would happen next? Of course, we've got a goal; we give them a scenario, and they show us on this piece of paper: okay, I click there. And then the question: what do you expect next? On just a very rough, low-fi prototype, you get that immediate feedback. Based on that feedback, we then quickly drew another wireframe and showed it to other shoppers. Maybe we've got a sample of 20-30 more in a day. That's great, but this is not about sample size the way A/B testing is. There, that scientific rigor, that statistical rigor, is incredibly important; that's the point of A/B testing. The point of these exercises, this early-stage validation, is not statistical significance or Bayesian probability; it's not that kind of rigor. It's about validating theories and assumptions. Any idea, any new concept, is based on assumptions, and what we're really trying to do is find the quickest, cheapest route to validating those assumptions. Early on, that's customer interviews; it's not about sample size, it's about validating and moving forward. Later on, when you get to the execution, it takes the form of an A/B test. I hope that makes sense.
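To make the distinction Johann draws concrete: the statistical rigor that belongs to A/B testing (and not to early-stage interviews) can look like the Bayesian check sketched below. This is a rough sketch under stated assumptions: Beta(1,1) priors, made-up traffic numbers, and a hypothetical helper prob_b_beats_a; testing tools such as VWO compute this kind of probability for you.

    # Rough sketch: estimate P(variant B's conversion rate > control A's)
    # from Beta posteriors via Monte Carlo. All numbers are hypothetical.
    import random

    def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=100_000, seed=42):
        """Monte Carlo estimate of P(rate_B > rate_A) under Beta(1,1) priors."""
        rng = random.Random(seed)
        wins = 0
        for _ in range(draws):
            # Posterior for each arm: Beta(conversions + 1, non-conversions + 1)
            rate_a = rng.betavariate(conv_a + 1, n_a - conv_a + 1)
            rate_b = rng.betavariate(conv_b + 1, n_b - conv_b + 1)
            wins += rate_b > rate_a
        return wins / draws

    # Hypothetical example: control 412/10,000 vs. variant 480/10,000.
    print(prob_b_beats_a(412, 10_000, 480, 10_000))   # e.g. ~0.99

A probability like this only means something with real sample sizes, which is exactly why the hand-drawn-wireframe interviews above aim at validating assumptions rather than producing numbers.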

Transcription

Disclaimer: Please be aware that the content below is computer-generated, so kindly excuse any potential errors or shortcomings.

Vipul from VWO: Hey, everyone. Welcome to the VWO webinars. Just give us another 30-40 seconds to let more people join in, and we will kick-start the official session. I see more people joining in. So ju ...