
Managing and Scaling Experimentation Programs at Global Companies

Dive into Simon and Johann's chat on tackling the tricky bits of global experimentation, from team vibes to cultural quirks.

Transcript

[NOTE: This transcript is pending review and may therefore contain errors. The final version of the transcript will be available soon.]

[00:00:00] Vipul Bansal: Today, we’re going to have Simon Elsworth, who is the Global Head of Experimentation and Digital Analytics at Whirlpool Corporation. 

[00:00:17] This is going to be an interesting conversation, which will be moderated by Johann, who is the COO at AWA Digital. So I would request both of you gentlemen,

[00:00:27] Simon and Johann, to switch on your cameras and mics so that everyone can see you. There you are. Great.

[00:00:36] So, I’ve already spoken enough. I’ll just jump off the stage now and hand over the mic to Johann.

[00:00:44] Johann van Tonder: Simon, good to see you again. Global head of experimentation and digital analytics at Whirlpool.

[00:00:53] That’s a mouthful, man. What does it mean? 

[00:00:56] Simon Elsworth: Sounds fancy, right? 

[00:00:59] Johann van Tonder: Can you give us some colour on that role and the responsibilities? What do you actually do?

[00:01:05] Simon Elsworth: Yeah, I’m not quite sure. That’s what my mum asks literally all the time. I guess in simple terms, most of my day is basically just taken up by talking to people which is a lot of fun, right? 

[00:01:24] Sometimes it’s specifically about testing. Sometimes it’s specifically about analytics and data. Sometimes it’s just about what people are thinking about, and how I can help them solve it, how I can help them achieve it.

[00:01:40] But I guess, realistically, other than the kind of obvious, help us sell more stuff, help us create a better experience, help us cut costs, reduce wastage,

[00:01:53] I guess the main thing for me about my role is that it’s basically about enablement, right? And I guess most of this is done day to day by building out a global community around experimentation within Whirlpool.

[00:02:12] So the focus is around evangelization, both top down and bottom up. If you think about a business the size of Whirlpool, running multiple markets, multiple brands globally, right?

[00:02:29] We’ve genuinely got multiple teams across multiple markets that are all working on CRO, A/B testing, experimentation, use whatever terminology you fancy.

[00:02:44] But I guess for me, as the kind of global head, my real focus is to work on connecting all of these teams and helping them to really find a sense of belonging with their peers. And if you think about distributed teams,

[00:03:05] they can genuinely be across different regions, different locations. A lot of the time, the people in these teams that are out in the markets, running tests every day, have no idea that there are other groups of people in the same organization that are basically doing the same job, right?

[00:03:26] So I think a lot of the time is focused around trying to break down those kinds of organizational silos. I think that’s probably one of the most important things that I can do within my role.

[00:03:41] So that we can start to be more and more global, we need to create the ability and the space for experimentation, for people to share these learnings with each other globally, and to make sure the right capabilities, the right tooling, the right frameworks are all in play, right?

[00:04:04] And I guess thinking about helping all of these people that are out there feel connected to both Whirlpool and their career, like understanding that there’s a career path out there for analytics and experimentation.

[00:04:21] Even within Whirlpool, there’s plenty of scope to move around for jobs within that space, and I guess for me, the better we get as an organization at sharing, the better everything becomes, right? And that’s not just about focusing on individuals, it also matters for Whirlpool.

[00:04:43] The more we can share learnings globally and the more we can build a global understanding of our customers, the better we can serve those customers, the better experience we can build and I guess at that point, everybody wins, right? 

[00:04:54] I think the one thing I don’t do, and anybody that’s actually worked with me or knows me will be very supportive of this, is I don’t run tests.

[00:05:10] I don’t build tests, I don’t analyze tests. Everything I do is around sort of like enablement and community building.

[00:05:19] Johann van Tonder: And orchestration.

[00:05:21] Simon, there are a couple of points you made there that I want to pick up on. And by the way, a reminder to the audience: if you have questions for Simon, please pop them in

[00:05:31] and we can feature them now. Simon, you spoke about a career there and I want to touch on that. If somebody listens to you as global head of experimentation at Whirlpool, a massive organization, what is your advice to somebody who’s at the bottom of their career and wants to end up where you are now?

[00:05:51] Simon Elsworth: That’s a really good question and we’ve only got 30 minutes. So, I’ll try to distill it into a short point. I was quite lucky. I started in experimentation like 20 years ago, right? Before experimentation was even a thing. 

[00:06:08] This was before even the first iteration of Google Optimize was around. So for me, I spent a lot of time working in a field that didn’t exist.

[00:06:22] So my career path is probably going to be very different to most of the people that are joining now. I think the best advice I could maybe give is to try and understand when and where you need to be really technical.

[00:06:42] I think, and maybe we’ll cover this in depth a little later, but we spend quite a bit of time as an industry talking about really technical things, which, as a practitioner, you need to know.

[00:06:59] But I think if you want to really push a career and you want to get to a more senior position, you need to understand how to package up these technical and, let’s be honest, sometimes quite boring concepts for people to understand.

[00:07:23] One of the things that I’m going to say I’ve learned, but I feel more like it was taught to me by a couple of my previous people leaders is, 

[00:07:35] it’s very much understanding that sometimes the role that you’re in is not necessarily the role that you think you’re in. And part of your role, whilst you wish it was all the funky stuff,

[00:07:49] A/B testing, running tests, designing tests, measuring tests, wherever your passion lies, a lot of the time actually needs to be spent on things like change management, understanding how to coach people through changes and how to bring in new skills.

[00:08:04] And I think that would be my advice for somebody looking to build a career in experimentation: understand the wider fringes of it and how you can sell that upwards within an organization, without getting too attached to the technical details.

[00:08:24] It’s hard. It’s definitely something that I struggled with a few years back, that kind of transition from practitioner to strategist. I don’t know, it sounds a bit fancy.

[00:08:37] Johann van Tonder: We’re going to definitely get into the dirty, boring, dull technical stuff as well. I’m sure there are many geeks on the call who would like to get into that.

[00:08:50] But Whirlpool is, at least from the outside, perceived to be a traditional organization with a huge history. I think it’s over 100 years old.

[00:09:00] Do you have any idea what led to that initial investment in experimentation at Whirlpool? What did leadership want to achieve? Any insight into what the drive behind it was?

[00:09:11] Simon Elsworth: This would be a very short interview if I just said no, right? So yeah, and this is maybe quite a boring, strategic type of answer to the question, but in real terms, the decision to create an experimentation function within Whirlpool was basically driven by an understanding that, fundamentally, customers are moving online and we need to be able to service and engage these customers through the medium that they’re choosing, right?

[00:09:48] So we, the royal we, at Whirlpool created a global strategic imperative. This is a top-down global alignment to win the digital customer journey, and I guess how we decided to support that imperative from a digital view was to create a global digital center of excellence, or expertise.

[00:10:20] I always pick the wrong one; it’s one of those two. Basically, the idea of the group is to accelerate digital learning and digital capabilities within Whirlpool at a global scale, right?

[00:10:34] So this is about trying to make sure that digital expertise and those digital skills can basically start to run like blood through the organization, right?

[00:10:45] So people understand that it just becomes a way of working and it’s not just experimentation, not just analytics. 

[00:10:54] So, my team, we cover all of the core digital skills. We have a specialist looking after CRM, a specialist looking after CRO, martech, cybersecurity. Pretty much every digital skill that you could think of, there’s a version of me running that workstream.

[00:11:14] But traditionally, the view is definitely, yeah, Whirlpool is this manufacturing company, like 110 years of history. I think, selfishly, what I really liked about my initial engagement with Whirlpool during the interview phase and all that kind of thing was that we’ve got this huge global mega brand, right?

[00:11:45] Multiple brands, multiple markets, multiple countries, and in all honesty, that kind of exposure to and knowledge about digital, particularly digital experimentation, is pretty low, right?

[00:12:00] But the thing that I soon picked up on was, although they know very little about digital experimentation, which is fair, not many organizations do, right?

[00:12:15] They were basically about to embark on a huge global experiment about how to create a global experience, right?

[00:12:25] So everything in my team is an experiment, the global setup of the team. We’ve got specialists who are based in EMEA, we have specialists based in Latin America, in São Paulo, in North America.

[00:12:39] We have people that are fully globally remote. I think I was literally the first globally remote employee at Whirlpool. 

[00:12:49] I think it was things like that that made me think, hey, this is a company that, whilst they don’t really know that they’re into experimentation, they actually are in a big way. They’re actually doing experimentation in a way that a lot of organizations would dream of.

[00:13:08] Even organizations that have best-in-class A/B testing teams are not running experiments through how they hire, how they start, how they build a workforce. Right?

[00:13:19] And I think that’s the thing. Whirlpool has always been an experimentation-driven business. They just didn’t really know it.

[00:13:27] I don’t want to spend too much time on it because obviously we’ve got a lot to get through, but I can’t think of another organization whose experimentation culture has enabled them to put a fridge on the International Space Station.

[00:13:46] If you think of the business and organizational changes, the experimentation and prototyping it takes to work with somebody like NASA, you know Whirlpool is good at experimentation.

[00:14:01] My role is sometimes backwards compared to a lot of other roles. I’m not trying to teach some of the other areas of the business, like manufacturing, how to run experiments. I’m trying to teach the digital teams that we already run experiments.

[00:14:19] So why can’t we do that in digital? Which feels like a very strange way of looking at things. But yeah, I think that’s where it came from: this understanding that we need to know more about digital, we need to know more about how to service customers online.

[00:14:39] Johann van Tonder: Now, we are going to get slightly more technical. Let’s start with the velocity versus quality debate that you see on LinkedIn and Twitter and everywhere else.

[00:14:49] Where do you, Simon, sit in that debate? Do you believe in running more tests, or do you believe that it potentially poses a threat to quality?

[00:14:58] How do you find that balance? 

[00:15:00] Simon Elsworth: I guess there’s an easy answer to this, which is, there is no primary, it’s both, right? You need to understand both. I think if you look at this honestly and openly, running a high volume of low-quality tests just to hit a velocity KPI is clearly a bad idea.

[00:15:28] But from the other side, running one test a year because you’ve spent 10 months researching and trying to create this perfect hypothesis is, again, a bad idea.

[00:15:42] And I think one of the things that’s interesting that I kind of like about experimentation is you get a chance to kind of cross pollinate ideas from other streams of experimentation.

[00:15:56] We talk about having guardrail metrics for A/B tests, right? We put these metrics in place to understand if technically a test is doing what we think it is in the early days, right? 

[00:16:06] But I think when you think of your experimentation program, velocity and quality become almost guardrail metrics for your program, right?

[00:16:17] And reliance on those as guardrails will flex over time. I think as you get more experienced, more mature, maybe you’ll come to rely on them less.

[00:16:28] But on the velocity point, one thing that I’ve learned in my experience across multiple teams is that a lot of the time it’s better to do something and deliver something than just talk about it.

[00:16:47] So you need some element of velocity in there, right? Obviously, you don’t want to deliver something that’s garbage. But sometimes, it’s better to deliver something than just carry on talking about it for six months. What’s really interesting about this question overall for me, though, is that it’s actually bigger than this, right?

[00:17:07] So one of the things that you need to understand when you’re thinking about metrics is very much the KPIs that you choose for your organization or for your business or for your team. They will define the experience that you build, right?

[00:17:25] So, for example, if you’re purely focused on velocity, the chances are you’re going to end up running a lot of individual small changes because you need to hit that velocity number. 

[00:17:39] I’m not saying this is the wrong thing by any stretch of the imagination, but if you specifically chose revenue as your primary KPI that’s going to define what sort of test you run.

[00:17:53] Maybe you’re going to spend a lot of time focusing on winning experiments because you’ve got a revenue number to hit. 

[00:17:59] And maybe your focus is going to be much more in the kind of bottom end of the funnel, because that’s where it’s easiest to impact revenue and maybe you’re going to miss out on a load of interesting things at the top of the funnel that are not a hundred percent aligned to revenue.

[00:18:15] But I guess the most important thing when it comes to this debate is understanding your organization. You need to understand what metrics your organization is going to get excited about, and frame the output from your testing team around those metrics. And again, they can change over time.

[00:18:37] So if you’ve got a senior leadership team that are super excited about running X amount of tests, then give them the information that they want and maybe it’s not the best KPI, but if it gets them excited, it gets them on board. 

[00:18:52] That’s a way in to then start to influence and change and educate and get them to start thinking about different KPIs and I think that’s probably the most important thing is understanding what metrics you need in your organization to make your team look good, right? 

[00:19:12] Johann van Tonder: I like your guiding philosophy there of just stop talking and start doing something. It reminds me of a quote.

[00:19:18] I think it’s Marc Randolph, the co-founder of Netflix, and I’m going to butcher it, but paraphrasing,

[00:19:24] he said something like, you’ll learn more in a week of testing than in a year of talking about something. We have another 10 minutes or so left with Simon. So if you have any questions for Simon, please squeeze them in now.

[00:19:37] Simon, how do you deal at Whirlpool with concurrent experiments, multiple experiments running at the same time? Is there any fear of potential cross-contamination? How do you handle that?

[00:19:48] Simon Elsworth: So I guess I can answer this. I’m going to keep this pretty short because we’ve only got like really no time.

[00:19:55] But I’m actually a huge believer in not doing that, in fact, almost doing the exact opposite of that. I think understanding how one experience or one experiment impacts another experiment, and how that impacts another experiment in the customer journey, is something you really need to understand.

[00:20:18] I think the thing that’s important is we need to try and replicate the real world as much as possible. And for me, understanding that, let’s say, variant A works better together with variant B from another experience is much more powerful than just understanding that a variant from the first experience performs better in isolation, right?

[00:20:46] However this comes with a huge caveat that you need to be able to measure this impact, right? You need to be able to understand what that interaction is for it to be useful. 

[00:21:00] And for me, if you can’t measure that, if you don’t have the ability to measure your interaction impact, I think you should probably reassess the things that you’re worrying about. You should probably stop worrying about trying to create isolated experiences and put that effort into understanding how to measure the interaction effect.

[00:21:26] Because that will give you much more value over time and yeah there are huge limitations with this when it comes to real science, right?

[00:21:37] But I think this is where I deviate from some people in the kind of industry and probably some people on the call will be like literally screaming.

[00:21:45] Fortunately everybody’s on mute so it’s fine. Yeah it has huge limitations. But I think as an experimentation team, our focus needs to be around building a better product, building a better experience, rather than focusing on building perfect science.

[00:22:04] And again, it goes back to the thing, it’s better to do something than nothing, right? So I think yeah, it feels like a bit of an easy answer to say, don’t worry about it. 

[00:22:14] But you need to be able to measure it, and it’s probably one of the more important things that you could figure out as an experimentation team: how to measure that interaction.
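
To make that idea concrete, here is a minimal sketch of how an interaction effect between two overlapping experiments might be measured. It illustrates the general approach rather than Whirlpool’s actual setup: the data is simulated, the column names (exp_a, exp_b, converted) are hypothetical, and it assumes visitors are bucketed into both experiments independently. It fits a plain linear probability model with an interaction term using statsmodels.

```python
# Minimal sketch (assumed setup): two experiments overlap on the same traffic,
# and we estimate how much the combined experience differs from what the two
# individual effects would predict on their own.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 20_000

# Each visitor is independently bucketed into both experiments.
df = pd.DataFrame({
    "exp_a": rng.integers(0, 2, n),  # 0 = control, 1 = variant of experiment A
    "exp_b": rng.integers(0, 2, n),  # 0 = control, 1 = variant of experiment B
})

# Simulated conversion rate: small lifts from A and B individually,
# plus an extra lift only when a visitor sees both variants together.
p = 0.10 + 0.01 * df["exp_a"] + 0.01 * df["exp_b"] + 0.02 * df["exp_a"] * df["exp_b"]
df["converted"] = rng.binomial(1, p)

# Linear probability model with an interaction term. The exp_a:exp_b
# coefficient is the interaction effect described above.
model = smf.ols("converted ~ exp_a * exp_b", data=df).fit()
print(model.summary().tables[1])
```

If the exp_a:exp_b coefficient is close to zero, the overlapping tests are not meaningfully contaminating each other; if it is large, the combined effect is itself a finding worth acting on.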

[00:22:25] Vipul Bansal: Right and at this point, we’ve gotten a lot of questions from the audience. So we’re going to pick only the ones which are not very tactical in nature. 

[00:22:37] There’s one question, as you can see in front of you right now. Unfortunately, Mentimeter is not showing me the name of the person who asked this question.

[00:22:44] So we have to read it out. Yeah, the question is: you have responsibility for both experimentation and digital analytics. In most companies, that’s not combined.

[00:22:56] So what do you think is the biggest gain, and what is the biggest risk, in combining the two?

[00:23:04] Simon Elsworth: That’s a really great question.

[00:23:06] I think there’s a really honest answer to this when it comes to Whirlpool: the reason it’s combined is we didn’t have enough resource and headcount to bring in two distinct specialists.

[00:23:19] My background is in analytics. How I got into A/B testing was through a career in data warehousing, what would be badged as data science these days, and digital analytics.

[00:23:34] So the reason it’s combined is purely because I could do both which is great for me. Yeah. 

[00:23:42] But there is a risk, and it’s something that came up when I was lucky enough to talk to Stefan Thomke about experimentation. One of the things we talked about is that if you have a specialist that does two roles, there’s always going to be one of those roles that becomes the primary, right?

[00:24:07] And in all honesty, the likelihood is it’s going to be analytics because everybody wants data, everybody wants reports but people can sometimes take or leave the kind of testing elements of it.

[00:24:19] I think that’s the biggest risk but the biggest reward is you have the same teams who are looking at the numbers, building out the data strategy, building out the data governance strategy, who are then also building out the experimentation measurement strategy. 

[00:24:39] So you have a full alignment with the kind of business KPIs and that becomes easy, like getting everything integrated into the analytics stack becomes much easier if you’re all on the same team.

[00:24:50] So there’s definitely pros and cons either way.

[00:25:01] Vipul Bansal: Yeah. Thanks, Simon. So here’s the second question.

[00:25:08] Johann is your audio working? 

[00:25:10] Johann van Tonder: Yeah, are we doing the question on the screen? Simon, it’s a big one. I’m sure you’ve had it before. How do you change the culture of a company that doesn’t currently believe in experimentation? 

[00:25:24] Simon Elsworth: Yeah, that’s a big question. I guess you really need to show the value in the things that you’re doing and how you’re making a difference, right?

[00:25:37] And that’s the only way you’ll do it. It’s slightly easier to answer from the other way around. The way that you won’t change it is by showing everybody how technically amazing you are, how statistically strong you are, what your statistical prowess is. That’s how you won’t change it. I think the single biggest answer to this is: get somebody else to do it for you.

[00:26:06] And that’s the real best way to do it. If you think of yourself as an experimentation expert and you’re in the business, you’re going to meet your leadership team, your senior leaders, whoever and talk about experimentation. 

[00:26:21] You’re the guy from the experimentation team who’s telling people that they should be running experiments, right?

[00:26:25] You’re a little bit biased and people will understand that. It’s basically the same as anybody cold calling you about a new analytics product or whatever. We all get this day in, day out, right?

[00:26:37] It’s the best thing in the world. The salesman is the most passionate about this new tool and I think a lot of the time when you think about what you’re doing as an experimentation expert within an organization, you’re basically doing the same thing.

[00:26:50] You’re basically cold calling your leadership team and saying hey, this thing that I’m super passionate about you should do it. 

[00:26:55] The easiest way to get around that, again, is to get somebody else to do it. So one of the things that made a huge difference in a previous role was getting our finance team on board. When you have a finance team that is going around asking, hey, why aren’t you running experiments?

[00:27:17] This team are running experiments and they’re giving them all this data, and we can understand the ROI for the things that they’re delivering, and all of this cool stuff. Why aren’t you doing that? That’s how you get buy-in.

[00:27:26] In a role before that, it was the compliance team. We had the compliance team saying to other teams, hey, these guys are running their changes as tests, as experiments,

[00:27:38] so they can do it safely, they can switch them off, all of this kind of stuff we can do within a controlled environment.

[00:27:44] Why aren’t you doing that? And I guess it comes down to: find the team within your organization who gets to say no to everything, and there’s always a team that gets to say no to everything, whether it’s finance, compliance, legal, whoever that is.

[00:28:03] Get them on board, focus all of your energy on getting them on board, and get them to evangelize experimentation for you.

[00:28:11] That’s the way to change it. That’s the way to change an organization. Either that or go work in a different organization.

[00:28:19] Johann van Tonder: It’s such a big question. I’ve got one last question for you, Simon, but I’ll squeeze in a quick perspective on this first, because it’s one that comes up quite often.

[00:28:32] The strategy that I like to advise people to try is the same one you would use to approach experimentation or CRO or product management: you start with the customer, right?

[00:28:45] And the way you do that is you understand the world of the customer and then you figure out how you can improve the world of the customer. Use that exact same principle with your stakeholders. 

[00:28:55] So instead of trying to sell your stakeholders experimentation, have a conversation with them about what they’re struggling with, what their challenges are, what their pain points are, and help them figure out how they can solve those with experimentation.

[00:29:10] Then it’s an entirely different conversation. You’re no longer trying to convince somebody to do stuff. You are helping them solve it. 

[00:29:18] Simon, we’re out of time. One last question. Any recommendations for books, blogs, podcasts for people that are interested in online experimentation? 

[00:29:30] Simon Elsworth: Yeah, I mean, I’m not a huge reader, really.

[00:29:33] I tend to find that when I’m not at work, I like to disconnect from work, so I tend not to read a huge amount of these kinds of textbooks.

[00:29:44] But a couple of good reads I can definitely recommend. The obvious one, Experimentation Works by Stefan Thomke, is a great book, genuinely a great book.

[00:29:56] Everybody should read that if you’re interested in experimentation. I guess you should particularly read the part about the team that I built at Sky. 

[00:30:05] I think it starts at about page 83. One thing that’s been really helpful in my current role is a book by a lady called Emily Webber, Building Successful Communities of Practice.

[00:30:21] Again, really great for understanding how powerful communities can be within an organization and how to successfully build and manage them. In my first few days, weeks, months at Whirlpool,

[00:30:33] that was super interesting and super, super useful for getting my head around how to connect with people. A little from left field, maybe: Alchemy by Rory Sutherland is a great book.

[00:30:46] I really enjoy the kind of behavioral science sort of vision and views that it has. I think there’s a lot of learning that people in experimentation can do by reading more into that subject.

[00:31:00] But if you were going to ask me one thing, like what’s the one thing that you can do to understand experimentation better, I would say, 100% above all else: talk to people, talk to as many people as you can.

[00:31:20] I guess like the value that I’ve had from an hour chatting to somebody going through the same struggles, going through the same problems, trying to do the same things in different organizations, that’s immeasurable. 

[00:31:35] I mean, Marianne’s on next. I’ve spoken to Marianne a few times. Just things like that have really helped me understand more about what I need to do, hearing different opinions.

[00:31:47] So yeah, read books if that’s what you’re into, do the thing that works for you, but build a network of people you can trust and learn everything you can from them.

[00:32:01] I think the single most important thing that I’ve done in my career is lean on the shoulders of all the experts that have gone before.

[00:32:13] Johann van Tonder: Thank you, Simon. Good advice again.

Speaker

Simon Elsworth

Global Head of Experimentation & Digital Analytics, Whirlpool
