
The High-Impact Experimentation Framework Fueling EnergySage’s Success

Learn how EnergySage achieved a 28% experiment win rate and $4M in revenue with strategic testing, financial modeling, and cross-department collaboration.

Summary

Luke Richardson, Growth Product Lead at EnergySage, and Ben Labay, CEO of Speero, discuss how strategic experimentation has driven EnergySage’s growth, resulting in a 28% win rate across 35 experiments and over $4 million in revenue. Luke shares their approach, using the "build, measure, learn" flywheel framework, combining qualitative insights, UX research, and data analysis for ideation. Financial modeling ensures alignment with revenue goals, while cross-functional collaboration—including input from product, UX, engineering, and customer success—fuels innovation.

Luke emphasizes the importance of maintaining team morale with positivity, humor, and transparency, especially when many experiments fail. They also highlight how AI tools, like VWO’s test recommendations and Heap’s generative analytics, streamline ideation and analysis.

Key Takeaways

  • Cross-functional collaboration is key to ideation and impactful experimentation.
  • Positivity, humor, and transparency are essential for maintaining team morale.
  • A strong UX focus drives user-friendly and high-performing designs.

Transcript

NOTE: This is a raw transcript and contains grammatical errors. The curated transcript will be uploaded soon.

Ben Labay: Hey, Luke, live from Budapest here. I’m excited to do this session with you for this ConvEx event by VWO. I think this is my second or third time being a part of this event.

My name’s Ben Labay. For those joining in, I’m the CEO of Speero; we’re an A/B testing agency and consultancy. I’m excited to chat with you, Luke, coming in from EnergySage. The spirit of these events is to tell the story arc of a brand using experimentation to grow, and how that’s going.

What are some of the successes? What are the learnings, et cetera? I’ve known you, Luke, for a couple of years now.

We’ve met in person a couple of times in Austin and chatted a ton about the program. So first, just an intro type of question.

I’d like to get to know you as a business professional. Shed some light on your professional journey. What was the story arc that took you to where you are today?

Luke Richardson: Sure. And yeah, I just want to say thank you, Ben, for inviting me and giving me this opportunity. I’m excited to be here, live from Boston, on the other side of the world.

Yeah, I started my career working for Lyft as a growth operations person, helping them when they were about a 30-person company. I helped them establish their driver footprint in Boston. I’ve been working in high-growth startups basically my whole career, as well as, for a brief period, in large corporations, and I really love that space. I love the ability to make an impact and wear a lot of different hats, to use the cliché.

So I started with Lyft, then continued the trend of working in tech, but moved from general growth ops into what you would call digital growth. I worked for a number of different companies, started out as an SEO practitioner, had some success there, and then broadened my skill set from there. I think people out there who have done SEO typically end up falling into analytics and conversion as well, right?

Because the second the traffic starts coming in, the next question from the CEO is: how are we going to convert it? So yeah, I started as an SEO person and got strong there, then naturally started getting into A/B testing, analytics, and then broader digital: paid marketing, even web development and UX.

And that’s what led me to what I do now, which is more of a growth PM, a product person, where I’m not just thinking about traffic anymore. I’m thinking about converting the traffic we acquire. What does the UX look like? How are we measuring it? That entire journey.

But yeah, I basically started, like I said, as a scrappy growth person working for Lyft, and I’ve gone through my career through various practitioner and now leadership roles into product.

Ben Labay: Yeah. I’m really curious about the part where you’ve worked for both small startups and larger enterprises. What are the differences you see in the growth approaches there? My personal experience with successful CRO people in general, and with you more specifically, is that I see this form of what I might call agency: the willingness to be proactive and to connect the dots between different disciplines.

You talk about wearing many hats, for example, doing SEO, doing this and that, and connecting the dots. I think a true growth professional, that growth function, needs to do that. They need to have a lot of agency.

They need to connect the dots, but that can look very different in a small startup versus a large enterprise. So how would you comment on that?

Luke Richardson: Yeah. Two things. To give specific examples, I led testing programs at AT&T in their business unit, AT&T Cybersecurity, and I’ve led testing programs for a number of smaller, more startup-sized companies.

I think the big thing, one, is that typically the tool sets are different. Big enterprises might be able to justify the expense of something like Optimizely, a very advanced, very high-powered tool, whereas smaller companies will typically use a lower-cost tool. So there’s an obvious difference there. I’ll leave that for those who have thought about it.

I’ve used both Optimizely and other tools, so I know the difference. But more culturally, frankly, the big thing I’ve learned is that when you’re working for an enterprise, you have to deal with legal, brand risk, and compliance, and it obviously slows you down. Every test you want to run, you just want to ship it and keep the velocity going.

But when I worked in a large enterprise, for every single test you actually have to pause and run it by someone and ask, can I run this? It’s just a copy test, and half the time the answer is, no, this needs to be reviewed. So it certainly slows you down. Those are two examples of the pros and cons from my particular experience.

Higher-powered, more advanced tool sets, but you have to do these checks, which obviously changes the whole dynamic of the team you’re running when every little thing becomes, well, we used a word that might not be approved. It just changes the way you think about your testing program.

Ben Labay: I mean, the conveyor belt might have more bends and elbows, but can’t you get the same velocity if you just shove more into the conveyor belt? I don’t know. Well, shifting over to EnergySage more specifically on the same topic: what does the growth program function look like there, considering all of these nuances? Talk about that conveyor belt, every branch, and who all contributes there.

And I don’t know if you want to talk a little bit about what that growth product function looks like at EnergySage.

Luke Richardson: Yeah, happy to. We have absolutely been running a velocity program at EnergySage. We are owned by a large corporation, Schneider Electric, but we have pretty strong autonomy as a business unit.

So some of those hurdles I just spoke about, we don’t actually have to deal with. We don’t have to go through legal and risk for every single thing we test, which allows us to move pretty quickly. In terms of who contributes: technically a broad swath of people from the company contribute by offering insight and ideation, but the core group has typically been two engineers in the pod, one UX designer, and a PM, which is the role I’ve played. Then we generally have representation from our analytics and BI team, who will attend and consult on certain data.

But typically the core team can do enough with our self-serve tools. So that’s the core team. What I will say is that, in terms of contributing to the velocity of the program, many other people within the company have been crucial. Specifically, I’ll name our customer success unit, which we refer to as our energy advisors. That group has been crucial for the growth program because they are as close to customers as it gets. I include them in all of our ideation and brainstorming meetings for growth, and we usually have a couple of people from that team who are core reps.

I can’t tell you how many wins we’ve been able to deliver because someone from that team saw something we were about to run and said, have you thought about the fact that the customer is going to be freaked out by that word because of X, Y, Z? And we’re like, no, of course not.

What? You know? So they play a crucial role. But again, the core group is your classic product trio, right? Engineering, product, UX.

But yeah, customer success plays a role, and certainly folks from marketing play a pretty critical role. So there’s some crowdsourcing there, but that’s the team that is delivering the program.

And you asked me, I’m more than happy to speak about how we deliver it, but I’ll pause there and let you go; I’m sure you have follow-ups.

Ben Labay: Yeah, that’s really interesting around customer success. I’ve personally been a little bit fascinated lately, especially with who contributes: what does that pod makeup look like, and how is it a derivative of the broader organizational structure? So tell me, at EnergySage over the past two, three, four years, as you’ve gained velocity and your program has evolved, are there certain roles or skill sets that have become increasingly important? Does that question resonate? Maybe nothing comes to mind, but yeah.

Luke Richardson: Yeah. A couple of thoughts come to mind. One, we figured out over time what the right growth engineer role was that we needed.

When we started the program a couple of years ago, we had this idea that we were just going to place an engineer or two in the pod and they were going to behave somewhat similarly to other engineers, just move faster. We learned over time that it’s not like that. It’s a completely different role from a core software engineer in a different pod. What we found is that you really need a sort of scrum-master engineer in growth: someone who is doing some work to build tests but is always helping de-risk the launch of the test that’s supposed to go live that day or the next day. Because what I found is that maybe half, or 40 percent, of all tests fail in QA or smoke testing in some way; something comes up that you had no idea was going to happen with the testing tool or with your code base, something weird that stalls everything.

And we found this amazing growth engineer who was just really good at de-risking our releases. She would say, oh, this is what we’re about to run, let me just quickly try to get something like this live in 20 minutes to see. And quickly she would be able to find out: hey, I just poked at this for 30 minutes, and this is going to fail if we do this or that. Having that role has been critical. It just took us some time to learn that engineering in growth is a totally different function from engineering in a core product. But yeah.

Ben Labay: Yeah, and I’ve got a particular client with that exact issue, so that actually helps me a lot, Luke, on a little side thing. So that’s great. I love that answer.

Let’s shift upstream a little bit, though. There are a lot of people who are involved or can be involved. You’ve mentioned before that your leadership approach focuses on positivity, humor, and transparency. That sounds really good, so I want you to elaborate on that a little bit.

How do these values manifest themselves, and how do they impact velocity and the program generally?

Luke Richardson: So when it comes to running an experimentation team, I’m sure you’ll relate to this: most tests typically fail. And so it’s really important that you maintain positive morale. It’s really easy to get down; it’s just the law of numbers and averages.

You’re going to have a period of time when 10 tests in a row lose. That’s going to happen, and there’s nothing wrong with that, right? That’s just part of the game. So what I’ve definitely learned, and it’s been part of my own self-growth, is that as a leader it’s really important that you maintain space for positivity, humor, whatever.

Sometimes I like to say: this isn’t the emergency room. We’re doing software. It’s okay. You know what I mean?

Just little playful jokes to make people feel okay. Because I have seen it happen before where the team will really start to get down and be like, oh, we haven’t had a big revenue win in three weeks. And then the team really starts to talk about that.

And it starts to really bubble up. What I’ve found is that that’s not productive. The growth team always has urgency; they’re always aware of the remit of the team and the ability for impact.

So I feel like my role is not to further stress that and augment the tension. My role is to say, hey, let’s keep the natural ideation, creativity, and collaboration going. And again, I think the critical thing is to keep people focused on the fact that tests are going to lose. We’re still learning.

You always get learning value from a test, every single time. So there’s something there that I’ve had to learn, and I wasn’t perfect, I wasn’t good at it at first. There were times when I would get bummed about a losing streak myself, and I had to learn to think about the impact I’m having on the team by acting that way. That’s just been something I’ve had to adjust. But yeah, that’s been very effective for me.

Again, especially when you’re going through quote-unquote losing streaks. Yeah.

Ben Labay: That reminds me of a couple of things. One, it’s kind of the middle of Maslow’s hierarchy of needs: safety, psychological safety, where you need that foundational piece before you get to what’s at the top, which is the ability to have the agency to be self-critical while leaning in and staying engaged. It also makes me think of Seth Godin.

He’s got some great quotes on this exact topic, in his book The Practice in particular, where he says: if you ask somebody who doesn’t have a lot of good ideas how many of their ideas have failed, they will say, well, not many at all.

But if you ask somebody who has a lot of really good ideas how many of their ideas have failed, they’ll have a ton of failed ideas and mistakes and failures, et cetera. So if you want good ideas, you have to have a lot of bad ideas, and eventually a good idea will come out of it. Seth Godin has some really good quotes on that.

Yeah, cool. So I want to shift into the mechanics a bit, get a little more granular in my question set here, talking a little bit about the flywheel.

You talked about the roles; we talked about who’s in the pod and a little bit of the leadership approach, or at least some of the values and culture there. Talk to me about the mechanics of your flywheel in more specifics, if you will, eventually going towards some case study stuff, what has come out of that flywheel. You’ve had some really good success, and we’ll get to that in a second, but just the mechanics of that flywheel, and I don’t know if you can share anything on velocity. The way I look at it, just to give you my context, is that there are a lot of ingredients for experimentation.

You’ve got people and roles. You’ve got the mechanics of the tools and the data. Of course, you’ve got the independent data, the customer behavior and what’s going on there. But the recipe, so to speak, what you cook from these ingredients, is that flywheel.

Right. And you can cook a different type of recipe depending on your set of ingredients. But ultimately, what that recipe provides is velocity of learnings and impact of learnings, right?

So those are the two flavors, if you will, that might vary from that recipe. So describe your flywheel to me in as much detail as you want, and maybe you can use some of that language I just threw out to you.

Luke Richardson: Sure. Yeah. So when I think about the flywheel, I like to structure it as that classic build, measure, learn loop.

And so that’s the flywheel we use. A couple of things; I’ll start with where the ideas come from, because that’s typically where people ask me, where do you start? A couple of things have been really effective for us.

The number one thing, and I’ve already kind of alluded to it, is qualitative patterns that are shared with us directly, either from firsthand accounts from customer support or from directly listening to customer interviews. We meet with those teams regularly and we look for patterns, and they come up. Things customers are afraid or worried about are typically a really good opportunity to think about for CRO.

So that’s one area ideas come from. Another is UX pattern research: looking at well-respected analogs in our space that are similar to what we do, which is a two-sided marketplace.

There are plenty of great analogs out there, so we study patterns and look for obvious overlap. If six signup flows are all doing this one thing, we should probably try some version of that, that kind of thing. And quant.

We use Heap for our product analytics, and we do a lot of obvious funnel analysis and effort analysis: where is the attrition happening, weird drop-offs you wouldn’t expect for certain segments, just getting into the data, which is more the role that I play.

So those are the three things that typically bring forth the ideas and allow us to put together a long roadmap. I would say what’s been critical for us is that we’ve found a really effective prioritization approach, and that starts to get into the build. We do use RICE, which I would imagine many people listening to this are familiar with: reach, impact, confidence, effort. We’ve adjusted it a little bit.

We’ve changed the I a little bit. Instead of what it’s traditionally used for, we just use a revenue baseline, because with CRO it’s all about the baseline, right, in terms of the upward size of the prize. So anyway, that’s what we use for prioritization.
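(For illustration only: a minimal sketch of a RICE-style score where the impact term is swapped for a revenue baseline, as Luke describes. The function, field names, and figures below are hypothetical, not EnergySage’s actual model.)

```python
# Hypothetical sketch of RICE scoring with the "I" replaced by a revenue baseline.
# All numbers and names are made up for illustration.

def rice_revenue_score(reach, revenue_baseline, confidence, effort_days):
    """reach: visitors exposed per month
    revenue_baseline: annual revenue flowing through the page or funnel step (USD)
    confidence: 0.0-1.0 belief the test will move the metric
    effort_days: engineering days to build and launch
    """
    return (reach * revenue_baseline * confidence) / effort_days

# Example: a high-traffic signup step vs. a low-traffic settings page
ideas = {
    "signup_thank_you_page": rice_revenue_score(50_000, 1_200_000, 0.6, 3),
    "settings_page_copy": rice_revenue_score(2_000, 80_000, 0.8, 2),
}
for name, score in sorted(ideas.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:,.0f}")
```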

So again, we’re just applying the classic RICE framework that people have used. And then when it comes to measure and learn: you start with this proven methodology for finding the ideas, you use proven prioritization frameworks to ensure you’re biasing towards your highest-reach testing frontiers, so you’re spending your time wisely, basically, and then we use these crowdsourced collaboration meetings. We call it test review and ideation. We have a bi-weekly meeting where we include folks from all over the company, and we use those meetings both to review recent tests and to talk about what we want to run next.

And so that’s a really important part of the process as well; that’s where a lot of amazing ideas come in. What I’ve found is that if you get 10 people with quite different backgrounds in a room every now and again, you’re always going to be smarter as a testing program, because the homogeneous groupthink that will start to develop even in a small growth team gets stress-tested when you’ve got someone from customer success, a different developer, and someone from marketing all in the same room. More ideas just come out.

So anyway, those are some of the ways we do ideation. In terms of execution, just to really complete the question: we’ve done a good job, I think, of not over-installing process. The rest of the pods in our company use classic agile scrum, two-week sprints, all that stuff, and we found that we don’t need to do that. We really operate on a one-week kanban sort of thing: on Monday, what do we want to deliver by Friday? That’s more how the team operates. We’re not planning two weeks or months out. We’ve found it’s kind of a fool’s errand, in my experience, to try to plan way out, because three days from now we may have learnings from tests that are in market right now that fundamentally change the whole plan. So we’ve really tried to instill agility; we’re very, very agile. So yeah, we use this kanban method. And again, the whole goal of the program is velocity.

If we can stick to those frameworks I just mentioned, we’re not just throwing things at the wall; we’re prioritizing, we’re taking things through product fundamentals. But if we can do that with speed, the general instinct is: ship it, just run it, start it. We’ll come back in two days, and if we have something better, we’ll turn it off.

But let’s just get tests out, as long as they meet the standards of the team. So yeah, those are some of the things we do in terms of execution. I don’t want to go too much deeper, because I could speak for hours on this, but those are some of the things we do in terms of our build, measure, learn flywheel.

Ben Labay: I want to do a quick lightning round on top of that, because as you went through that I thought, oh, I’ve got a question there. So, thematics: you go in, you find customer problems, you collect a bunch of data from different things, UX best practices, talking to customers, observing problems, et cetera. Do you theme any of your problems and then structure lines of tests out of those problems? It could be as simple as tagging sets of tests to a particular problem theme that you focus on for a quarter or something like that.

Luke Richardson: Yeah, absolutely. One basic tagging we’ve done before is addition versus suppression testing themes. Sometimes we’ll literally say, this quarter let’s make suppression testing a point of emphasis, because we’re hearing and seeing evidence of UX overload, you know what I mean?

So when we start to see that, either from qualitative feedback or even just quant, like a lot of primary CTAs not being clicked or whatever, you start to get indications that there might be some disorder there. That’s one thing I’ve seen done before. Other themes I would say: we’ve used the fear, urgency, and doubt

Ben Labay: Mm.

Luke Richardson: ...framework before, right?

Ben Labay: Yeah. Yeah.

Luke Richardson: Those are just classic motivators. So yeah, there have been times we’ve said, let’s try a FOMO, a fear-of-missing-out theme, this quarter. That’s really working for the paid team in their messaging; let’s try it further down the funnel.

And so those are a couple examples.

Ben Labay: Okay, the next one is on the prioritization side of things. You mentioned RICE and how you’re basing your impact score on a financial model, and I’m actually looking to do that too.

We use PXL. We do a lot of numeric scoring to give us an ordinal prioritization framework. But on a couple of guinea-pig programs I’m starting to really push to have revenue be the prioritization, and in fact, on the confidence-and-ease side of the equation, to have the level of effort be a normalizing factor on the financial dollar amount.

So, for example, if a test is worth a hundred thousand dollars but it’s a medium level of effort, you apply a multiplier of, say, 0.8, and that test is only worth 80,000; whereas if it’s something that’s faster to market, it gets a multiplier of one. In the end you get one column, one metric, and it’s money to focus on. I’m exploring that a little bit. My question to you is: do you?

I mean, you’ve got to have a financial model there, right? Do you have a financial analyst? Do you talk to finance a little bit? What’s the relationship there?

Or maybe you’ve just built a napkin-math financial model to help you run your program, and you don’t talk to finance at all. Yeah, talk to me about that.
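(For illustration only: a minimal sketch of the effort-as-multiplier idea Ben describes above, where each test’s estimated dollar value is discounted by a level-of-effort factor so the backlog sorts on a single money column. The multipliers, test names, and values are hypothetical.)

```python
# Hypothetical sketch: discount each test's estimated dollar value by a
# level-of-effort multiplier so prioritization reduces to one money column.
# Multipliers and backlog entries are made up for illustration.

EFFORT_MULTIPLIER = {"low": 1.0, "medium": 0.8, "high": 0.6}

def prioritized_value(estimated_value_usd, effort_level):
    return estimated_value_usd * EFFORT_MULTIPLIER[effort_level]

backlog = [
    ("hero_copy_test", 100_000, "medium"),    # worth $100k, medium effort -> $80k
    ("checkout_badge_test", 60_000, "low"),   # worth $60k, fast to ship -> $60k
]
for name, value, effort in sorted(backlog, key=lambda t: -prioritized_value(t[1], t[2])):
    print(f"{name}: ${prioritized_value(value, effort):,.0f}")
```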

Luke Richardson: Definitely, yeah, I definitely talk to finance. I sit next to our head of BI, who technically doesn’t report to finance, but for what you’re asking, he is really the representative source of truth for what our financial data says from a testing standpoint. He and I are quite aligned on the appropriate way to translate a funnel lift, at a given level of dilution or impact, into what it actually means for the business. Because we’re in a two-sided marketplace where we’re effectively generating leads for our supplier partners, it’s quite simple to focus on, okay, we got this many more approved properties, because we’re mostly helping people go solar.

We know, and I’m not going to share the financials, that there’s a certain dollar amount that each net new approved property is worth to us. So it’s very easy to say, this test clearly gained us about 75 net new properties per month, and we can do some quick math to get to an annualized revenue estimate. That’s the foundation of what we’re doing for all of that.

And then we’ve just injected that into RICE, so that the core revenue impact is weighted at the same level as the traffic reach, the level of effort (we call it story points, but it’s essentially engineering days), and the confidence.
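(For illustration only: a minimal sketch of the back-of-the-envelope translation Luke describes, turning net new approved properties per month into an annualized revenue estimate that can feed the revenue term of the RICE score. The per-property dollar value is a placeholder, since the real financials are not shared.)

```python
# Hypothetical sketch: translate a test's funnel lift into an annualized revenue
# estimate for the revenue-weighted RICE score. The per-property value is a
# placeholder; EnergySage's real figure is not shared in the session.

VALUE_PER_APPROVED_PROPERTY = 400  # USD, placeholder

def annualized_revenue(net_new_properties_per_month, value_per_property=VALUE_PER_APPROVED_PROPERTY):
    return net_new_properties_per_month * value_per_property * 12

# Example from the conversation: roughly 75 net new approved properties per month
print(f"${annualized_revenue(75):,.0f} per year")  # $360,000 per year with the placeholder value
```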

Ben Labay: Yeah. Nice. Nice. Well, can you share any particular success stories coming out of the flywheel?

Right? My little rapid-fire questions on top of that are over. So, what’s coming? What’s spitting out of the flywheel?

Any surprising experiments that you’ve run recently? Any key learnings that have come out of it? What can you share about that?

Luke Richardson: One test that we recently ran that I was particularly proud of, that was really cool: in our signup flow, we had always had a thank-you page at the very end, after you complete your registration and create your account.

We always had this thank-you page, like the one I’m showing. And honestly, the team was not proud of the page at all. There was a consensus that we should be able to beat this page. It’s a lot of text.

It’s not particularly motivating. We’d never really found a design concept that was compelling enough to replace it. Then, through some design pattern research (I’m not going to name the specific analogs), we found a pattern that a lot of analogs were using, which I’ll show you right now: this concept of a blurred, elusive reward is really what it is, right? Instead of just having text on a white screen, the thing users are trying to do in our case is finish their signup so they can get the quotes from local contractors. And this was a massive win.

I won’t share financials, but it was a 20-plus percent conversion lift to our funnel. It’s a really big win, and one that the team was particularly proud of, because it’s always fun when you have a win where, A, it worked, it was a quantitative win, but B, the UX is something the team is more proud of. And this is that case.

The team was very proud of this UX and very proud to bring it to market, because frankly it looks better, it looks sleeker, and it’s better for our brand anyway. So this is one recent example I’m proud of where, refreshingly, the hypothesis we had and the research we did were just exactly right. That is so often not the case, but in this case it was.

Ben Labay: Nice. Nice. Well, cool. So, looking forward: we went through the background, what you’re up to now, and what’s coming out of the flywheel. But looking forward, what advice would you give to companies maybe just starting to build out a culture of experimentation, or trying to scale experimentation?

They’ve got a small centralized team and they’re looking to add more to it, eventually getting to those ultimate outputs of velocity, impact, et cetera. This might loop us back to what we talked about at the beginning around psychological safety and such.

But I just wanted to ask it really explicitly. So what advice would you give?

Luke Richardson: Yeah, a couple of things. It might sound a little cliché, but it is important to first and foremost establish a culture of testing. What I mean by that is, if you want to run a high-velocity testing program, people need to broadly understand what testing is and how it works. Someone should be going to all-hands, or posting in your general Slack channel or whatever it is, to help people understand: here’s what we’re doing. If you hear us say statistically significant, this is what we mean by that. Here is the magnitude of impact to the business if we achieve even this small a lift in revenue. Just help people understand that this isn’t some nerds in a room doing science for fun; this can be a massive lever for the business and its trajectory. I think that’s step one, because I don’t know that it’s always understood, especially if you’re working in a totally different function, in sales or whatever. Why would you know what experimentation is?

It’s a very mystified concept. So anyway, that’s step one: does everyone know that this is really important? Does everyone know what we’re doing and what the basic terminology is?

Not crazy nerdy, just statistical significance. That one term is pretty important for people to understand.

That’s step one, so that you start to gain buy-in. What happens when you do that effectively, and this is what I experienced, is people will actually reach out to you and say, hey, can I attend your test review meeting? I think it’s super cool what you guys are doing.

Can I get involved? You get those moments. So that’s one thought I would have. Another one: I do recommend including some product fundamentals in a testing program. I know that not all CRO functions use product management fundamentals, and that’s fine; there’s no one right, perfect way to do it. But we’ve spoken a little bit about prioritization, and we’ve spoken a little bit about methodologies for build, measure, learn. I do think having some kind of organization and feedback loop is important.

And that can be tricky, because testing is all about speed, wins, velocity, agility, so it can feel almost regressive to slow down and put in some sort of structure. But what we’ve been talking about today, especially with prioritization, really making sure you’re taking some sort of weighted, data-driven, revenue-driven approach to it, is critical.

So I guess those are just two thoughts. I could, again, speak about this for hours, but: make sure the company understands what you’re doing, so you can get that momentum and light that fire. And second, don’t skimp on the fundamentals. Don’t fall into the trap where you’re literally throwing things at a wall and you can’t really explain why you’re doing what you’re doing, because that won’t scale. It won’t sustain. You might get a couple of wins, and then eventually it’ll fade to darkness.

Ben Labay: Yeah, nice. I kind of want to answer this question myself, because I’m not here just to interview. And I...

Luke Richardson: Please do. Yeah.

Ben Labay: I do. I do get the benefit of you going first, so I get to reflect on your answer. I think my advice for companies just starting out, or really trying to ramp up and scale their experimentation programs and culture, is twofold. From the top: get alignment from leadership.

What that means tactically is awareness of leadership’s role in being a coach and a leader and providing data to the teams below them: having it be their role to get data to the teams below them who are on the front lines working with customers, so those teams can make decisions autonomously. That’s what I mean by alignment from the top. It’s not even that they need to understand experimentation, but they need to understand the need to get data to the people on the front lines making decisions. Then, from the bottom up: I think you can change culture from the bottom.

And I think what really helps is having a focus on two sides of the same coin: being customer-obsessed, thinking about what customer insights, problems, and opportunities you’re focused on, and being very concise and clear, having clear customer insight statements, et cetera, but pairing those with the financial model and the money, because that’s a Rosetta stone for being able to talk across the org, like you said. So we’re customer-obsessed, we focus on customer problems, but we translate that. I’ve been looking at that through a quantitative and a qualitative lens.

The quantitative is the financial model, and the qualitative is the statement: this is how customers are behaving or perceiving things now, and this is how we wish they behaved or perceived things, right? There have to be clear qualitative statements, but they have to be paired with that financial model. Just to repeat myself a few times there for emphasis. Cool.

So yeah, looking ahead, what are you most excited about, with your program, with your career, with your work? There’s AI coming in to assist that flywheel. What are you digging into now?

What are you excited to learn? Yeah, what are you learning about right now?

Luke Richardson: Yeah, I mean, everyone’s talking about AI, so I’ll speak about AI briefly. Two of the primary tools we use have introduced AI directly into the tool set: Heap for product analytics and VWO for testing.

They both have some version of AI. VWO has started doing AI-driven test recommendations; I don’t know if you’ve played around with that. We have started using it.

What I’ve found really empowering about it, which is really fun, is when new members, new engineers or UX people, rotate onto the team (we’ve often done rotations in growth), it’s a really fun way to just let an engineer go poke at it a little bit. It’s a fun way for them to quickly get comfortable with what test ideation looks like, just by playing with the AI.

And then on the analytics side, I will say, and no one’s paying me to do this, but I’m going to plug it: Heap’s generative AI has been a game changer for our program. It’s unbelievable. Being able, in a second, to prompt Heap and say, hey, what is the conversion rate for this traffic segment in the past 14 days, please compare it to the same 14-day window six weeks ago, and I have a chart in five seconds. I mean, game changer, because you can do the analysis in the meeting, when you’re in the stand-up. Whereas without that, you won’t do it, because it’s going to be a rabbit hole; you’d have to say, well, let me build a quick analysis, and everyone pauses and you lose momentum. So anyway, those are two things I’m really excited about.

So yeah, I have been spending much more time getting comfortable using both of those tools, as well as AI more broadly. I’m excited about that. And then I guess I’ll just say, more recently I’ve really been trying to brush up on my UX skills. I work with phenomenal UX partners, so obviously, as the product person, it’s not on me to deliver UX, but I’m interested in UX, and working with really smart UX people has helped me start to see patterns and have that eye for things that I definitely didn’t have a few years ago. So that’s an area of my skill set that I’m actively trying to improve and round out, because it’s fun, and a lot of our best wins have come from this kind of UX pattern recognition. So those are a couple of things I’m focusing on.

Ben Labay: Nice. Yeah, I’m excited about the AI front as well: ideation sessions, data meta-analysis across tests, patterning, things like that. I think it’s key.

I love your example. We’ll get that clip to them to use as a testimonial, I’m sure. So we’ll move on.

Finally, a tradition in these VWO ConvEx sessions: what books are you currently reading? If you’re not a book person, what series? Or maybe some resources for UX learnings, or something along those lines.

Yeah.

Luke Richardson: Yeah, I am reading a book right now that I feel is just a classic for people who work in tech: Thinking, Fast and Slow, which people have been telling me to read for five or ten years. So that’s been fun. And, more just playful, on the TV series front, I am loving the new House of the Dragon season two, huge shout-out. If you haven’t seen it, I think it’s way better than season one. I think it’s better than some of the original Game of Thrones seasons.

It’s really good.

Ben Labay: Yeah. Nice.

Luke Richardson: I don’t know, those are two. How about you? I want your answer.

Ben Labay: I’ve actually watched a couple of the first episodes of the new season, so I need to get deeper into it; it’s been a while. I’m watching a ton of the Olympics right now and really loving a lot of the Olympic stuff. On the work side of things, I’m reading Product Operations by Melissa Perri.

Product ops is a newish role, three or four years old, in the product world, and I think it could be said that experimentation is a product ops skill set. So I’m really curious; I’m kind of fascinated by how experimentation can be injected into organizational flywheels.

That’s from a product standpoint, but analogously there’s marketing ops, creative ops, and everything ops these days, including experimentation ops. So I’m fascinated by all of those overlaps. Cool. Well, man, thanks.

Thanks, Luke. I appreciate the interview. I think that’s it; that covers us really well. Again, thanks all for listening in to this VWO ConvEx session.

Thank you, Luke, for joining us. I’m Ben Labay from Speero, and that’s my last word. Luke, any last parting words?

Luke Richardson: I’ll just echo that: thank you to those who listened in. Hopefully you learned a thing or two, and if not, we enjoyed spending some time with you.

Ben Labay: Thanks all.

Speaker

Luke Richardson

Staff Product Manager, Growth, EnergySage

Ben Labay

Research Director, Speero
