Webinar

Foundational Bottlenecks in Creating Experimentation Culture (and How to Overcome Them)

Duration - 40 minutes
Speakers
Arash Vakil

Startup Advisor & Tech Investor

Vipul Bansal

Group Marketing Manager, VWO

Key Takeaways

  • Prioritize your hypotheses using the ICE methodology (Impact, Confidence, Ease). This will help you rank your ideas and determine which ones to test first.
  • If you're not confident in the data you're gathering from your tests, pause everything and ensure the integrity of your data. Don't take action on data that could be incorrect.
  • Make sure you're tracking the right data elements to reach the correct conclusion. Your data must be accurate before you can move on to A/B testing.
  • To onboard operational team members to an online experimentation platform, consider demonstrating the benefits and ease of online testing compared to offline methods.
  • If you're unsure about how to conduct offline experiments online, don't hesitate to reach out to experts or use professional networks to gain more insights.

Summary of the session

The webinar, hosted by Vipul from VWO, featured Arash Vakil, Founder of Disrupt With Us, who shared his expertise on overcoming challenges in creating an experimentation culture within organizations. Arash, with his extensive experience in B2B and B2C verticals, discussed the importance of A/B testing and the fear of being wrong, which often hinders its implementation.

He emphasized the need for a cross-functional team to generate diverse ideas and the importance of starting small to build momentum. The webinar provided valuable insights into data-driven decision-making, hypothesis generation, and the importance of testing for successful business strategies.

Webinar Video

Webinar Deck

Top questions asked by the audience

  • I just wanted to understand from you since you've been a C-level executive for quite a long time now. So what is culture really? And why is it so hard to build anyway?

    - by Vipul
    Yeah. I mean, culture is a very, very big thing. Right? And there are a number of ways you can kinda answer, and it really starts from, you know, the top: how that individual leads and how you operate, those are all certainly things that play a role in your culture itself. Your workflows are part of your culture. In some organizations, they are so design-focused that you can't get away with trying to put out a test that isn't a beautiful design; they need to go through design cycles. So culture varies from organization to organization. It's super hard to build the right one that works for your organization, but over time, the most successful organizations figure that out.
  • What cross-functional teams do you recommend having in your working group, and how many is too many?

    - by Darren Russo
    Yeah. So one of the things I would kinda recommend is having a small pod of teams, a small group of teams, to at least get things started. Right? And so you may want a representative from marketing. You may want a representative from engineering. Because you'll be able to bounce off these ideas very quickly in real time: How feasible is this? Is this test too much for us to take on? Can we do it? Do we have the data to support the test? So I would think about, you know, a representative from some major organizations. So probably about 3 to 4 people, as kind of like the brain trust to at least get things started, and you could theoretically replicate that based on different products that you have within your organization.
  • How do you proceed with the hypothesis? Like, what are the key elements in the research? If I see a problem I think I need to fix, and I need to establish a hypothesis, what key elements should I put in my research before I decide what hypothesis I should have?

    - by Akash
    Yeah. So thank you for your question. So the question is basically about, okay, we've got a whole bunch of hypotheses, you know, what data do we need? And also, you know, which one should we kind of start with? So I didn't touch on this. It's prioritization, basically, the next step after gathering your hypotheses, because what you're going to want to do is basically rank these different ideas. There are many frameworks for this; I'll just use the ICE methodology for now: Impact, Confidence, and Ease. So impact is how impactful do you think this test is going to be, confidence is how confident are you in its ability to kinda move the needle of the business, and, you know, the E is ease: how easy is it to kind of get this test going? So you'll want to rank them based on that. And then you'll be able to have a ranked order of things that you should attack. I hope that answers your question.
  • What do you do if you're not confident in the data that you are gathering from all the tests that you're running or maybe all the insights that you're gathering?

    - by Vipul
    Yeah. So I certainly came across this in multiple teams. The first step is to pause everything you're doing and get that data correct, because no matter what A/B test result you put out, it will be questioned. It will potentially be wrong, and you don't wanna take action on data that could be incorrect. So you have to ensure that the integrity of the data is valid, and make sure that whatever you are testing, you track those data elements in order to reach the correct conclusion. So, certainly, you've gotta get your data down first in order to kind of move on to A/B testing.
  • Oftentimes teams are interdisciplinary and formed with different roles. But do you think that the members of these teams should necessarily share a common technical knowledge about this? How can one go about repeating the correct way of applying a test?

    - by Marcela
    Awesome question. Absolutely not. It is not required. Right? And again, it kinda goes back to what I mentioned earlier about having a diverse team from multiple groups. You want the techies. You want the non-techies, because with your different inputs, you're coming at the same problem; you're gonna try to solve the same problem in multiple directions. And so someone who's not technical may try to come up with, and I'm sorry to say this, a more human method of solving that problem. And that might be through copy changes. It might be through a design element, whereas someone on the engineering side, the very simplistic answer here, may want to see if, you know, they could speed up the transitions between the stages to kind of push people through your registration process, for instance. So to answer your question, it is not required to have a technical background when you're forming this kinda cross-functional group.
  • What would you suggest: should the experimentation team be spread across the world, or should there be a separate team for each region?

    - by Vipul
    Yeah. So I think it certainly depends on the organization. What I have seen are pods, basically, that attack it on a cross-functional basis, as well as teams that just attack mobile or just the web. Right? So the reason why is that you're going to have domain expertise, potentially, within that specific group that might be, you know, really important and very insightful in terms of developing those tests, or that group may have the historical knowledge in terms of, okay, why is it like that? And this kinda goes back to, you know, the Trello board that I kinda mentioned, right: you gotta put things in a repository and put as much data as you can in there. That'll help you refine your hypotheses moving forward, to develop better and potentially more impactful ones.

Transcription

Disclaimer- Please be aware that the content below is computer-generated, so kindly disregard any potential errors or shortcomings.

Vipul from VWO: Hey, everyone. Thank you so much for joining in for this VWO webinar. I really hope you and your family are safe inside your respective homes, and I wish you all good health. My name is Vipul, and I’m the Marketing Manager at VWO. I’ll be your moderator for today.

For those who are hearing about VWO for the first time, VWO helps you identify leaks in your conversion funnel and provides strategies to fix those leaks and keep your revenue growing. With me today is Arash Vakil, who runs a company called Disrupt With Us. The name is in front of you. He comes with over 15 years of experience in B2B and B2C verticals. Arash is also a product strategy consultant, startup adviser, and tech investor. Quite a lot of work on your hands, Arash. I’m excited to have you here today.

 

Arash Vakil:

Likewise. Happy to be here.

 

Vipul:

That’s great. So guys, before I let Arash start with his presentation, I just request all of you to ask any questions that you might have during the course of this presentation using the GoToWebinar questions panel. With that, Arash, let’s begin.

 

Arash:

Alright. So welcome, and let’s start talking. Thanks so much for being here today. I hope everyone is doing well and staying safe during these really, really weird times. So, our talk today: foundational bottlenecks in creating an experimentation culture and how to overcome them.

Our goals for the next 30 minutes: I’ll give you a little bit of background about us and our firm, and then we’ll go into what it takes to create an experimentation culture, what works, what doesn’t, what you’ll generally come across in trying to create one, and how to succeed despite those obstacles. As you may have already seen, our firm helps organizations by investing, advising, and building alongside companies at various stages in their life cycle. Some of the areas we help organizations with are conversion rate optimization as a path towards growth, SEO and ASO to help your apps gain additional traffic through increased brand exposure, and we also help companies both augment and support their existing teams. If you wanna chat about any of the challenges around those areas, we’ve come across them. So feel free to reach out and chat with us. At the very least, we’ll trade some war stories. So okay. So what does it mean to have a culture of experimentation? First, let’s cover what it certainly is not.

It’s not Pinky and the Brain; it’s not two people sitting in the dark scheming and trying to take over the world. And although it’s extremely powerful, it’s definitely not magic. A culture of experimentation consists of a methodical process where we generate and test hypotheses and then respond to the data to guide the next steps in our decision-making process. We do this because we know that experimentation leads to innovation. It is a process as old as time itself.

But how do we get to this promised land? And most importantly, why is there so much resistance? Let’s look at a typical org. No matter the size of your organization, you’ve got some level of hierarchy going on, someone that’s setting the company’s missions and goals, and those that are executing against the mission. And within here, you’ve got department heads that are responsible for their piece of the puzzle whether in product, marketing, or engineering.

We’ll get into the challenges with each and some potential strategies that may help you win over those department heads. But first, let’s talk about some of the common themes I’ve seen why leaders and organizations may be hesitant to buy into an experimentation culture. And we’ll break them down. I’ve spoken with dozens of senior execs, and I’ve come up with a few common themes. First, cost.

We hear this too often. It costs too much. We can’t afford it. We should be spending our resources against executing the mission that I’ve set. Okay.

Next, time. We simply don’t have the time. We also don’t have the staff to be able to do this. We’re running a tight ship. Okay.

Last, ego. Why bother? I know what we should do. So let’s break this down, starting with cost. Today’s tools are similarly priced to other SaaS tools that are considered to be mission-critical.

And depending on how it’s deployed, it can act as another mission-critical tool that protects the business. We’ll touch on this in a bit. Now, if there is absolutely zero budget, free tools exist to get you moving in the right direction; hopefully from there, you can use that momentum to upgrade your tools to more feature-rich testing software like VWO. Plus, the cost can certainly be covered by the potential impact of even the smallest of gains, which you get to lock in for the life of your product.

A seemingly small 2% gain is not a one-time deal. And the flip side is true as well. What if you could measure a change that caused your metrics to decline by 1%? Instead of it being rolled out and the changes going undetected and locked in for the life of the product, you’ve just prevented the full deployment, which in turn improves the company’s metrics ever so slightly. These scenarios, although seemingly small at times, become too large of an opportunity to pass up.
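
To make that concrete, here is a minimal Python sketch with assumed traffic, conversion-rate, and order-value figures (not numbers from the webinar) showing what a locked-in 2% lift, or an undetected 1% drop, is worth over a year:

```python
# Minimal sketch, assumed figures: yearly value of a "small" conversion change
# once it is locked in for the life of the product.

def annual_impact(monthly_visitors, conversion_rate, avg_order_value, relative_lift):
    """Extra (or lost) yearly revenue from a relative change in conversion rate."""
    baseline = monthly_visitors * conversion_rate * avg_order_value * 12
    changed = monthly_visitors * conversion_rate * (1 + relative_lift) * avg_order_value * 12
    return changed - baseline

visitors, cr, aov = 200_000, 0.03, 60.0  # assumed example business

print(f"+2% lift locked in:  {annual_impact(visitors, cr, aov, +0.02):+,.0f} per year")
print(f"-1% drop undetected: {annual_impact(visitors, cr, aov, -0.01):+,.0f} per year")
```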

Let’s move on to reason number 2, time. Today’s tools actually save time by allowing teams to conduct and execute tests without involving engineering teams. In fact, your partners in engineering can continue to work on the deliverables without getting distracted because these tools allow for some fairly sophisticated changes that can be made by non-engineering teams. So we’ve covered costs and time and now ego. We touched on this a bit ago, but let’s dive deeper.

So you’ve been told: why are we testing it when I know it’s going to work? Here’s where your CEO is coming from. They care deeply about the business. We can all appreciate that. You’re going to want to demonstrate that it is out of respect for the business, and out of an abundance of caution, that you recommend testing it first.

You want to protect the business. That’s a message most can resonate with. Next, business metrics can change due to market conditions, which could hide the true impact of such changes, creating a false positive or negative impact that would otherwise go undetected and get locked into your product metrics. Let’s take a look at a couple more scenarios. Here’s your CMO, who is very sure that changing the copy on our home page will lead to massive improvements in our business metrics.

We’ve all heard this before. Here’s an opportunity for you to use their initiative to help feed yours to gain common ground and support. When A/B testing isn’t ingrained into your company’s DNA, you’re going to want to use any opportunity to bring testing into the discussion. Many of us on this call know that A/B testing isn’t all about the win. But it’s also about risk mitigation, not only validating wins but also protecting the business from poor decisions.

Last, let’s move to another member of our C-suite, the CFO. Here’s another scenario you might come up against: the completely arbitrary sale that some of you may be running today. What you may not know is that they might be under a great deal of pressure from their board or their CEO to increase sales before quarter end, but at what cost? In this scenario, you might be thinking: are we sacrificing long-term revenue for a short-term gain? Can we help your CFO justify a new data-driven strategy to protect and grow your revenue streams?

Perhaps we can prove through A/B testing that discounting isn’t our best option, and there’s a more innovative way to unlock revenue growth. Perhaps through product improvements or copy changes. So here we are, some common reasons as to why you might get some pushback but there’s an overarching theme here. People just do not want to be wrong because if they’re wrong for some people, their feelings and ego will be shattered. And lastly, probably most importantly, your ideas can’t be wrong if you don’t test them.

Think about it. It’s a diabolically genius move. I know I get it. It’s frustrating. We’ve been there.

So now everyone’s all depressed because I’ve listed a bunch of problems you probably faced. But don’t worry. I’ve had some success working with teams in building their A/B testing strategy for wide adoption and approval. And one that sets them up and their organization for success. The first thing you’re going to want to do is assemble your team.

This isn’t a solo mission, and you’ll find greater success in numbers. Do you guys remember offices and human interaction? That was so 2019. So this picture doesn’t seem quite right. Ah, yes. Much more 2020.

So what are some key areas you should be thinking about when creating your team? First, your goal is to build a broad base of support across your organization. A groundswell, if you will. There are two main reasons here. One, the cross-functional group will enable you to have multiple points of influence up and down the org chart.

Two, you’ll need all the ideas you can get. Everyone is in different stages of their life, with different priorities. So naturally, they use different products. Let’s take a look at a quick example.

Here’s a look at some of today’s top-grossing apps in the store. These apps are in the dating, music, entertainment, and casual gaming categories, all with different retention and monetization methods that could be applied to your business when you’re generating a hypothesis. So be sure to tear down any walls between different functional areas and build alliances with your colleagues in marketing, sales, product, and engineering. This will only help your chances of getting the testing momentum that you need. Next, insights.

So now that you’ve got your team in place, you have to understand where your product is today in order to know where you’re going tomorrow. First things first: take a close look at your data and look for areas of improvement. Map it out. For example, are you seeing a drop-off within your registration process?
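
One lightweight way to map it out is sketched below; the funnel steps and counts are hypothetical, and the idea is simply to surface the biggest drop-off as a candidate area for hypotheses:

```python
# Minimal sketch, hypothetical steps and counts: find the biggest leak in a funnel.
funnel = [
    ("Landing page",       50_000),
    ("Registration start", 18_000),
    ("Registration done",   7_500),
    ("Checkout start",      2_400),
    ("Purchase",              900),
]

worst_step, worst_drop = None, 0.0
for (step, users), (next_step, next_users) in zip(funnel, funnel[1:]):
    drop = 1 - next_users / users
    print(f"{step:>18} -> {next_step:<18} drop-off: {drop:.0%}")
    if drop > worst_drop:
        worst_step, worst_drop = f"{step} -> {next_step}", drop

print(f"Biggest leak: {worst_step} ({worst_drop:.0%}), a good place to gather hypotheses")
```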

What steps might we take to reduce that? Start gathering insights and soliciting feedback within your organization. You’re looking to get buy-in and alignment on the problems that need solving. But where do you start? This brings us to step 3.

Start small. The overarching theme here is that you shouldn’t bite off more than you can chew. For a few reasons. Your goal is to build momentum as soon as possible. Going for the big wins may be tempting, but it’s only going to increase anxiety within the organization.

It may also require more approval reviews if you’re going after pages that are claimed by someone else, for instance. Last, you’ll want to work up to testing in the checkout flow because, naturally, you’ll have far fewer visitors at this point in your funnel, and it will just take longer to get to statistical significance. Even for high-traffic sites, it could take weeks, if not months, to reach a decision in a checkout flow. So for now, I recommend that you work on non-controversial items higher up in the funnel.
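
As a back-of-the-envelope illustration of why checkout tests take so long, here is a sketch using the standard two-proportion sample-size approximation; the baseline rate, lift, and traffic levels are assumptions, not figures from the webinar:

```python
# Minimal sketch, assumed inputs: how many visitors a test needs, and how long
# that takes at checkout-level traffic vs. top-of-funnel traffic.
from scipy.stats import norm

def sample_size_per_variant(p1, relative_lift, alpha=0.05, power=0.8):
    """Two-proportion z-test approximation for visitors needed per variant."""
    p2 = p1 * (1 + relative_lift)
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return ((z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))) / (p1 - p2) ** 2

n = sample_size_per_variant(p1=0.05, relative_lift=0.10)  # 5% baseline, +10% relative lift

for label, weekly_visitors in [("checkout step", 4_000), ("top of funnel", 60_000)]:
    weeks = 2 * n / weekly_visitors  # two variants split the traffic
    print(f"{label}: ~{n:,.0f} visitors per variant, roughly {weeks:.1f} weeks")
```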

That might mean you’re working on less glamorous areas, but we need to win, and we need to build momentum. So think about some areas that might indirectly impact your business, such as driving more visits to the Contact Us form, getting more users to start filling out a form, or pushing more users to start the checkout process by adding a call to action on a page unrelated to your normal flow. So now that you’ve decided on a test, let’s move on to the last and really the most critical step: show and tell, often. I can’t stress this enough.

You want to be as open and as collaborative as possible to get broad buy-in and broad support up and down the chain. I’ve seen way too many teams work in the dark on their A/B tests. So what steps could we take to avoid that? You wanna start from the beginning, soliciting ideas publicly, perhaps via Slack or some Google intake form. Once they’re prioritized, you’ll want to publish a list of scheduled tests and even a weekly newsletter to review everything as you go along. This is all about transparency. So let’s take a look at this in a little more detail. Here’s what a Trello board may look like, for instance. Here, we’ve got a very simple board showing different tests in different stages: hypothesis, planning, in queue, live, and done.
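
For teams that want the same transparency in code rather than a Trello board, a minimal sketch of the idea might look like this; the fields and the example card are hypothetical:

```python
# Minimal sketch, hypothetical fields: a card-per-test repository with the same
# stages as the board described above.
from dataclasses import dataclass

STAGES = ["hypothesis", "planning", "in queue", "live", "done"]

@dataclass
class TestCard:
    title: str
    hypothesis: str
    stage: str = "hypothesis"
    results: str = ""
    next_steps: str = ""

    def advance(self):
        """Move the card to the next stage on the board."""
        i = STAGES.index(self.stage)
        if i < len(STAGES) - 1:
            self.stage = STAGES[i + 1]

board = [TestCard("CTA copy on pricing page",
                  "A benefit-led CTA will increase clicks to the signup form.")]
board[0].advance()     # hypothesis -> planning
print(board[0].stage)  # "planning"
```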

You want as much data and visuals within these cards so anyone can look at the board and have a firm understanding of the intended goals, results, and next steps, if any. Moving on, your weekly show and tell should, no surprise, be weekly. Again, it’s important to build momentum in the early days, so keep it consistent. Now let’s go on to the meeting itself.

What’s important to communicate here? You’re going to want to cover the following topics. What tests are live? What results should we review? What tests are in the queue?

Should we reprioritize what’s in the queue? And last, the ideation phase: what new hypotheses should we add to the queue? Did we see anything out in the market that we might wanna try and that might apply to us as well? Okay. So what about the results?

Are we going to have good news to report at every meeting? Unfortunately not, and definitely not. But just because a test loses doesn’t mean it’s the end of the world. In fact, even the strongest testing teams see only about a 10% success rate.

So as the great master Yoda once said, "the greatest teacher, failure is." I would go so far as to say we should be celebrating failure because it’s a learning opportunity for us. So go ahead and highlight both successes and failures. It’s a learning opportunity in every scenario because you can use those learnings in developing new hypotheses. So please remember to celebrate failures as well. Most importantly, sweeping failures under the rug does not build trust in the process. Here’s the last bit of info about presenting your data.

Make it relatable. You might get excited about a 2% increase in conversion within a multi-step funnel, but zoom out and show the bigger picture of what it could be. And model out the potential impact for the various execs that might be able to influence and support the success of your initiative. Know your audience. And with that, I just wanted to say thank you so much to the VWO team for giving me the opportunity to share some of my thoughts today, and thanks so much to the people that I interviewed for the webinar.

And if anyone has any questions or thoughts, I’d be happy to take them now or later. Feel free to reach out. Feel free to scan that code and connect with me on LinkedIn.

 

Vipul:

Great, that was really an insightful presentation, Arash. It’s very neatly designed, first of all, and it looks beautiful and conveys the message really, really well. I really like all the practical steps that you’ve mentioned about, you know, assembling a team, gathering insights, and showing and telling. So I think those are really neat little steps. And thank you so much for sharing them.

 

Arash:

Sure. Absolutely.

 

Vipul:

Perfect. So this is now the Q&A time, and I just request everyone in the audience to ask away any questions that you might have regarding the culture of experimentation in your own organization, or if there’s any particular thing that you want to talk about or ask Arash regarding experimentation, feel free to ask. Meanwhile, I want to take a step back, since I’ve been hearing this a lot: I just wanted to understand from you, since you’ve been a C-level executive for quite a long time now, what is culture really? Right? And why is it so hard to build anyway?

 

Arash:

Yeah. I mean, culture is a very, very big thing. Right? And there are a number of ways you can kinda answer, and it really starts from, you know, the top: how that individual leads and how you operate, those are all certainly things that play a role in your culture itself. Your workflows are part of your culture. In some organizations, they are so design-focused that you can’t get away with trying to put out a test that isn’t a beautiful design.

They are so design-focused that they need to go through design cycles. So culture varies from organization to organization. It’s super hard to build the right one that works for your organization, but over time, the most successful organizations figure that out.

 

Vipul:

Right. That makes sense. Absolutely. I see there’s one question from Darren Russo. Darren, by the way, would you mind allowing us to switch on your mic so you can ask your question yourself, or should I just go ahead and ask the question on your behalf?

I think I’ll just go ahead and ask the question on Darren’s behalf. So Darren is asking: what cross-functional teams do you recommend having in your working group, and how many is too many?

 

Arash:

Yeah. So one of the things I would kinda recommend is having a small pod of teams. It’s a small group of teams to at least get things started. Right? And so you may want a representative from marketing.

You may want a representative from engineering. Because you’ll be able to bounce off these ideas very quickly in real-time. How feasible is this? Is this test too much for us to take on? Can we do it?

Do we have the data to support the test? So I would think about, you know, a representative from, you know, some major organizations. So probably about 3 to 4 people, as kind of like the brain trust to at least get things started, and you could theoretically replicate that based on different products that you have within your organization.

 

Vipul:

Got it. I hope that answers your question, Darren. I see Akash also has a question. I think he wants me to switch on his mic. So let me do that.

 

Akash from the audience:

Yeah. Can you hear me?

 

Vipul:

Yes. We can hear you, Akash. Can you hear me?

 

Arash:

Yeah. Okay.

 

Akash:

So my question is, like, how do you proceed with the hypothesis? Like, what are the key elements in the research? Say, if I see a problem I think I need to fix, and I need to establish a hypothesis, what key elements should I put in my research before I decide what hypothesis I should have?

 

Arash:

Yeah. So thank you for your question. So the question is basically about, okay, we’ve got a whole bunch of hypotheses, you know, what data do we need? And also, you know, which one should we kind of start with? So I didn’t touch on this.

It’s prioritization, basically, the next step after gathering your hypotheses, because what you’re going to want to do is basically rank these different ideas. There are many frameworks for this; I’ll just use the ICE methodology for now: Impact, Confidence, and Ease. So impact is how impactful do you think this test is going to be, and confidence, how confident are you in its ability to kinda move the needle of the business? And, you know, the E is ease: how easy is it to kind of get this test going?

So you’ll want to rank them based on that. And then you’ll be able to have a ranked order of things that you should attack. I hope that answers your question.
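
A minimal sketch of ICE prioritization follows, with hypothetical hypotheses and scores; the scoring scale, and whether a team multiplies or averages the three factors, varies from team to team:

```python
# Minimal sketch, hypothetical ideas and 1-10 scores: rank hypotheses by ICE.
hypotheses = [
    {"idea": "Shorten registration to one step", "impact": 8, "confidence": 6, "ease": 4},
    {"idea": "Add a CTA to blog posts",          "impact": 5, "confidence": 7, "ease": 9},
    {"idea": "Rewrite homepage hero copy",       "impact": 6, "confidence": 5, "ease": 8},
]

for h in hypotheses:
    h["ice"] = h["impact"] * h["confidence"] * h["ease"]  # some teams average instead

for rank, h in enumerate(sorted(hypotheses, key=lambda h: h["ice"], reverse=True), 1):
    print(f"{rank}. {h['idea']} (ICE = {h['ice']})")
```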

 

Vipul:

Sure. Thanks for answering that question, Arash. I see Niranjan has also raised his hand, but I’m waiting for him to revert to my message. Meanwhile, I had another question regarding, something that I have been listening to for quite a while now, and this pertains to gathering confidence in the data. Right?

So what do you do if you’re not confident in the data that you are gathering from all the tests that you’re running or maybe all the insights that you’re gathering?

 

Arash:

Yeah. So I certainly came across this in multiple teams. The first step is to pause everything you’re doing and get that data correct, because no matter what A/B test result you put out, it will be questioned. It will potentially be wrong, and you don’t wanna take action on data that could be incorrect. So you have to ensure that the integrity of the data is valid, and make sure that whatever you are testing, you track those data elements in order to reach the correct conclusion.

So, certainly, you’ve gotta get your data down, first, in order to kind of move on to A/B testing.
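
One possible shape for those data checks is sketched below; the event counts, the second analytics source, and the 5% tolerance are all assumptions for illustration:

```python
# Minimal sketch, hypothetical counts: sanity-check tracking data before trusting
# A/B test results, by comparing two sources and checking funnel consistency.
tool_events      = {"page_view": 102_300, "add_to_cart": 8_900, "purchase": 1_210}
analytics_events = {"page_view":  99_800, "add_to_cart": 9_050, "purchase": 1_198}

MAX_DISCREPANCY = 0.05  # assumed tolerance: 5% difference between sources

for event, tool_count in tool_events.items():
    other = analytics_events[event]
    diff = abs(tool_count - other) / max(tool_count, other)
    status = "OK" if diff <= MAX_DISCREPANCY else "INVESTIGATE BEFORE TESTING"
    print(f"{event:<12} tool={tool_count:>7,} analytics={other:>7,} diff={diff:.1%} {status}")

# A funnel should never gain users downstream:
assert tool_events["purchase"] <= tool_events["add_to_cart"] <= tool_events["page_view"]
```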

 

Vipul:

Got it. So, yes, Niranjan has sent in his question. Let me just unmute, Nathan. Hi, Nathan. I have unmuted you. You should be able to ask your question now.

 

Nathan from the audience:

Hey. Hi.

 

Vipul:

Hi, Nathan. Please ask your question.

 

Nathan:

Thank you. Thanks for giving me this opportunity. Actually, we have built an online experimentation platform, and I can see that product managers and a few of the folks who build features on the app or on the website often find our platform useful and use it at large. But I want to know how I can onboard some of the people on the operations side to conduct their experiments on our platform, instead of conducting offline experiments, and bring those offline experiments onto the online platform.

 

Arash:

I believe the question was how do we conduct offline experiments? Was that it?

 

Nathan:

Yeah. People, especially on the operations side, actually find it relatively easy to conduct pre-post experimentation offline, but how can we enable them to conduct their experimentation in an online fashion?

 

Arash:

That may be more of a question for you guys.

 

Vipul:

Not a problem. Of course, you can see Arash’s LinkedIn code here in front of you. Right? So feel free to connect with Arash after this webinar, and you can then, you know, get on a call, talk, and ask your questions in a more detailed manner. Right?

Yeah. Yeah. So I’ll be unmuting you again. So I have another question from, Marcela. She actually has 2 questions.

I haven’t read them. So, let me check that first. Sure. So Marcela is asking oftentimes teams are interdisciplinary and formed with different roles. But do you think that the members of these teams should necessarily share a common technical knowledge about this?

And how can one go about repeating the correct way of applying a test? I’m not sure what the second question means, but the first question implies, you know: is it really necessary for all the members of the team to have an understanding or technical knowledge of the test that they’re about to run?

 

Arash:

Awesome question. Absolutely not. It is not required. Right? And again, it kinda goes back to what I mentioned earlier about having a diverse team from multiple groups.

You want the techies. You want the non-techies, because with your different inputs, you’re coming at the same problem; you’re gonna try to solve the same problem in multiple directions. And so someone who’s not technical may try to come up with, and I’m sorry to say this, a more human method of solving that problem. And that might be through copy changes.

It might be through a design element, whereas someone on the engineering side, the very simplistic answer here may want to see if, you know, they could speed up the transitions between the stages to kind of push people through your registration process, for instance. So to answer your question, it is not required to have a technical background when you’re forming this kinda cross-functional group.

 

Vipul:

Got it. I actually had one question which is very much related to what Marcela has asked, regarding diversity. Right? So, especially for big organizations that are set up globally, what would you suggest: should the experimentation team be spread across the world, or should there be a separate team for each region?

 

Arash:

Yeah. So I think it certainly depends on the organization. What I have seen are pods, basically, that attack it on a cross-functional basis, as well as teams that just attack mobile or just the web. Right?

So the reason why is that you’re going to have domain expertise, potentially, within that specific group that might be, you know, really important and very insightful in terms of developing those tests, or that group may have the historical knowledge in terms of, okay, why is it like that? And this kinda goes back to, you know, the Trello board that I kinda mentioned, right: you gotta put things in a repository and put as much data as you can in there. That’ll help you refine your hypotheses moving forward, to develop better and potentially more impactful ones.

 

Vipul:

Got it. Yeah. Absolutely. Makes sense.

Thanks, Arash. I think, yeah, it’s time to close this session. And once again, thank you so much for sharing your, knowledge with our audience. I’m sure the audience really loved it as much as I did. Thank you so much, Arash.

 

Arash:

Alright. Well, thank you, everybody, and stay safe and stay well. Thanks, everybody.

 

Vipul:

Absolutely. So, guys, yeah, before I close this session, just wanted to remind you that we’ll be having a survey just after this session closes. So please fill in that survey so that we know how you felt about this session. And, definitely, I’ll pass on this feedback to Arash as well. And, of course, you can connect with Arash to share your feedback or ask away any questions.

With that, of course, stay safe, wash your hands regularly, and have a great day, everyone. Goodbye.
