
Building Engaging Onsite Experience At Avast

In this interview, get a peek into how experiments are run at Avast and what their experimentation culture looks like.

Summary

In this insightful discussion, Chase delves into the company's strategies for converting users from free to paid services. Avast's approach hinges on balancing the quality of their free offerings with the allure of premium features, aiming to convert users without overwhelming them with sales tactics. A significant portion of their revenue stems from user retention and license renewals, highlighting the importance of customer satisfaction and long-term engagement.

Chase emphasizes the complexity of Avast's revenue model, which includes partnerships like offering Google Chrome installations. The company acknowledges that a segment of their user base may never convert to paying customers, yet values them as evangelists and monetizes through alternative methods.

Experimentation plays a crucial role in Avast's strategy. The company started experimenting with conversion tactics during its CCleaner era, focusing on website performance to boost sales. They utilize a mix of Google Analytics, custom data collection, and Microsoft Power BI for in-depth analysis, guiding their decision-making process.

The team prioritizes experiments based on an impact versus effort analysis, considering resource constraints. Ideas for experiments arise organically within the team, with customer support insights being particularly valuable. Avast also fosters a culture of knowledge sharing across different business units, enhancing efficiency and innovation.

Key Takeaways

  • Avast primarily focuses on converting free users to paying customers. This involves balancing the quality of the free product with the incentives to upgrade, without overwhelming users with sales messaging.
  • Recognizing that a significant portion of the user base may never convert, Avast still values these users as evangelists and monetizes them through other means, like bundling third-party applications.
  • Avast's decision to run experiments stems from a need to improve website performance and conversion rates. They use a combination of tools like Google Analytics, Microsoft Power BI, and custom data collection methods to analyze user behavior and inform their strategies.
  • Avast runs a limited number of carefully selected experiments to avoid overwhelming their development pipeline and to ensure that each test receives the necessary attention and resources.

Transcript

Aniruddh: Hello and welcome to ConvEx, where we are celebrating experiment-driven marketing. This exclusive online event is brought to you by VWO, where we help you fix your conversion leaks and help you make the most out of your traffic. I am Aniruddh and I manage Paid and Organic Growth at VWO. Today, with us we have Chase Richards who is the Head of Monetization at Avast. I am excited to have you with us, Chase!

Chase: Thanks very much. Good to be here.

Aniruddh: Same here. I'm sure our visitors will have a lot to learn from this discussion and from your experience running experiments at Avast. Without further ado, let's dive right into the questions. So, first of all, Chase, I would love to know a little bit about your role at Avast – what do you do, and what does your day look like?

Chase: Okay, sure! So I look after consumer sales and renewals for the CCleaner family of products. CCleaner was acquired by Avast about two and a half years ago, I think.

01:05

So it's mostly CCleaner for PC, but there are a few bits and pieces like a mobile version or a Mac version. That means it's my job to make sure that all of our sales and retention channels are constantly improving – not just on the website but in the products, and also in any messaging and communication channels that we have. I have quite a varied and broad role here because I'm one of the people who's been here for many years and learned how lots and lots of different parts of the business work. But I was originally brought on to improve the website's performance, and it quickly became clear that the conversion journey was much more than just the website. It was all about the products and all the different touchpoints, the most important of which is definitely the product itself. So the role has kind of grown to encompass everything that we do with all of our customer touchpoints. I also act as a kind of product owner or Scrum Master for our website development team. So I create user stories, prioritize the backlog, and kind of guide improvements or new features on the website from design through to implementation. And then I also have a few other hats that I have to wear. So, as the de-facto leader of digital marketing here, I plan the implementation of campaigns across all of our channels and also provide sales forecasting and reporting up to the C-levels.

02:23

Aniruddh: Awesome, that sounds like quite a handful. So I’m sure you have a big team to manage all of that. How is your team structured and how do you take care of so many different activities and tasks altogether?

02:36

Chase: Well that’s quite funny! Right now we do have quite a large marketing team. We’ve grown to the size of seven people which is much bigger than we’ve ever been. For the last three years it’s mostly been me, one other person who handled the PR and [communication] side of marketing, and then we had an analyst, and then we’ve had various designers and developers along the edges but that was the sort of core marketing team. And in fact, when Avast acquired us and they came into the office, they knew how many people we were but they were like well where’s everybody else, where’s the person who handles your social, and where’s the person who handles your SEO? And I was like?!


So we've learned to wear lots of different hats but, finally, we've been able to expand our department a bit more and get some people with more specialist experience in various areas, so we can be more productive in core areas like SEO, where previously we had to just quickly put on the SEO hat and figure out a problem.

Aniruddh: That sounds like quite a lean team. So what has been your biggest channel for getting new traffic, and where do you see most of your new business coming from?

03:42

Chase: Well actually most of the new business comes via the app. I’ll talk about it a little bit more but the model is quite simple – we just have a good free product that works well and does one thing well, and then it spreads by word of mouth. And for the last 10-15 years or so most of our top of the funnel acquisition was via word of mouth. There was almost nothing spent on acquisition, we’ve still barely ever spent any money on PPC, it just walks in through the front door. Obviously with a very mature product that will tend to plateau and slow down over time so you have to kind of keep putting energy and effort into it, but it’s all very much organic growth, word of mouth, and referrals from major download portals and other websites, and things like that.

04:27

Aniruddh: Absolutely! And like you mentioned, I remember my first experience with Avast was because a friend of mine suggested I try it. So that really hits close to home, because that is something I've seen quite often with really good products – customers become your evangelists, and so on and so forth. And that becomes the biggest driver of growth. It's great to see that happening, because Avast is one of those companies that produces really great products, and it is great to be speaking to somebody who manages growth there. Speaking of that, like you mentioned, a lot of the new business comes from word of mouth – people try out the product and then decide to buy it somewhere down the line in their journey. So what does conversion really mean at Avast when you are trying to optimize or run new experiments for your visitors?

05:16

Chase: So a conversion can mean one of two things, but we are usually talking about conversion from a free user to a paying user. And the model looks quite similar across most of our products, as I mentioned. You give them a free product and then they eventually come and buy something else – which is one of the reasons that they wanted to acquire the business. So, in general, we want to turn as many free users as possible into customers who see the value of our added premium benefits and then go on to stay with us for many years. Aside from the headline revenue number, the actual number of active customers we have is the most important metric for us, because that forms the pipeline for renewal revenue next year.

So we're always trying to build on that – trying to get as many people not just making that first step to give us their money, but making sure that they stay happy, stay engaged, keep using it, so that when we say next year, "Hey, we need a bit more money from you," they go, "Yeah, sure, not a problem, here you go. Happy to do it." But in order to turn free users into those kinds of customers, you first have to get the free users. So the first step on the conversion journey is to just download and use the free software. And that gives us a challenge, because if you make your free product really good and really easy to get, you're not going to sell a lot of upgrades. But if you make it too limited or annoying, like a 10-day free trial where you can't click most of the buttons, you won't really get any customer base – you'll just totally erode it.

Likewise, if you hit them with sales campaigns and messaging too much you eventually turn people off. So we’ve got to strike this really nice balance between converting new potential users at the top of the funnel into free users and then guiding those users who might be interested in a more full-featured product towards the final conversion step.

There's also a big segment of our user base who will never convert; we have to accept that. We exist to serve those users more than anyone else, really, because as you said they're our evangelists. But we do also monetize those users in different ways – we've never really made any bones about this – we offer Google Chrome as a third-party application through our installer: if you don't have Google Chrome we might offer it to you, and we get paid a little bit of money that way. So the revenue model is actually quite complex. It's not just, 'Okay, how can we sell to this many users?' It's a question of, right […] where Google pays us X cents per Chrome install, does it make sense to do the hard sell on these guys? Should we maybe just say, here's the free product, don't worry about buying it, just install it? We might make more money that way. So a lot of our thinking about what a conversion is boils down to what's the best outcome we can get per user in a certain region or segment.

08:03

Aniruddh: Right. And like I mentioned there are two sets of users: one is people who have already installed Avast and they’re using the product, and then there’s a second set of users who are totally new users who don’t have it installed. So where do you see more growth coming from? Do you see more growth coming from net new business, or do you see more growth coming from repeat purchases or selling to people who have already been your users or visitors?

Chase: Actually, a lot of our business increasingly is coming from customers staying with us and renewing their licenses. We didn't really have much of that before. But we've put a lot of changes in place, and retention is becoming a bigger part of our business – which tends to happen as a business matures: if you've got a subscription base that keeps coming back to you and a mature product, more and more of your revenue is going to come from retention and renewals. But we are doing a lot of acquisition at the moment, and we have been doing very well. The top of the funnel has been slowing down, though, and that's kind of a necessary impact of the nature of the PC market. We sell software for desktop PCs, and everybody knows that that segment is just declining.


Aniruddh: I think that's one of the beasts that all growing businesses face. At one point they reach a stagnancy where it's difficult to get net new growth because of competition and market dynamics, and the majority of the revenue comes in through repeat users or repeat customers – and that is where a lot of the marketing effort actually goes. With that, I would love to touch upon when and why Avast felt the need to actually run experiments, and where most of the experiments are actually run. Is it mostly on new users, or mostly on repeat purchases? And how do you decide which experiments to run?

Chase: The decision to run experiments predates Avast, actually, when CCleaner was still a little private company. The co-founders built the company up to around 20 employees, who were serving the needs of about 80 million free users – and probably about 50 or 60 thousand customers. We didn't really know how many it was back then, because we just didn't have the data. A lot of things weren't built with data accuracy or tracking in mind; the stuff was built by developers, not marketers or analysts. So, then, they realized at some point that the website was really not performing well and it wasn't driving sales, and the product itself wasn't driving sales – but you couldn't fiddle with that, because it's a big, bulky desktop application. So, let's look at the website. And the conversion rates were frozen – the rate of paying users was sitting at around 0.1%, which is very low even with such big volumes.

So the install rates even by then seemed to be plateauing, and there was very little growth – there wasn't growth in the top of the funnel, the website performance had plateaued, and they just weren't reaching all of the customers they thought they could potentially convert. So they brought me and the other general marketer on board, and we had a dual mandate: do what we can to convert more free users, and don't hassle the free users, because as I said they are our bread and butter – we monetize them in lots of ways, they're our evangelists. And we've been trying to do that for the last three years, really, and we've made a lot of progress, but we're still struggling with that really basic paradox where if we try to sell too hard, we're going to cut off the top of the funnel.

Aniruddh: Wow, and did you really say 80 million visitors or users?

Chase: Yeah 80 million active users we estimate at the time. It’s more like a hundred and three million now.

11:50

Aniruddh: Wow! So, let's stop and let me grab that number for a second, because I think a lot of businesses would love to be in that position – having that kind of growth without any active marketing and with so few employees is very rare to see in today's world. And that just goes to show how great a product they set out to build.


Chase: Yes! 

Aniruddh: So that’s definitely there. Yeah.

12:17

Chase: We love to brag that CCleaner has been downloaded 2.5 billion times, and that's not an exaggeration. It is a little bit of an estimate, because it's based on our best guesses around various data points, but we think that's about it. And for the Google Chrome distribution, we have estimated that around 8 percent of all Google Chrome installs worldwide are from us. So yeah, we've got big volume, big numbers. But it's frustrating not to get them to cross that line and convert.

Aniruddh: Absolutely, and with that kind of number it's very hard to show net new growth, but even 0.01% of growth really means some really good conversions are happening – it can actually increase your topline by quite a bit. If you look at the revenue numbers, with that kind of user base even 0.1% or 0.01% is significant. So those are huge numbers.

Chase: And that's why segmenting and targeting and personalization are becoming much more important, because we can fiddle with things and move levers around, and the net effect of everything we do is usually zero.

13:24

But if we isolate a specific group of customers or users that might respond better to a different treatment, that can suddenly make a difference.

13:34

Aniruddh: So, with all these users coming in, what are the different kinds of qualitative and quantitative data that you look at, or the analyses you do, to understand these various user behaviors and motivations? I'm sure it's a lot of analysis to do.

Chase: Yeah, actually, when I first came on here they asked me all sorts of hard data questions, and I fancied myself fairly good with GA and a couple of pivot tables, but I very quickly said to them, you guys need a data analyst and I know just the person. And I brought along one of my colleagues from a previous workplace, and between the two of us (or mostly her) we built up the beginnings of a big data architecture. And then that was expanded more and more with access to Avast's data systems and their data cubes. And now what we do is use a combination of Google Analytics and a lot of custom data collection and processing that feeds into a cube, […] and we analyze this mostly using Microsoft Power BI, but also various Excel sheets that we've put together. It collects data from Google Analytics about site traffic, but it also pulls together all of our eCommerce data, the platform data from our license server, and the product itself. And that's why it was a big project – about three years in the making, and it's still ongoing.

But what that's given us now is that we've got all of our different funnels plotted out in GA or in Power BI, so I can look at those on a daily basis and see if something's dropped or something's gone up, and that's usually when I'll figure out, okay, something's broken, or something's working better than expected, or some traffic is going here when we thought it was going there. So I'll spend a lot of my time chasing those kinds of things down, and one thing we found is that fixing broken things often leads to better ROI than trying to come up with new ideas or getting new stuff developed.

15:38

So another thing is, because our users come to our site once a month to get the latest version, one of the interesting things that we look at a lot is how many times people come to our site before they end up purchasing. We've actually got a custom cookie set that counts every time someone comes to our website, and we've noticed that on your first visit, on your second visit, on your third visit, you're either just gonna download the free version or kind of give up or not get it for whatever reason. But once it gets to the fifth or sixth or seventh time you come, people start becoming much more likely to purchase.
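The visit-counting cookie described above can be sketched in a few lines. This is a hypothetical illustration, not Avast's actual implementation: the cookie name `visit_count`, the threshold of five visits, and the helper names are all assumptions drawn from the description of buyers typically converting around the fifth to seventh visit.

```python
PURCHASE_LIKELY_AT = 5  # assumed threshold, from "fifth or sixth or seventh time"

def parse_visit_count(cookie_header: str) -> int:
    """Read the visit counter out of a raw Cookie header, defaulting to 0."""
    for part in cookie_header.split(";"):
        name, _, value = part.strip().partition("=")
        if name == "visit_count" and value.isdigit():
            return int(value)
    return 0

def handle_visit(cookie_header: str) -> tuple[str, bool]:
    """Increment the counter and decide whether to emphasize upgrade messaging.

    Returns the Set-Cookie value to send back and a flag for the upgrade prompt.
    """
    visits = parse_visit_count(cookie_header) + 1
    show_upgrade = visits >= PURCHASE_LIKELY_AT
    return f"visit_count={visits}; Max-Age=31536000; Path=/", show_upgrade

# First-time visitor: just offer the free download, no hard sell.
cookie, show = handle_visit("")
# Returning visitor crossing the threshold: emphasize the premium upsell.
cookie5, show5 = handle_visit("visit_count=4; other=x")
```

The design point is the same balance discussed in the interview: the hard sell is deferred until a visitor's behavior suggests they are already likely to buy.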

Aside from all that hard data analysis, we also make a lot of use of surveys at certain strategic points, to find out why users choose to do something or not do something. And we found this data to be really valuable. We had a lot of assumptions about who our users were, about what motivated them, about how long it took them to purchase. The received tribal wisdom from the owners who set up the company was that this is a techie product – the guy who built it was a developer, and he gave it to his developer friends to kind of clear out their cache, which made it easier to do development projects. And when we started doing surveys, we realized that we couldn't have been further from the truth. Yes, we did have a core of techie users, but most of our users were moms and pops at home whose son had installed this thing on the computer because they complained it was running slow, and then every time they said 'slow' he just goes and runs it for them and it speeds it up a bit.

And our user base was made up of vastly different kinds of people than we'd ever thought. In fact, the people who are buying from us are not the techie people, because they use CCleaner for a very specific job that it does very well, whereas the non-techie people have sort of broader uses for the thing and want it to do things more automatically for them. So that's who we're going to go after. Beyond this, we've kind of used heatmaps and clickmaps a bit in the past.

But the thing that we found is that when your website is basically a big free download button, that free download button just gets lit up like a Christmas tree and everything else is a few dots in it. So it doesn’t really tell us a lot except that people like free things. 

Aniruddh: So, I hundred percent echo your thoughts, and the point which you mentioned about surveys – I think that's bang on! A lot of people, especially within marketing, have preconceived notions that our users are from a certain geography, or a certain customer type – whether they are enterprise, mid-market, or SMBs – and whether they're buying from us for a certain kind of feature or something of the sort, right? So that's always something that's very good to confirm with hard data, and that's where surveys are actually really, really helpful.

And that's one of the things we also found out about the users buying VWO. A lot of times we would go with a preconceived notion that most of them are product managers or just marketers, but with our survey we actually found out that a lot of these people were user experience designers, which was a new revelation for us and actually helped us mold our internal marketing efforts to a great extent.

I also went through the Avast website, and I found that it is very minimal – it's very focused on just one conversion action, which is to download it or buy it. So it's very straightforward. There's no BS around why we are great or why we are not – it's just simple and straightforward. That's something I really loved about the website, and I'm sure it really helps boost the conversion, because people are coming for just one single reason – to either download it or to upgrade it – and that's what they're going to get. And especially with the non-techie people, it becomes really impactful that you take them directly to the homepage with just one single action and help them complete that action as soon as possible. So that's definitely something that we also learned. Great to hear your insights.

19:27

I would also love to touch upon your point about the data analyst. I think that's one of the most underrated skills in marketing today, because with so much data being generated by the hundreds of tools that we marketers use, a data analyst is one of the most important team members we need in modern marketing teams.

Chase: Yeah, absolutely. Well, we've always been big on data, and Avast is obviously much bigger on data. Just like many of the other large tech companies of today, data is their strength, and Avast wouldn't be able to be the quickest at fighting various virus threats and things like that if it didn't have millions and millions of data points coming in in real time. That data is used so extensively – without giving away any trade secrets, there is literally a room full of people watching viruses happening in real time, going, 'There's one. Get it!' So that's really important, and data collection is really, really vital to this whole company.

Aniruddh: Absolutely! So, like we spoke about, all these users are coming onto the website, you figure out their motivations, you ask them certain questions, you figure out where they're spending more time on the website and what the heatmaps are showing – all these things. So how do you then decide what your CRO priority would be? Let's say you are planning two weeks or a month ahead – you have a certain goal, a certain conversion rate that you're tracking, and you want to improve on it. How do you figure out your CRO priorities in that sense? How do you decide, let's try the headline first, or we should try changing this image first, or this CTA text first?

21:16

Chase: Okay, so we tend to look at everything through the lens of an impact-versus-effort analysis. We're obviously a small team, and there are lots and lots of things that we could always do. You've got guys from corporate headquarters saying, hey, you should move this button around, and you've got someone else saying, when can we get this up on the website, I want to redesign the blog. And there are also lots of quick wins that you can chase after all the time to get an extra quarter of a percent here or there. But it's very easy to get stuck in this thing where you're trying to go after everything all the time. So we also look at it through the lens of essentialism and ask, 'Well, is this the most important thing we could be working on?' And every now and again we have to periodically review what's on the backlog – and the backlog grows and the backlog grows, and we've got a design backlog, and a marketing backlog, and a web dev backlog.

And we actually did this exercise just recently, last week, where we went through a web development backlog of about 120 items, some of which had been sitting there for over two years. And we just went, 'Has anyone asked about these things? No?' and we deleted the whole backlog, and everyone went, 'Well, you can't do that, you can't delete the backlog.' And I said – well, what's gonna happen? And they said – well, these things aren't going to get done. And I said, great! You don't have to do everything. And because we have all this data, we can look at things and very quickly go – yeah, that's lots of traffic but it's not really that much money, so it goes over there on the board. And then you say, well, that's not a lot of traffic but these people spent a lot of money, so it goes over there on the board. And you can very quickly say, all right, that's going to take us three months but it's worth loads of money; that's going to take us three months but it's worth nothing; this we can do this week and it's worth more than all of them, so we're going to do that thing.

23:08

Yeah, that's pretty much how we do it. We always try to weigh every single thing against the questions of how much is this worth, or how valuable is this to our users compared to the other things that we're doing – and then, is this the most important thing we could be working on right now?
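The impact-versus-effort triage Chase describes can be sketched as a simple scoring pass over a backlog: estimate each item's value and effort, then rank by value per unit of effort. The items and numbers below are entirely made up for illustration; real triage, as Chase notes, also involves judgment calls that no score captures.

```python
from dataclasses import dataclass

@dataclass
class BacklogItem:
    name: str
    impact: float  # estimated revenue value (arbitrary units)
    effort: float  # estimated effort, e.g. person-weeks

    @property
    def score(self) -> float:
        # Value delivered per unit of effort spent.
        return self.impact / self.effort

def prioritize(backlog: list[BacklogItem]) -> list[BacklogItem]:
    """Rank items so 'the most important thing we could be working on' comes first."""
    return sorted(backlog, key=lambda item: item.score, reverse=True)

backlog = [
    BacklogItem("Redesign the blog", impact=1.0, effort=12.0),
    BacklogItem("Fix broken checkout link", impact=8.0, effort=0.5),
    BacklogItem("New pricing page test", impact=6.0, effort=3.0),
]

ranked = prioritize(backlog)
# The cheap fix that is "worth more than all of them" rises to the top,
# echoing the observation that fixing broken things often beats new ideas.
```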

Aniruddh: So when you are trying to figure out whether something is going to be important to the end conversion goal or not, what are the various things that you keep in mind? Is it the effort? Because before you run an experiment, it's very difficult to know what its end outcome will be. So how do you prioritize – okay, let me do this thing first because I think it's important?

Chase: Well, I kind of have a hierarchy of how frustrating or difficult something is going to be to implement. So there are some tests that we as a marketing department, with some basic HTML knowledge, can deploy with VWO without breaking anything – those are the easiest. Then there are ones where I'm gonna need one of the developers to write something for me and deploy it before I can test it – that's more time. Then, if I need a designer to first do me some graphics or some assets – that's even more time. And if I need to do anything to do with our products or our licensing system, that's in the future, because we've got one guy that deals with the licensing system, and the product team is always working really hard – they've got a three-month roadmap, you know. So, if you want something in there, something else has to go out. So it's usually that question of which teams are going to be involved – can we do this now by hacking something together in HTML, or do we need to get web devs, designers, or back-end infrastructure involved – and it kind of goes down the chain like that.

24:51

So we try to identify those things that do need that more intensive or architectural work as quickly as we can, because they can really slow things down – if you're waiting for something that needs a change to the license server, for example, you could be waiting for months.
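The implementation-cost hierarchy above lends itself to a simple ordering: each test is tagged with the heaviest dependency it needs, and the cheapest-to-deploy tests run first while the expensive ones are flagged early. The tier names and example tests below are illustrative assumptions, not a description of Avast's actual tooling.

```python
from enum import IntEnum

class Dependency(IntEnum):
    MARKETING_ONLY = 1        # basic HTML, deployable directly via VWO
    WEB_DEVELOPER = 2         # needs a developer to write something first
    DESIGNER = 3              # needs graphics or assets produced
    PRODUCT_OR_LICENSING = 4  # needs a roadmap slot or a license-server change

def schedule(tests: list[tuple[str, Dependency]]) -> list[str]:
    """Order tests cheapest-to-deploy first, so slow dependencies surface early."""
    return [name for name, dep in sorted(tests, key=lambda t: t[1])]

queue = schedule([
    ("New license bundle", Dependency.PRODUCT_OR_LICENSING),
    ("Headline copy test", Dependency.MARKETING_ONLY),
    ("Hero illustration swap", Dependency.DESIGNER),
])
```

Using an `IntEnum` makes the tiers directly comparable, so the sort expresses "it goes down the chain like that" with no extra bookkeeping.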

Aniruddh: Absolutely. With all these experiments that you mentioned, or the ideas that come up – do you work together to generate ideas, or can anybody in the team give ideas to anyone? How do you collect all these ideas together?

Chase: So, yeah, idea generation is a very organic process around here. We don't have any formal templates or anything, but because we all sit together in one office, use Slack a lot, and constantly review all of the changes that we're making and all of the new features that we're building, these ideas kind of come up organically. We try to follow an agile principle of presenting things that you've got on the go, or that you're building, as often as possible. Especially when we've got big new product features or changes to the website or marketing campaigns, we try to make sure that everybody knows about the stuff all the time, and that's mostly just for keeping communication good in the office and making sure projects don't surprise people. But it has a side effect where everybody is aware of the state of the application, the email channel, the website, the in-product messaging, and all of that, all the time. And so you start getting ideas coming up from QA testers and from the guys on the business sales team, and they say, 'Hey, on your consumer page you could be doing this.' These ideas sort of come up all over the place, and we usually note them down in our own spreadsheets or notes, or we just keep them in a Slack channel together, and then go over them and prioritize them fairly regularly.

So yeah, I think ideas tend to come from everybody knowing the product really well, knowing the website really well, and also caring a lot about it and seeing stuff that they want to improve – and from looking at the feedback that our users give us, which we always share. We encourage everyone to read through survey responses and things like that, because every once in a while someone will pick something out and go – I wonder if other people are having this problem. So yeah, I have to say that ideas sort of come from everyone. And also, a lot of the time we inherit ideas or learnings from one or another of the other teams – they might have a similar product or have had a similar problem before. So we try to borrow their learnings and their insights as well, when they've done something that we haven't done yet.

Aniruddh: Absolutely, and I would also mention one of the channels which we have seen work really well in helping us generate ideas: the customer support channel, and talking to the customer support guys. Because a lot of times these users will come up to them and say, 'Hey, we were trying to look for XYZ and weren't able to find it,' or raise an issue, or things like that. So that really helps us figure out what users are actually seeking when they come onto the website. And that's why a lot of times we run these surveys on the website, in order to ask them: did you find what you were looking for? How quickly did you find it?

28:05

How would you rate our website experience? All these things really help put in perspective whether we are able to serve users effectively or not. So that helps us a lot.

Chase: Yeah, and we actually have a tradition here that when you start, it doesn’t matter what level you start at – in fact, we just had a new general manager who had to do it as well – you spend the first three or four days doing a couple of hours of customer support a day, answering tickets in Zendesk and getting to know what frustrations and problems our customers usually have. And it is amazing for getting new people on board to understand how the product works, because you must know that in order to solve people’s problems – and to understand what frustrations people have and what challenges we’re dealing with from the customers’ point of view, not what we say our problems or technical frustrations are. What are the customers dealing with?

Aniruddh: I a hundred percent echo your thoughts, because that’s something that even VWO started off doing really well. A lot of people really swear by the support that VWO has, and that’s one of the reasons we got here: a lot of the people who joined early on, especially the first hundred employees, used to go through all the customer support tickets – in fact, answer a lot of the tickets themselves – in order to learn what the customer frustrations were, why they were buying VWO, and what the main problems with VWO were. All these things really helped put things in context of understanding the end user, which should be the end goal of all businesses if you want to serve them well.

With all these new ideas that you have, do you use any other tools to prioritize them? You mentioned you use spreadsheets and Excel – do you use anything else to record these observations? Because an observation can come from anybody; somebody might see something broken somewhere and send you a screenshot on Slack. So how do you keep track of all these things and get your ideas organized?

29:59

Chase: So we literally have a big Google spreadsheet that I set up three years ago, and it’s still going. Any ideas get dumped in there, and from there they can turn from ideas into tests in progress into implementation. We also record when we implemented them and any noticeable uplift that we got. That spreadsheet is now hundreds of lines long, and it’s quite bulky, but it contains loads and loads of knowledge.
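The prioritization Chase describes – ideas dumped into a backlog, then ranked fairly regularly – is often done with a simple impact-versus-effort score. Here is a minimal sketch of that idea in Python; the idea names, the 1–5 scales, and the ratio formula are illustrative assumptions, not Avast’s actual spreadsheet.

```python
from dataclasses import dataclass

@dataclass
class Idea:
    name: str
    impact: int  # estimated uplift, 1 (low) to 5 (high) -- hypothetical scale
    effort: int  # estimated cost, 1 (cheap) to 5 (expensive)
    status: str = "idea"  # idea -> test in progress -> implemented

def prioritize(ideas):
    """Rank backlog ideas by impact-per-effort, highest first."""
    return sorted(ideas, key=lambda i: i.impact / i.effort, reverse=True)

backlog = [
    Idea("Red crosses in feature table", impact=4, effort=1),
    Idea("Rebuild cart flow", impact=5, effort=5),
    Idea("Discount for returning users", impact=5, effort=2),
]

for idea in prioritize(backlog):
    print(f"{idea.name}: {idea.impact / idea.effort:.1f}")
```

Cheap, high-impact ideas float to the top of the list, which is roughly what an impact-versus-effort review of a shared spreadsheet accomplishes by hand.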

And aside from that, Avast also has a big Wiki-style knowledge portal where people are always encouraged to share things that they’ve done, and it’s really important to do that because there are lots of different business units working on different things – it’s not only the utilities businesses; there’s CCleaner, there’s AVG TuneUp, and there’s another one for Avast as well. These products have very similar user bases, but they’re quite different when you actually get to use them, and they’ve got very different value offerings. But they’ve quite often been through a lot of the same kinds of journeys and the same challenges, and sometimes we find guys trying to figure something out that we’ve already solved, and we say, ‘Well, you should just try this.’ And it’s really great when they say, ‘Oh, we did this test a year ago,’ and they’ve often put together slide decks and things like that. We try to do that as well, so you can go, ‘Oh look, they tried this against this against that; that worked the best,’ and you don’t have to do as much guesswork.

So sharing knowledge around teams is really important because with such a big organization, you can find three or four guys trying to do the same job at the same time without even knowing about it. 

Aniruddh: Agreed. So how many experiments do you run simultaneously?

Chase: That’s tricky. So we mostly do two or three on the website itself, but there may be other tests running in in-app messages or emails. We just try not to make our lives difficult by running loads of tests at the same time. The main reason for this is that once a test is successful, you’ve then got to go back to your developers and say, ‘Hey, can you implement this please?’ And if we were to run dozens of tests all the time, all we’d do is end up with a really big backlog, and often these things can conflict with each other – you run test A and it says, all right, do this. And then three weeks or a month later you run another test, and you forget that you said, ‘All right, can you change this button to green?’ and then you’ve got another test that says, ‘Can you remove this button?’ So we try to keep things tight and limited to a small number of tests that are really good.

But we do have lots of other lines of business as well, and lots of different channels that can be optimized. We’ve got our ecommerce cart as well – something that dynamic testing platforms don’t play very well with, because it’s got all sorts of dynamic pricing and things like that, and chances are I’d break something. So they test things separately, and we have to marry up these tests and collect all the data separately.

And then we might also be testing within the app itself, but that’s a little bit more cumbersome – it’s a fifteen-year-old C++ application, and it’s not nearly as flexible as testing on the website would be. Then there are other lines of business, like the business team – the guys who sell products to SMBs and that kind of thing. They are often running about four or five different personalization campaigns on the website at a time, but to a very small audience of a few hundred people, because you’re talking to a really specific type of customer who knows pretty much what they want and just needs to be taken to the right place.

33:32

Aniruddh: Okay! And with all these experiments, how do you keep track of them? What is running where, and what is the end outcome of each experiment? And how do you plan out the next steps? How do you track all of that?

Chase: As I mentioned, we have that big Google spreadsheet that I keep track of everything in. We also use a Trello board. We’ve tried lots of different product management, project management, and roadmap management tools, and we realized that in our heads we organize everything in a Kanban style according to tickets, so we just went back to Trello, and we’re really, really enjoying it. We’ve also finally hired a new CRM manager, who has much more rigorous, professional record-keeping than I do, and he’s built a nice visual way to see at a glance what we’ve tested, what worked, and what hasn’t been tested. And then we try to keep a relatively short life cycle on these things. So if we come up with a test and we can do it ourselves, we run the test maybe within a week or two; then it’ll go into a two-week dev sprint, and we can get it deployed within a month. That sounds like a fairly long time, but when you want to be really sure about stuff, and you want things built quickly and QA’d properly, that’s usually how long it takes.

35:00

So we mostly do conventional A/B tests, as I said earlier – sometimes our developers will build these for us – but for some of the more complex tests where we’re looking at behavior across multiple regions, we might use personalization campaigns, and then we’ll collate the data using our Cube or Google Analytics. I mentioned briefly that we’ve got an eCommerce platform that we run tests on separately, and we do certain things like price testing: I can use VWO to change the prices all over our website, but the eCommerce platform isn’t going to know that I’ve done that. So I have to have a parallel test running on the eCommerce platform and a test on our website, then connect the two together, collect the data separately, and run the same kind of analysis – a Bayesian analysis or something like that, or a typical frequentist analysis – basically to give me the kind of dashboards that VWO would normally give me. Because if you look at just one half of the test, you might not necessarily get the right picture.
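Once the website-side assignment data and cart-side purchase counts are joined, a Bayesian comparison of two variants can be as simple as sampling Beta posteriors for each conversion rate. This is a generic sketch of that technique, not Avast’s actual pipeline; the conversion counts are invented for illustration.

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, samples=20000, seed=0):
    """Monte Carlo estimate of P(rate_B > rate_A).

    Each variant's conversion rate gets a Beta(1 + conversions,
    1 + non-conversions) posterior; we sample both and count how
    often B's sampled rate exceeds A's.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(samples):
        a = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        b = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        if b > a:
            wins += 1
    return wins / samples

# Hypothetical joined data: 120/10000 conversions for control,
# 150/10000 for the variant.
p = prob_b_beats_a(conv_a=120, n_a=10000, conv_b=150, n_b=10000)
print(f"P(variant B beats A) = {p:.3f}")
```

The output approximates the “probability to beat control” number a testing dashboard would normally report when the test can’t run inside one platform.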

And then usually we’ll run these tests for about a week. We always try to get at least a seven-day period, but we actually have a sort of 30-day business cycle where we release a new version of the product and get a traffic spike. Those people are much more engaged – they’re the guys who update every month – and they’re different users from the kind of people who come for the rest of the month. So if we run a test only during one of the quieter periods, or only over the spike, then we’ll get totally the wrong idea about what’s happening.

Aniruddh: And with these experiments, how do you split the audience? On what factors or parameters do you divide the incoming audience and create the test?

36:49

Chase: We use geolocation quite a lot because we’ve noticed very big differences in behavior across different regions. The U.S. obviously converts really well; places like Brazil and Russia don’t – they basically only get the free products. But then we also pay a lot of attention to how long someone has been our user or our customer. As I mentioned, we have a counter that counts how many times someone has been through our website and downloaded our product again and again, and we’re experimenting a lot more with segmenting around that behavior. The biggest hypothesis we have running – and it’s not too far-fetched – is that the people who have been using our product the longest, our most regular users, are the ones most likely to convert. It’s just a question of figuring out exactly who they are, and giving them the right experience at the right time.

Aniruddh: I’ve also noticed that you guys use device information as well in order to personalize the website experience – for example, for Mac vs. Windows. I think that’s one of the key examples that I’ve seen.

Chase: Yeah, I mostly do that to avoid cross-contamination of Mac users and Android users. Obviously, being a business that sells a piece of desktop software, our mobile traffic isn’t quite what a lot of people are used to seeing. I’ve seen anywhere from 8 to 25 percent mobile traffic quoted in presentations and talks; we’ve got more like 1% mobile traffic. So I don’t often bother excluding them, but I will exclude Mac traffic, because our Mac experience is very different, and if you arrive on our site on a Mac and get put into a test, it might break the Mac experience for you. So we think a bit about what device people are coming to our website on.

We also think a lot about screen resolution. One of the things we worked out was that everyone else uses a little PC, but our designers have these big, beautiful iMacs, and they were designing things for big, beautiful iMacs. And our customers have a seven-year-old Lenovo laptop with an 800×600 resolution. So we started segmenting a lot by resolution and figuring out, ‘Okay, look guys, people using the grandma laptop are behaving totally differently because they can’t even see the Buy buttons.’
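The dimensions Chase describes – geography, repeat-visit count, and screen size – combine naturally into segment labels that a test can target. A minimal sketch of such bucketing, where the country sets, visit thresholds, and width cutoff are all hypothetical values, not Avast’s real segmentation rules:

```python
def segment_visitor(country: str, visits: int, width: int) -> str:
    """Bucket a visitor by geography, loyalty, and screen size.

    All thresholds here are illustrative assumptions.
    """
    geo = "high-converting" if country in {"US", "UK", "DE"} else "free-leaning"
    loyalty = "regular" if visits >= 3 else "new"
    screen = "small-screen" if width < 1024 else "large-screen"
    return f"{geo}/{loyalty}/{screen}"

# A loyal U.S. visitor on the "grandma laptop":
print(segment_visitor("US", visits=5, width=800))
# -> high-converting/regular/small-screen
```

Keeping the segment label as a single string makes it easy to feed into an analytics tool as a custom dimension and compare behavior across buckets.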

Aniruddh: And that’s one of the key things which I’ve seen has a lot of impact, because it really determines the entire user experience. With different resolutions the experience totally differs from person to person, and if you have a personalized experience for a certain resolution, it really goes a long way for user experience. Absolutely love that point.

39:33

So, can you share a couple of examples – maybe one or two really good experiments that you ran which you remember off the top of your head and we can learn a lot from?

Chase: Yes. One thing that’s a little bit cliché, but pricing and discounting really made a big difference to us. Early on, the founders weren’t keen on me fiddling with the prices; they thought they had it right. But we ran some exit surveys – asking people when they left the cart without buying, ‘What stopped you?’ – and realized that one of the reasons people weren’t buying our products was simply the price. And I said, ‘Well, that’s an easy thing to fix.’ So we started doing some discounting in places where regular users would come through. I didn’t want to discount for everybody and be like a mattress store in the UK – they have a stereotype of always running a discount. But we realized that in certain places where regular users come through, we should give those guys a discount, because they are our loyal customers. And that made a massive impact right away, and it gave me a license to do whatever I wanted on the site, because the owner said, ‘Well, all right, fine, you’re making money.’

40:43

It also told me that the price point for the product was wrong, and that there was a bit of rethinking to do about how much customers are willing to part with to get this product. Another thing that was quite surprising: we found that when you’re showing a discount, saying $5 off is usually much more effective than saying 20% off, and the theory we’ve got going is that people can visualize $5 in their wallet much more easily than they can visualize 20% in their wallet. Whether that’s the real reason or not, we’ve had pretty consistent results applying it – I went through the site and changed all of the 20% offs to $5 or $10 off or whatever, and that lifted everything quite a bit.

Aniruddh: Amazing! And have you seen any other kinds of experiments which yielded big results? Because I think the majority of the time, if you run 100 experiments, probably 20 of them will yield the maximum results, right? Pricing is definitely one of the most crucial things to get right, and that’s true for any business. Are there any other things? Because a lot of our viewers may be running, let’s say, B2B campaigns where they may not have pricing on their website. Is there any core takeaway that they can use from your experience?

Chase: Yes. So we ran a test that, I don’t know, we were very skeptical of – I wasn’t particularly proud of it. We have a fairly common element on our site, which is a feature table comparing the bronze, silver, and gold packages; everyone’s pretty much got one of those. And I had been testing this thing to death since I started – changing the language, changing the order of things, changing the styling. It had a bunch of green ticks next to the things each version did have and just a blank space next to the things it didn’t have. And someone said to me, ‘Well, why don’t we put a red cross there instead of nothing for the free version?’ So it’s very, very clear that the free version doesn’t have these things. And that worked really well, and they said, ‘Well, maybe it’s not the words on the page exactly; maybe they’re just counting green ticks.’ And I said, that can’t be right. But very cynically we did another test where we just added another red cross to the free column and another green tick to the premium column, and that made a massive impact. And I realized that I’d been testing the copy on this page and no one was reading it. They were just looking at the number of good things the pro version had and making their decision on that: three extra good things isn’t very many, but five extra good things is a lot. That bumped my cynicism up a little bit, but it was definitely quite impactful, and it made a lot of difference to how we styled those feature tables in the future – more things in the list works better.

Aniruddh: I really love that example because it goes to show how very small things can have such a big impact. And it also shows – even in the example you mentioned of $5 versus 20% – how much of a role consumer psychology or consumer behavior plays in the entire CRO equation. That’s something we tend to miss a lot of the time, because the best of these things users probably can’t tell us themselves; they’re ingrained in our consumer psychology, because at the end of the day we are all just humans trying to optimize experiences. So that’s a really important insight that a lot of people can use. Thanks a lot for that.

Apart from that, experimentation, like you mentioned, has a lot of key benefits for end users as well as for the business, right? So how do you see it shaping the culture at Avast? Are people more open to experimentation as an idea? And how has it shaped your experiences internally?

Chase: Experimentation has been a really big part of the culture at Avast since long before they acquired us, and it’s just assumed that everything we do goes through the process of experimentation and iteration before it sees the light of day, and that we regularly review old assets, old campaigns, and old messages and go through that cycle of experimentation again. So we don’t really have that problem you often hear organizations talk about – the HiPPO problem, the highest-paid person dictating how things should look or what they should say. The CEO would come and say ‘we need to test that,’ not ‘we need to change it to do this.’ So it’s definitely a really ingrained part of how we work, absolutely.

Aniruddh: Absolutely, and I think that is one thing that separates the best-performing marketing teams from the average or mediocre marketing teams, because right now it’s all about the speed of experimentation – how fast can you experiment? Let’s not assume things, but test them out and figure out the outcome, so that we have a fair chance. Absolutely love that point.

Chase: Yeah!

Aniruddh: So, apart from that, what different books, blogs, or podcasts do you read or keep track of to stay abreast of what’s happening in marketing, to figure out what new things you can experiment with, and to keep yourself ahead of everybody else?

Chase: So yeah, I was quite influenced early on in my career by a book called Webs of Influence by Nathalie Nahai. I saw her speak at a conference here in London. I have a little bit of a background in psychology – my wife is a sociologist as well – so when I read that book, a lot of the basic knowledge I had in these areas connected, and lots of light bulbs went off. I based a lot of the earlier things we tried on that book. And then, being in London, we’re very fortunate that there are lots of tech events and marketing events – things like The Social of London and SEO Brighton, some of the biggest events in the world happening here. So we do our best to go to those. I don’t get to go to as many as I used to, but a lot of my knowledge and my career was based on going to some conference, hearing someone say ‘you can try this,’ and then doing it – it worked, and somebody noticed. That’s how I went from a really junior digital marketer to running a digital marketing department: poaching ideas that I picked up at conferences and making them work.

Aniruddh: Fantastic! And, Chase, really really loved the conversation. How can our audience connect with you?

47:22

Chase: So yeah, the best thing is LinkedIn, at linkedin.com/chaserichards – just one word; I was the first one to get that one. Aside from that, I’m quite terrible at networking, so I don’t really have professional social media accounts. LinkedIn is where to go.

Aniruddh: Great. Thanks a lot Chase for spending time with us. I’m sure our audience got to learn a lot today from you.

Chase: Alright. Thanks very much for having me. It’s been a pleasure.

Speaker

Chase Richards


Head of Monetization, Avast
