Step Into Your Customers' Shoes With User Research

In this interview, Brian shares methodologies that can be used for user research and how businesses can extract insights from user research studies.

Summary

Brian, an expert in digital marketing and conversion optimization, discusses the evolution and importance of user research in the context of website design and optimization. Initially, decisions were based on best practices and personal judgments, but with the advent of A/B testing tools, the focus shifted towards data-driven strategies. Brian emphasizes that user research is crucial in determining what to bring to an A/B test, which he considers the "Supreme Court of data." He advocates for the involvement of all digital marketing professionals in experimentation, moving away from traditional design-first approaches to research-first strategies.

Brian also addresses the challenges of interpreting user testing data, the need to be comfortable with inconclusive tests, and the importance of integrating risk management into the design process. He suggests starting with tools and methods that are familiar and then gradually expanding to more complex analyses, emphasizing the importance of hypothesis-driven testing.

Key Takeaways

  • All digital marketing professionals should engage in experimentation, moving beyond traditional design-first approaches.
  • Recognize the importance of learning from inconclusive tests and their role in risk management.
  • Start with familiar user research tools and methods, then gradually explore more complex techniques, focusing on hypothesis-driven testing.
  • Incorporate experimentation into the design process to reduce the risks associated with major changes.

Transcript

0:06

Rahul: Hello everyone. Welcome to ConvEx where we are celebrating experiment-driven marketing. My name is Rahul Jain and I’m a Senior Product Manager at VWO. So today we’ll talk about user research and ways in which you can easily understand what your customers need. I’m excited to talk to Brian Massey today who is the founder at Conversion Sciences. Glad to have you Brian. 

Brian: Thanks for having me. I’m glad to be here.

Rahul: So before I begin the conversation with Brian, I want to inform you all that you can join ConvEx’s official networking group on LinkedIn and connect with like-minded people. So Brian, could you please describe your role and responsibilities at Conversion Sciences? 

Brian: Well, my roles and responsibilities are mostly to stir things up and make trouble. I founded Conversion Sciences and I’m primarily responsible for being the spokesperson. So I do most of the marketing, writing, and presenting; we speak around the world. When I started Conversion Sciences in 2007, I really had to educate the market on what we now call ‘conversion optimization’ – the vocabulary, the processes, the disciplines. It was new, believe it or not, even though no one’s ever put up a website that they didn’t want to convert.

1:31

The process of optimizing was new, so we started very early speaking, teaching, and educating folks on these things. So that’s my number one job. I also work with all of our conversion scientists here as the Senior Conversion Scientist, and I bring my years of experience in marketing strategy to the tactics that we implement and the tests that we design.

1:58

Rahul: Perfect! So I think you rightly said that, you know, these days everybody is sort of interested in conversion optimization, and everybody’s talking about user research and user-centered strategy. So what do you think about it? What, according to you, is user research?

Brian: Well, if you look at the spectrum of our journey, when I first started it was primarily based on best practices. So I would go and interview everybody, but there wasn’t very much research behind that, so I would say: here are the things that you need to change on your website – good luck. Over time it became clear, as new tools came on the market, that we could actually try out some of these ideas. And of course when the A/B testing tools came along, we were able to collect really statistically significant data from the people who were actually out there searching for solutions to whatever problems that business addresses. That was very eye-opening when we started doing it, because we realized how many of our best practices really wouldn’t work for a particular audience. They might work here but not over at another business, even if it’s in the same industry. So it became really clear that we had to get good at figuring out what we would ultimately bring to an A/B test, which we consider the Supreme Court of data – it’s the final decision because it’s the best data that we can collect. That’s where user research really came in as important. First of all, the tools are amazing. I can bring a hundred people to look at a couple of sets of creative and see if that creative is helping them solve problems, for very little money, using online tools. I don’t have to leave my desk. I can run eye-tracking studies from my desk.

3:55

So we use user testing to decide which of our hypotheses we can just implement on the strength of the user testing, and which we need to take to an A/B test and measure their impact on the fortunes of the online business we’re working with.

Rahul: So I think that’s a very interesting point you’ve mentioned. Two years back, when we started the conversion optimization platform, the biggest problem that even we saw was that people were just doing testing based on their assumptions, or they were blindly following best practices, and that is where we felt there is a very strong need for proper user research. So another question that we usually get from our customers is: who is the right person to do user testing? Is it just the UX designer, or who else can actually do the user testing part?

Brian: For the last three years from the stage, I have been preaching that everyone is going to have to be an experimenter if you are in digital marketing.

4:58

These are tools that… the days of what I call the ‘Mad Men’ style of design don’t fit anymore. When you go to your team or to an agency and say we want a new design for this page or a new design for this website, they usually start with wireframes, and that’s backwards. We need to begin using research on our existing site, on other pages, or putting things in front of people, to make that decision. So everyone needs to be doing this, and we’re already naturally wired for behavioral science; we know how to grade the data. So for instance, if we put creative in front of a panel of people that I call ‘Pretenders and Liars’ – people who are trying to understand whether a certain design or certain content is helping them figure out how to solve a problem – that data is less valuable than looking at analytics with thousands of visitors and seeing how they’re behaving on the site to determine what’s working and what’s not working.

So we know that a focus group – these online focus groups, essentially – we have to handicap that data a little bit. But if we have evidence in analytics, and we have evidence in a heatmap report, and we have evidence in an eye-tracking simulator for instance, and we then have evidence from a focus group, together those things, if they point in the same direction, reinforce that whatever we’re testing – the creative or the idea – is in fact real. You wouldn’t necessarily take any one of those as enough evidence on its own to make changes to your website. Does that make sense?


Rahul: Yeah, that totally makes sense. But you know, I have a related question. You mentioned that you guys are doing a lot of user testing, and there are so many tools available these days for doing user research – there is eye-tracking, there are session replays. The problem that we see today is that it all requires a lot of resources and a lot of time, and we have seen companies consider this a waste of time, or, you know, feel that it requires so much effort and so much time to be invested. So what is your take on this, and how do you think businesses can do it in the right way so that they don’t feel it’s a waste of time?

Brian: So the reason I think people need to be jumping in now is because there are a few things we have to learn, and probably the first one is how to be comfortable with inconclusive tests. So you take two very different pieces of creative, you put one in front of a panel using a tool and the other in front of another panel, and they both perform about the same even though the creative is very different. That might be seen as a waste of time, right, because we didn’t really learn anything. But a lot of what we do as marketers is also getting insurance – making sure that we’re not about to put something out in front of people that is going to make it significantly harder for them to solve the problem, to get through the website, to get to the purchase process, whatever we’re testing. So that’s one of those things that we just have to learn after experimenting quite a bit: how to be comfortable with inconclusive tests, and then how to communicate to the rest of the organization that it’s not a waste of time.

What we’re doing – I always like to tell the story about Finish Line, who completely redesigned their website. If you look at their website in 2012, it was, you know, 1990s-looking: lots of gradients and things that we associate with older websites. They completely redesigned it; it was a total change – very stylistic, stylized images, a very minimalistic approach to the category pages, they brought in content from the blog, they had spokespeople – and they launched it all in one day, and over the next few weeks summarily lost about 3 million dollars in sales. How does that kind of an effort get all the way to launch without the executives and the agency involved realizing that they’re about to shoot themselves in the foot? What we’re doing with this research is moving that risk – that all-in, launch-it-and-pray risk – into the design process. So if you launch the thing you’ve spent all your resources on, you’ve put all your budget into bringing traffic to the landing page you launch, isn’t that more of a waste of resources than doing some of these experiments along the way to make sure you’ve got the right copy and positioning, the right images, and the right layout before you launch? I would say that all you’re doing is moving risk to a less expensive time in the design process, so that when you do launch you have evidence that the design you ended up with is actually going to deliver what your visitors want.

10:18

Rahul: So that actually makes a lot of sense. In fact, while talking to our customers, one problem that I have personally realized is that people do understand that there are benefits to doing a pre-test analysis and also to looking at the post-test analysis results. But one thing I’ve noticed is that people get overwhelmed with so many tools and so many fancy data points, and I’ve seen them looking at heatmaps and getting excited about it. But is there a right method, as per you, that they should follow in terms of doing user research? Should they look at quantitative data first, or should they look at qualitative data? How should they approach it?

Brian: So I think the most important thing they can do is not expect to open up a tool, look at the results, and be able to discern what’s going on. I think it’s fine to go and look at the heatmaps on your pages and see what’s going on. But more powerfully, take a specific question. For instance, the classic question on landing pages: we want to get the right combination of headline and imagery so that people very quickly understand what we’re about, what we’re offering, and how they can take action. So you come with a specific question, which is: is this the best headline? Or should I use long-form copy or short-form copy? For that latter question, you can specifically go and look at your other landing pages – look at the longer ones and the shorter ones and see how far people are scrolling. It starts to answer the question: if people aren’t scrolling very far, then perhaps for your audience shorter-form copy is going to be sufficient. So go to the tools with specific questions. Don’t try to do an all-encompassing survey, and don’t expect to be able to look at analytics and necessarily understand what’s happening, like reading the falling characters in The Matrix. It’s not so much like that as bringing very specific questions about design, layout, imagery, and copy to these tools and seeing if you can collect some information that supports one over the other. One of the things I always recommend: if you’re struggling with copy on a landing page – headlines and such – go look at your AdWords account and see which ads are generating the most clicks. Those words are the words that you should be using, because that is what your audience is reacting to. So you can find data in a number of different places to inform what you’re doing, asking specific questions as opposed to trying to be a data sieve of some sort.
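
For readers who want to try the scroll-depth idea Brian mentions, here is a minimal sketch of a page-level scroll tracker in plain browser TypeScript. It is not tied to any specific analytics product; the "/collect/scroll-depth" endpoint is a hypothetical placeholder for whatever event collection you already use.

```typescript
// Records the deepest point a visitor reaches on a page, so long-form and
// short-form landing pages can be compared for scroll behavior.

let maxDepthPercent = 0;

function currentDepthPercent(): number {
  const viewed = window.scrollY + window.innerHeight;  // bottom edge of the viewport
  const total = document.documentElement.scrollHeight; // full page height
  return Math.min(100, Math.round((viewed / total) * 100));
}

window.addEventListener(
  "scroll",
  () => {
    maxDepthPercent = Math.max(maxDepthPercent, currentDepthPercent());
  },
  { passive: true }
);

// Send the final depth once, when the visitor leaves the page.
window.addEventListener("pagehide", () => {
  navigator.sendBeacon(
    "/collect/scroll-depth", // hypothetical endpoint
    JSON.stringify({ page: location.pathname, maxDepthPercent })
  );
});
```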

13:06

Rahul: So another thing that I feel – I talk to a lot of customers about this – is that they should actually look at their business goals. They should understand how that fits into their business funnels, and that should probably be the starting point. So what are your views on this? Do you think that starting with the business goal, and then deep-diving into maybe qualitative data, maybe looking at heatmaps around your goals – would that make sense?

Brian: I think it would make sense. There is a reality, especially in larger organizations, that there’s a lot more to what you’re doing. At the same time that you’re trying to hit your quarterly and yearly goals as a marketer, generating leads and sales, you are also tasked with educating your organization, being a team member, and advancing your career. So there’s a lot more to it, I think, if we’re honest with ourselves, than just optimizing for the bottom line. People hire us because we have a team that is not in all of that; we’re from the outside. So we can completely focus on those key metrics that drive the business, be it sales, lead generation, sales close rate, churn rate – all of those things are ripe for optimization. Sometimes it behooves you to have somebody on the outside stay focused on those things, because in a complex organization you’re dealing with the politics in the day-to-day that you just can’t avoid.

Rahul: So I think that’s pretty interesting. In fact, another similar question that I hear a lot is that people come to me and say, ‘Hey, we have, let’s say, a million visitors coming to our website, we have collected all the data in your tool, and we just don’t know how to squeeze insights out of the user research data.’ So do you recommend any framework for this? What is your take – should we follow a framework, or should it be based on some other factors?

Brian: Yeah. I think it depends on where you are in things. So if you’re new to testing and using data, you need to plug into those things that attract you. If you have a tool like VWO and it has this heatmap capability and that’s something you’re interested in, start there. If you have a background in, or are familiar with, writing surveys to get customer feedback, I would start with bringing in a panel and letting them see different forms of creative, and seeing if you can help them get through some sort of task more efficiently or understand the page more correctly. To your earlier point, there are a lot of tools – I get pitched four or five a week, easily – and you don’t have to become a MarTech specialist. Find those tools that you are comfortable with, maybe leverage some experience that you already have, and start there. If you are already doing some testing, then my recommendation is that you really focus on the questions. So you should have a place that you visit frequently and that collects ideas. You know, the benefits of data go beyond just more sales, more leads, growing the business. Data helps you manage your relationship with your executive team, with your boss, especially when things aren’t going well. Bosses tend to helicopter in, and they want to review your copy and review your designs. Data can help put them at ease: you can say, well, we tested this with some user testing, for instance, and so we have confidence that it’s going to work. It also allows you to be more creative – especially, again, when we’re behind on our goals we tend to start playing it safe, or as a conservative organization we only go with safe copy and safe ideas. We can use user testing to try some of those crazy ideas that pop into your brain on Friday afternoon when things start to loosen up, and see if perhaps this is an idea that could fundamentally change the performance of that campaign or, better yet, the entire online business. So these are the real reasons data is valuable beyond just the bottom line. Those ideas need a place to live, a place where you’ll go back and visit them and can rank them. We call it our hypothesis list, we do it for every one of our customers, and it will gather hundreds of ideas.

18:04

Um, so you just need a process by which you rank those. We use my own process; many of you are familiar with the ICE framework: what do we think the impact of an idea will be, what is our confidence in it, and what level of effort would it take to test and implement that idea? These things are fairly well documented on the web; I think you have a blog post on the VWO website about that. These ideas, though – hypotheses – are the currency of what we do, and so your process needs to start there. And once you have an idea, you stop and think: how could I test this hypothesis? Can I do it with user testing, or does this hypothesis need to go all the way to an A/B test?
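
For context, a hypothesis list like the one Brian describes is often ranked with an ICE-style score. The sketch below uses one common convention – impact × confidence ÷ effort on 1–10 scales; the scales, formula, and example ideas are illustrative assumptions, not Conversion Sciences’ exact process.

```typescript
// ICE-style ranking of a hypothesis backlog (a sketch; adapt to your own scales).

interface Hypothesis {
  idea: string;
  impact: number;      // expected effect on the key metric, 1-10
  confidence: number;  // strength of the supporting evidence, 1-10
  effort: number;      // cost to test and implement, 1-10
}

const iceScore = (h: Hypothesis): number => (h.impact * h.confidence) / h.effort;

const backlog: Hypothesis[] = [
  { idea: "Shorter, benefit-led headline on the landing page", impact: 7, confidence: 6, effort: 2 },
  { idea: "Simplify the plan-comparison feature list", impact: 8, confidence: 5, effort: 4 },
  { idea: "Full checkout redesign", impact: 9, confidence: 3, effort: 9 },
];

// Highest-scoring ideas go to user testing or an A/B test first.
[...backlog]
  .sort((a, b) => iceScore(b) - iceScore(a))
  .forEach((h) => console.log(iceScore(h).toFixed(1), "-", h.idea));
```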

Rahul: So Brian, a couple of follow-up questions on this. All these tools that we’ve mentioned serve different purposes, right? For example, we would use a survey tool to directly ask the end users what their requirements are, and we would probably use a tool like session replays or heatmaps to actually see the behavior of the user, right? So do you think they serve different purposes, or can they also complement each other?

Brian: You know, you choose the tool based on the question that you have. So let’s say you have a landing page that isn’t performing and you want to start to understand what the issue is. You might first go to the heatmaps and look at the scroll map and see if people are abandoning before they’re reading all of the content, or you might look at the click maps and see if they’re clicking on things that they shouldn’t be clicking on. You might not find anything in that, so then you’ll want to buckle in, get a cup of coffee, and watch some of the session recordings to see if you can detect some confusion on their part or even a technical issue – those often come up in the session recordings.

So it depends on the question you’re asking. I can use a screwdriver to screw a light fixture into the ceiling. I can also use a screwdriver to drive a wood screw to build a deck. So I’m using the right tool, but for different questions, and that’s the way I would approach it. If we get too wrapped around the axle on which tools we should be using, and try to invent a reason to go and use them, I think we end up wasting time.

Rahul: So I think that’s a wonderful example you’ve given, and I think our viewers will get a lot of good insights from this question specifically. Now, a follow-up question around surveys – how do you think we should design surveys? This is another very common question that a lot of our customers ask us: what should the right format be? Sometimes, you know, we end up creating a lot of questions. So what should be the right approach to designing a survey?

21:11

Brian: Well, let’s talk about the spectrum of surveys. Again, if you’re coming with specific questions, sometimes a one-question survey is plenty. My favorite tactic is what I call ‘Thank You Page’ surveys. So, when somebody on your site completes signing up or finishes buying, on the thank-you page or the receipt page you pop up a question, and it is: ‘What almost kept you from buying today?’ You can also use this to answer other questions – we have a client right now whose thank-you page survey is asking, ‘What other websites did you shop before you purchased from us?’ Because one of the challenges of surveys is getting enough responses, especially if you’re using these on-site surveys. On the thank-you page a psychological effect called ‘Liking’ kicks in, and so you have a much higher completion rate. The psychology is – I’m not an idiot. I don’t choose idiots.

22:11

I chose you, I just bought from you, so you’re not an idiot; therefore, I trust you more. And so the completion rates get higher. We’ve asked questions on thank-you page surveys like ‘Why did you buy this product over another product?’ and ended up doing an A/B test that was a significant win for us. For longer-form surveys, I think one of the mistakes we make is asking a lot of interesting questions, but not questions that are tied to a fundamental question we have. So my advice would be – I don’t want to dive too much into survey theory – to only ask questions on which you’re going to be able to take action, and usually those are specific questions that come from your hypothesis list. If you’re asking customers, you can be a little bit more direct. If you’re not asking customers, then you have to work really, really hard not to lead them to the answer that you want. When you’re running a survey, it’s easy to do all sorts of things that lead the visitor to the answer you want – you’ll put the answer you want to hear first, so everybody reads that, and it will skew your results. But I guess I’m going to repeat myself: go to the surveys with specific questions from your hypothesis list and use the survey to answer those, as opposed to gathering general knowledge.
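
To make the thank-you page tactic concrete, here is a minimal sketch of a one-question post-purchase survey in plain browser TypeScript, independent of any particular survey tool. The question text comes from the interview; the "/collect/survey" endpoint and the "/order/thank-you" path are hypothetical placeholders.

```typescript
// A single-question survey box shown only on the thank-you / receipt page.

function showThankYouSurvey(question: string): void {
  const box = document.createElement("div");
  box.style.cssText =
    "position:fixed;bottom:16px;right:16px;max-width:320px;padding:16px;" +
    "background:#fff;border:1px solid #ccc;border-radius:8px;";

  const prompt = document.createElement("p");
  prompt.textContent = question;

  const answer = document.createElement("textarea");
  const send = document.createElement("button");
  send.textContent = "Send";

  send.addEventListener("click", () => {
    // Fire-and-forget so the visitor is never blocked.
    navigator.sendBeacon(
      "/collect/survey", // hypothetical endpoint
      JSON.stringify({ question, answer: answer.value, page: location.pathname })
    );
    box.remove();
  });

  box.append(prompt, answer, send);
  document.body.appendChild(box);
}

// Only show it once the purchase or sign-up is confirmed.
if (location.pathname.startsWith("/order/thank-you")) {
  showThankYouSurvey("What almost kept you from buying today?");
}
```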

23:44

Rahul: So when you said that even one question works really well – I completely agree with you. In fact, I have seen Google do this very well in Google Analytics. A lot of times I get a single question where they ask, ‘Did you know that this feature existed?’ It’s a very simple one-question answer – Yes/No – and I will, almost every time, answer it because it won’t take a lot of time, and at the same time they’ll get a lot of good insights from it. So even single questions can really do wonders for your product.

24:20

Brian: I agree a hundred percent. And when you’re part of a team and everybody’s like, ‘I was just on our competitor’s website – how come we’re not doing X, how come we’re not doing this?’, you can say, ‘Well, we don’t know that’s working for them, but I know how to collect some data that will let us know if it’ll work on our audience.’ That’s when you add that idea to your hypothesis list, prioritize it, and pick a tool or a couple of tools that will either reinforce it or demonstrate that it isn’t a particularly good idea for your audience.

Rahul: So do you think you can share any interesting ideas or projects that you might have worked on regarding surveys, where you got something very interesting out of the survey?

Brian: Yeah. So, we were working with a company called Automatic. They got bought a couple of years back by Sirius XM Radio, the Sirius Corporation, but they were selling a product that plugs into your car and connects your phone to your car’s computer. So if you’re accelerating too fast, it will tell you; it will track where your car is. If you have an engine light on, you can pull the codes. They even had a feature where you could find a mechanic if you wanted to go and have that serviced. They had a light version and a pro version, and the pro version actually had its own 3G connection to the internet, whereas the light version went through your phone. So this enabled a lot of things, like crash alert – if you’re in an accident and your phone gets smashed, the device could still inform the authorities and bring help.

25:55

Everyone was buying the light one, though, instead of the much better-featured, great-value pro version. And so on the thank-you page we asked: tell us, why did you buy light instead of pro? We had a hundred and fifty-four responses in two weeks. It was very gratifying, but we quickly found out that the list of features underneath the two versions was not helping them choose. In fact, it was creating confusion, and our brains will default to the easiest choice, which is the cheapest one or, unfortunately for many of us, the back button. So we simplified that list and only focused on the things that made sense. That was […] because you want to show all the features on these pages, but they’re really not there to help us choose. We took that to an A/B test.

26:48

We saw a 13% increase in conversion overall just from helping them choose, and a 24% increase in average order value, which meant they were buying more of the more expensive version. So that’s a situation where we used user research to develop a hypothesis, created an A/B test that tested that hypothesis, and were able to grow the business.

27:13

Rahul: So I think that’s a perfect example of doing something as simple as running a survey and at the same time being data-driven to get results. So Brian, one more question: how do you think internal teams can contribute to doing user research?

Brian: Well, it generally happens two ways. A lot of the time we’re brought in by a champion who is awake to this, who says, ‘You know what, I should be making better decisions before we launch our tests,’ and so they’ll bring us in and they will begin to educate the organization, sometimes from the bottom up. I think, though, that leaders in organizations should ask themselves: how amazing would it be if I had an entire team of experimenters – people who have the tools and are given enough incentive to test some of their ideas as they’re going through design? I mean, our jobs as digital marketers are really two-fold. Number one, we want to increase the sample size of brains that are informing our designs, and that’s why surveys and online focus groups and analytics are great – they show larger numbers of people interacting with our creative. The other job is to increase the quality of that sample. So we can start off with focus groups – these Pretenders and Liars that we bring in – and once we feel good about a design, we can go ahead and launch a test and see through analytics how people are interacting with it, or see through heatmaps and session recordings how people are interacting with it. Those two things are no surprise to anyone; we already understand how sample sizes work.

29:14

Having a team of people who are not only motivated to do it but have the tools – and the tools are very inexpensive, easy to use, and getting better every month – that’s what I would encourage. So the next time an agency comes to you and says, ‘Alright, here’s the redesign, here are three mock-ups, pick the one you like,’ say to them, ‘I don’t want to do this. Why don’t you go collect me some data and tell me which one of these I should pick – which one of these is most likely to have the highest conversion rate when we launch it.’ That changes the conversation, it changes the relationship, and it’s going to make your marketing better. So I say: leaders, encourage your experimenters.

Rahul: That makes a lot of sense. And you know, user research in itself is a very broad term, right? I mean, we have spoken about a user focus group who might be sitting in your office, and then there are so many analytics tools out there with different capabilities that can help you do user research. So how do you think businesses should understand user research, and what do you think user research is not?

Brian: Well, that’s a great question. So I think there are two things that I see whenever we go into an organization. Number one, testing and user research are seen as difficult – something experts do, somebody else’s job. But when we come in and we start presenting data that we find in their chat transcripts, we start finding things in their ratings and reviews, we’re listening to the phone calls of their business – lead generation and sales – the rest of the organization begins to say, ‘Well, how come I don’t use more of this when I’m designing or conceiving of the product and those sorts of things?’

31:16

So there is this initial hill to get over, which is the idea that user research is something other people do. But then it kind of infects an organization. We had a client that we’ve been working with for a long time who came to us and said, listen, we’re buying billboards on the side of the freeway. We have these two designs. Can you give us any idea which of these designs is going to work best on a freeway? And of course we were able to do that. So it can really spread far and wide, and I think that takes a little bit of leadership allowing it to happen. But once you start getting decisions made, and stop asking ‘What should we do?’ and start asking ‘How do we collect some data on that?’, your fortunes are fundamentally going to change.

32:07

Rahul: So yeah, I think it also depends on the kind of business that you have. For instance, VWO is a SaaS business, and we as product managers work very closely with sales and marketing teams to collect the feedback the users are giving – what kind of support tickets they are raising and what kind of feature requests they are raising. So for us, that is probably a very important part of doing user research. Do you think it depends on the kind of business you have and the kind of needs you have as a business?

32:43

Brian: I’m sure it does. If we hadn’t been successful with so many different kinds of businesses, I would feel more strongly about that. I think it’s less about what kind of business you have and more about what stage you are at as a business. Actually, the hardest clients we have are those businesses that are doing very well. Their marketing team has brought a lot of traffic to the online business, and it’s converting at a level high enough, and at a profitability high enough, that they’re making money. So: do we really need to worry about user testing, and do we really need to worry about A/B testing and all this list of hypotheses and doing these experiments? That sort of apathy, though, is going to bite you in the butt when your competitors start using these techniques, moving past you, and taking some of that possibility away.

33:41

It is something that is both technical and cultural, so there is this learning curve that you need to get started on. I would say that it’s less about what kind of business you run and more about where you are right now. Something will happen – your Google ads will stop working, or your organic traffic will start to decline, or something like that. Then you realize: oh my gosh, we really need to make the most of what we’ve got. And user testing and A/B testing are the best ways I’ve found to do that.

Rahul: So that makes a lot of sense, and actually that’s a very valid point – where you are right now also matters a lot. Another challenging thing that most people face is getting ROI from doing user research: at what point in time can you call it a success? So what do you think are the top three metrics that indicate your research program is now successful?

Brian: That is a very good question, because of the nature of experimenting. There was a study done some years ago, I think by one of your competitors, which showed that only 1 in 7 A/B tests produced a positive, statistically significant win. We can’t work with percentages like that. But underneath that, you know, we do spend a fair amount of time on analysis, and at the end of the analysis we’re like, nope, there was nothing here. It was an interesting idea, but there’s no evidence that it is going to improve things – we don’t see an effect in the analytics. Getting comfortable again with spending the time … the positive side of this is that human beings are curious.

35:41

And you’re going to have much more satisfaction – and I will guarantee better business results – if you let your team feed their curiosity. Give them some tools so they can explore their curiosity. Let them waste a little bit of time going down some rabbit holes, because when they do find those winners, they can be business-changing, they can be fundamentally important to your visitors. So I don’t know if that answered your question.
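
As background on the “1 in 7” figure, a “statistically significant win” is commonly judged with something like a two-proportion z-test on conversion counts. The sketch below assumes a conventional 95% threshold (|z| > 1.96); the traffic and conversion numbers are purely illustrative.

```typescript
// Two-proportion z-test for an A/B test (a sketch, not any tool's exact method).

function twoProportionZ(
  convA: number, visitorsA: number,
  convB: number, visitorsB: number
): number {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// Example: control converts 480/10,000 visitors, variation converts 540/10,000.
const z = twoProportionZ(480, 10_000, 540, 10_000);
// |z| > 1.96 roughly corresponds to p < 0.05 (two-sided, 95% confidence).
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "statistically significant" : "inconclusive");
```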

Rahul: Yeah, I think that actually answered my question. In fact, I feel there’s a big learning curve as and when you start doing research – as and when you start trying to understand what the customers are doing – and it’s slowly and gradually that you learn more about your visitors and the methods that are working for you and those that are not, right? So, a related question: are there any interesting books you are reading right now, or something you can recommend to our audience that can really help them learn more about user research and conversion optimization in general?

Brian: You know, the thing that really sealed it for me was Steve Krug’s book ‘Don’t Make Me Think’, and I’m forgetting the title of his follow-on that goes a little bit deeper into that. But his approach was very much like mine: just start asking questions.

He was very focused on focus groups, very focused on getting executives involved. And that’s a point – if I can take a quick aside – that I think is really important. As you’re doing these experiments, don’t be afraid to include scroll maps, heatmaps, and snippets from session recordings in your presentations, because we all know how to read these things. I think the more you can expose your team and your executives to those, the higher your credibility will get. They’ll see visually what you’re learning, and it frees you to do more of that sort of experimentation. But I would start with the Krug book. I think it’s a great intro to some of the basic ideas for what to test, and it makes the case for doing online surveys and collecting data.

Rahul: So I think we have covered a lot of interesting questions in this session, and our audience would love to connect with you and ask you more questions. So how can our audience connect with you?

Brian: You know what, I don’t have a problem with your audience sending me an email. So if they were to send me an email at brian@conversionsciences.com, I’d be happy to answer any questions they’ve got. We are a teaching organization, so our blog has articles on the things that we’re learning from our research; it gives you great ideas, and you can go download a hypothesis list with our ranking algorithm on the site. So I think conversionsciences.com is a great place to start if you want to become an experimenter.

Rahul: That was a wonderful conversation with you, Brian. I think you’ve given a lot of insights to our audience, and now everybody knows how to connect with you as well. Thanks a lot again, Brian, for having this session with us and helping our audience learn more about user research. I’m sure people will find a lot of useful information in this session.

Brian: I hope so. Go out and do an experiment – that’s my call to action. Go test something, go try something.

Rahul: Yeah, perfect. Thanks a lot, Brian, and thanks a lot, guys, for joining us in this session.

Speaker

Brian Massey

Conversion Scientist, Conversion Sciences
