
Designing for Scale at Ackermans: UX, CRO, and Experimentation

Jandro Saayman and Emily Isted share how UX and CRO teams collaborate to drive meaningful improvements in retail e-commerce. They explore practical testing examples, how behavioral data informs design decisions, and how experimentation helps teams better understand customer needs. The discussion also touches on mobile-first challenges, cross-team collaboration, and the growing role of AI in personalization and analysis.

Summary

This session explores how UX design and CRO teams collaborate to optimize a large-scale retail e-commerce experience. Through real testing examples, the speakers discuss how behavioral data, experimentation, and design strategy work together to improve conversion while maintaining brand consistency. The conversation highlights the realities of testing in a mobile-first, data-conscious market and the importance of balancing speed, performance, and user experience.

Key Takeaways

  • Behavioral data plays a critical role in shaping design and testing priorities
  • Simplifying user journeys can significantly improve engagement and conversion
  • Mobile-first design is essential in retail markets with data and device constraints
  • Strong collaboration between UX and CRO teams leads to more effective experimentation

Transcript

NOTE: This is a raw transcript and contains grammatical errors. The curated transcript will be uploaded soon.

Hi everyone listening at ConvEx! Really, really happy to be here today. My name is Emily and I'm the Director of CRO (Conversion Rate Optimization) at Hype Digital.

We’ve been running CRO at Hype for the last seven years. I’ve been on the team for the last four years.

Yeah, I live, breathe, and eat everything A/B testing and CRO. And I'm super excited to be joined by Jandro.

So please welcome him to the stage. Jandro, I'm going to hand over to you to introduce yourself and say hi to everyone today.

Awesome. Thank you for having me and for being part of this experience.

I am Jandro, and I head up the user experience design department at Ackermans. For those unaware of who Ackermans are, they are one of South Africa's longest-running retail companies, a legacy brand.

My department is the youngest in the organization, leading the e-commerce space. So it's quite an exciting time to be contributing to a legacy brand in the way that we are and taking it into the digital age.

Awesome. Thanks, Jandro.

So I'm going to start off wanting to understand how you've been maintaining Ackermans' design consistency alongside the need and desire to bring in the rapid experimentation that we started working with you on.

Yeah, it’s a good question.

I think, starting with the whole concept of A/B testing and testing in general, it's a completely new space within the company. Having launched the e-commerce platform last year on Shopify Plus, we are quite early adopters of Shopify and of e-commerce standards.

So there was a big desire and need for us to start trialing our ideas, and the best way forward has been the A/B tests we've been running with Hype over the past year and a bit. The need to test is really about getting understanding as fast as possible, but also meaningfully, so that we can better understand our customer, provide a service that is efficient and meets their needs, and also inform stakeholders in the organization.

A/B testing plays a pivotal role in adding value that we might otherwise never have had, so we try to keep that up and prioritize tests as best we can.

Yeah. And Jandro, we work closely with you on our CRO team in terms of strategy and briefing you into the design. Our head of UX/UI will step in and take some of the less intricate load, but it's been so pivotal and essential for us having your eye and your actual design hand on the bigger experiments, especially the redesign of the homepage, for example. You live, literally, in Ackermans' CI; it's second nature to you now. We really appreciate that aspect as well.

Thanks. Yeah, it's been an interesting space to explore, especially within my own career at Ackermans.

The intricacies that play a role in leading a customer to convert are interesting and fun, and analytical, at the same time. It's a blend of different forms of trade all sharing one space, which is again new for the business, because everything is structured around brick and mortar; that's how retail is regarded, through our physical stores. So taking those various departments and bringing them onto one shared piece of real estate on the screen is quite interesting.

Yeah. Exactly. A challenge, but a good challenge.

Awesome. Okay. So kind of next topic or question.

What is a recent A/B test we've run together where UX design choices have really impacted conversion rate? And secondary to that, what does collaboration between your team and ours look like during an ideation session?

So firstly, that recent A/B test that impacted conversion that you'd like to mention.

Yeah. The homepage is one, as you stated earlier; let's use that as an example.

We use our different data tools to inform decisions, as well as the Shopify analytics tools we have access to.

They really helped us understand our user entry points and how to optimize that experience in South Africa, especially within the e-commerce space and our target demographic.

There is a large LSM group within that demographic with limited access to means such as data, cell phone devices, and so forth. Informing and educating them was a big factor too, because some people might never have shopped online. So there's an educational mechanism we need to provide, while still ultimately helping the customer operate on the site and convert.

On the homepage in particular, as we ideated from the stats we were monitoring, we noticed that minimizing the journey from entry could get the customer into the shopping funnel far more easily and efficiently than presenting them with additional campaigns and content they may not have felt was important to them.

Our collaboration covered not just the homepage but other areas too, such as our sticky menu navigation. Or let me add this one: our menu navigation on mobile.

We had images for every department: women had an image, men had an image. The idea actually came from Hype and your team, who provided stats suggesting we should test it, based on findings you presented in, I think, our very first session together. We took that on and felt it was an interesting test to run.

It showed a massive improvement, and we then implemented a listing-style nav menu in place of image-based navigation. In our collaboration we share the same space, where ideas come not just from one person but from us both. Though we work independently, we work very much collectively in shaping our tests and ideas, then go through the final plan we've put together as a team: okay, this is how we're going to go, this is what we're going to run, and these are the metrics we're checking.

And then obviously we check in on whether they're meeting our standards or not.

Exactly. It's quite fascinating. We have weekly check-in meetings as a team, and often the Hype strategists will present four to six new test ideas a week based on actual data points we see using behavioral analysis tools on the site, like VWO Insights, as an example.

The Ackermans team will then pipe in, dig into our ideas a little deeper, and give them a stronger foundation based on company knowledge we maybe didn't have prior. So yeah, it's collaborative, which we really enjoy.

A real fifty-fifty give and take, which is our favorite type of client.

Yeah. Okay. Cool. So now let's dive into customer segments and some more AI-powered conversation.

As you mentioned, at Ackermans we've got quite a diverse customer base across South Africa.

How are you currently using, or planning to integrate, AI to personalize the shopping experience based on those differing segments?

That’s a good question.

At Ackermans, as I mentioned when we started this interview, we're in the early-adoption phase of the e-commerce journey. We're making sure, for the most part, that our basics are covered. By basics I mean site speed and the technical limitations we need to consider based on how our internal systems are set up. There are a lot of core fundamentals we're perfecting and improving so we can be as optimal as possible. However, when it comes to the AI conversation, we are starting to explore integrating search functionality where AI is a factor, and some shoppable personalization in carousels, so customers see carousels of products more aligned to their shopping behavior.

So there's a lot of room and opportunity to implement AI in our systems. And with Shopify, there are quite a few tools that already offer that functionality in their products and services. We're taking a phased approach to understanding the power AI will have within our shopping funnel, as a business funnel, and as a customer experience funnel, and how we integrate AI to better optimize and convert customers.

It would be lovely if there were an AI system that could audit the site in ways we're not able to, or are missing because of certain limitations. So there's still a lot of room and opportunity for growth within the AI space, especially at the stage we're at. And we've come a long way; within one year of launching, we've made significant progress.

So I think definitely, a year from now, we'll see quite an enhancement in AI personalization, I have to say.

Yeah, absolutely. We had that conversation not so long ago, when we came to your offices and had a really nice strategy session.

We wanted to unpack how we can personalize the website for new versus returning users, not necessarily create a completely different website. It sounds cool, but it would also make such a big difference, because, as you mentioned earlier, we really need to educate new visitors on shopping online. They click through from an ad, they're now on the site, and they've never shopped online before.

How can we personalize that experience for them versus a returning visitor, someone who purchases from us for their family every two or three months? So it's definitely a hot topic for us at the moment and something we're very excited to dive into.

Yeah, I agree. Further to that, tying into the segments you mentioned earlier, new versus returning customers, and new customers who don't yet transact with us, there is a different experience to offer each of those segments. Where we are in that journey, we're still learning what that means and how, with the potential use of AI, we can drive certain customizable features.

For example, when someone lands we might ask, "Is this your first time shopping with us?", and if they say yes, we offer them some experience that helps them. There are all these different ideas we could explore, because both the new and returning user bases are large, and new is significantly larger than returning, which is always interesting.

There are also other factors that aren't necessarily about AI, like brand loyalty and brand trust. People know the brand and shop in store, but they've never shopped online. That adds a different layer to the experience, because it's, "Can we trust it? We don't know Ackermans in this way." So there are these stigmas that, as a brand, we also need to overcome, and we keep pushing the needle.

Exactly. A last point on that is the power of AI in the different behavioral analysis tools we're using, and how it's speeding up our analysis of actual A/B tests. Instead of physically watching seventeen screen recordings, we can use AI to summarize those findings. That's powerful for your analysis and ours, helping us find insightful, data-backed new hypotheses from the data in those tools. So yeah, it's very, very exciting.

Great.

Okay, amazing.

Essentially, what role does behavioural data play in our design decisions?

I know we’ve kind of touched on this a little bit already.

Maybe we can advance on that a little bit.

Well, firstly, user behavior on a digitally driven platform is fundamentally the core and basis of understanding how to better design and equip our moving parts, so that the car, so to speak, can move forward swiftly and easily, without hiccups or breaking points. The behavioral data we receive, such as user click rates and heat mapping, and metrics like click rate, bounce rate, and exit rate, is very useful information about what is happening in the eye of the customer.

Another great behavioral feature is being able to watch session recordings of customers engaging on the platform. It's quite a tedious process to go through every single video, but it really shows how our customers behave on the site: are they enjoying it, are they frustrated, are they experiencing rage clicks or slow load times?
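As a side note for readers: behavioral tools flag rage clicks automatically, but the underlying idea is simple. Below is a minimal sketch of one common rage-click heuristic, repeated clicks on nearly the same spot within a short window. The thresholds and the `Click`/`has_rage_click` names are illustrative assumptions, not VWO's actual algorithm.

```python
from dataclasses import dataclass

# Illustrative thresholds: a burst of clicks close together in time and space.
RAGE_MIN_CLICKS = 3    # clicks in the burst
RAGE_WINDOW_MS = 1000  # time window for the burst
RAGE_RADIUS_PX = 30    # max distance between clicks in the burst

@dataclass
class Click:
    t_ms: int  # timestamp in milliseconds
    x: int
    y: int

def has_rage_click(clicks: list[Click]) -> bool:
    """True if any run of RAGE_MIN_CLICKS clicks falls within
    RAGE_WINDOW_MS and RAGE_RADIUS_PX of each other."""
    clicks = sorted(clicks, key=lambda c: c.t_ms)
    for i in range(len(clicks) - RAGE_MIN_CLICKS + 1):
        burst = clicks[i : i + RAGE_MIN_CLICKS]
        close_in_time = burst[-1].t_ms - burst[0].t_ms <= RAGE_WINDOW_MS
        close_in_space = all(
            abs(c.x - burst[0].x) <= RAGE_RADIUS_PX
            and abs(c.y - burst[0].y) <= RAGE_RADIUS_PX
            for c in burst
        )
        if close_in_time and close_in_space:
            return True
    return False

# Three fast clicks on one spot reads as frustration.
session = [Click(0, 100, 200), Click(300, 102, 201), Click(650, 99, 198)]
print(has_rage_click(session))  # True
```

Real detectors also consider the element under the cursor, but the time-and-radius idea is the core of it.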

So there's a world of information we have access to. However, as a UX designer and as a CRO team, we have to have a strategy to facilitate that entire process: where do we start, and what are our goals?

And then, what metrics are we using to determine whether our goals are being achieved? A lot of the time, from an organizational perspective, we focus on revenue as a goal, because ultimately that pays the bills and keeps us moving as an organization.

But there are so many other fundamental experiences we need to cater for, such as quality of experience. Even if someone doesn't convert, are they enjoying coming on to see products? Are all our products available in the various sizes online, or only a few? That plays into the likelihood of them returning and saying, okay, I'd like to come back, because they have my size, for example. So there's a lot of value in recognizing that we have these metrics, deciding how to direct them, and making sure we focus on the customer's experience, because I'm here for the user, sometimes more so than the needs of the business.

And Jandro, such a good point: these behavioral analysis tools are incredible, but you can really only get the most out of them if you have a strategy, KPIs, and a roadmap in place before you dive in and start analyzing, because otherwise it's an endless maze; the data available is overwhelming.

It's nice how, each quarter, we come together for a strategic workshop to set our KPIs for the quarter and the next six months, and make sure everything we do is optimizing towards reaching those KPIs.

Correct.

Yeah. Amazing. Okay. So, in terms of our traffic device split: it's heavily mobile focused.

And as you mentioned, with our target market we're often designing for data-conscious connections.

So when we approach performance-versus-experience trade-offs in designing our tests, we're very mobile-first in our design. Desktop is very much there too; we're very responsive in our way of testing.

Maybe we can touch on how we have enough traffic to run independent mobile and desktop tests.

And how you think when you're designing: is it mobile first? What does that process look like for you?

Good question. I'll tie it back to the organization stepping into the online e-commerce environment. Desktop seems like the most visually striking thing to focus on, especially for the people informing decisions about campaigns, banners, and all those kinds of things. And when the ball got rolling as we launched our platform, the mobile experience was degraded in some ways, because we were so focused on making sure the banner creative for campaigns was loud, visible, and taking up space.

So as we developed the site, we tailored our work towards mobile first. I predominantly design our mobile experience first as a means to validate choices and reasoning, since our user base is an eighty-twenty split between mobile and desktop. Where we experience performance trade-offs, it's potentially where we can reduce what's happening not just on the front end but also on the back end. From a technical perspective, there isn't a developer in our conversation today.

But they play a fundamental role in that area of the platform, optimizing and making sure that, for the most part, we can reduce the weight of banner creative: image sizes, video playback, and so forth. Are images optimized properly, for product images as well as creative banners? It's almost a checklist we have in place and are still learning to master. When we launch products online, these are the basic fundamentals we have to focus on.

Take our LCP score, the Largest Contentful Paint, which measures the largest piece of content or creative on the page. How do we better optimize that? Why is it happening?
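For reference, Google's published Core Web Vitals guidance rates LCP as "good" at or under 2.5 seconds and "poor" above 4.0 seconds, with "needs improvement" in between. A small helper makes the banding concrete; `classify_lcp` is an illustrative name, not part of any tool mentioned here.

```python
# Core Web Vitals thresholds for Largest Contentful Paint (LCP), per
# Google's published guidance: good <= 2.5 s, poor > 4.0 s.
def classify_lcp(lcp_ms: float) -> str:
    if lcp_ms <= 2500:
        return "good"
    if lcp_ms <= 4000:
        return "needs improvement"
    return "poor"

# e.g. a heavy hero banner loading over a mobile-data connection
for ms in (1800, 3200, 5400):
    print(ms, classify_lcp(ms))
```

On a data-constrained mobile connection, oversized banner images are exactly the kind of asset that pushes LCP into the "poor" band.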

We wouldn't know this if it weren't for our user behavior tools, so it's quite important how they inform that, because mobile can be a lot more challenging. It's amazing, and it's accessible, but it's dependent on data. Someone could be anywhere in the country using mobile data rather than Wi-Fi, for example, whereas desktop often has a direct connection or sits in a set Wi-Fi environment, at work and so forth, catered to faster speeds. There are all these factors we've really got to consider when optimizing.

But to answer the question directly: we optimize the mobile experience first. Then secondly, what does that look like? Where do we make better improvements and select functionality?

And also our different Shopify plugins that are all speaking to each other and to legacy systems, the communication happening behind the front-facing experience we offer our customer.

Yeah. Thanks, Jandro, those were super insights. Just to mention, one item on our roadmap before the end of the year is in-person usability focus groups, where we can gain real insights from customers who come into the office.

We're going to interview them and run live usability tests on a Wi-Fi-free network, actually sitting with them on their mobile devices and doing live testing, which will be exciting. I think we'll gain new insights we've never had access to before.

Yeah, I'm super excited about that. I know it's going to add so much value to informing decision-making, and it gives us real-time human interaction. A lot of my role involves working on a computer, and the output I'm creating is for a digital screen. However, there is a human engaging with that piece of technology, whether it's an e-commerce platform, which is my industry, or, for anyone on the call working in a SaaS product environment or any digital means, even a simple informative website: it requires a human to engage with it in order for it to be a success. So bringing that kind of testing into the organization is really, really amazing.

I think that alone is a different segment, a different session type, a whole different kind of user test from the A/B tests we're already doing.

They obviously tie in hand in hand, but bringing that human element into the testing phase will be wonderful, and I'm so excited.

Absolutely. It's the glue that brings everything together. As you said, we're working digitally to provide, hopefully, a concrete human experience. So yeah, very excited for that.

Awesome, okay, I'm going to move on. It would be interesting to understand design velocity versus testing validity. What is your approach? How do you manage something we call design debt when running multiple experiments? We move at quite a fast pace in our ideation; we might have a women's campaign page with four to six different testing ideas put together. How do you manage that in your design mind?

To be honest, it’s quite an exciting experience for me.

Only because, as I mentioned when we started the interview, this is a new phase in the journey at Ackermans. Fundamentally, we start by understanding how we're meeting our KPIs, both the goals for the customer and business needs. When we launched our platform, we had no backing on what would be the most effective layouts, colors, buttons, toggles, switches, iconography, all the various forms of the design experience we're offering.

It sometimes feels like we could go off on a tangent, because there are so many directions we could run with all the tests we want to bring forward. But to be honest, there is a method to the madness, and I think that's the nature of the approach we're taking with our A/B tests.

When I say it's not chaotic, just for anyone listening on the call, it's simply that there are so many opportunities for us to trial things. Our head of e-commerce is a big supporter of trialing experiences for our customers to better inform decisions, so we can be more concrete in making selective choices going forward. When it comes to design debt and the overstimulation of what we could provide, it's knowing that we wouldn't run a homepage test now and then another homepage test the following week, for example, or two different layouts consecutively over time.

We take a structured approach to things that live within the same space. So there's a reasonable balance between testing whatever comes to mind and putting it into a strategic approach. When we first started, we focused on our core functions: navigation and search. We tested other things too, but search and navigation were fundamental.

We made the search bar sticky on mobile, which contributed a massive uplift in conversion rate, along with the menu we discussed earlier. As those tests concluded, we moved to where we currently are: our core pages, the homepage, PLP, and PDP.
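As an illustration of how an uplift like the sticky search bar's can be sanity-checked, here is a standard two-proportion z-test sketch. The visitor and conversion counts are invented for the example, and testing platforms such as VWO use their own statistics engines rather than this exact test.

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided two-proportion z-test: is variant B's conversion rate
    different from control A's? Returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers (not Ackermans' real data):
# control at 2.0% conversion vs. sticky-search variant at 2.5%.
z, p = two_proportion_z(conv_a=400, n_a=20000, conv_b=500, n_b=20000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the usual 0.05 level
```

The point of the sketch is simply that "massive uplift" claims rest on sample sizes; the same 0.5-point difference with a tenth of the traffic would not reach significance.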

So we're in that phase of our testing, and we're getting more granular, moving to the PDP and optimizing the PDP experience for the customer, potentially adding quicker checkout options down the line.

All those kinds of things: moving from the site structure that matters in the funnel, to where customers see product, and then to functionality such as forms and input fields. How do we better equip that?

So yeah, that's the logical approach we've applied to our tests and to design in general.

Yeah. Amazing. Thanks, Jandro. That's the beauty of A/B testing: the learnings are endless. A test might lose, might not increase end-of-funnel conversion, but the learnings from that single test are never-ending, as are the re-hypotheses we can then go into.

Maybe there are small takeaways, a nice engagement metric that increased, say, when we ran your navigation approach, and how that benefits the business, perhaps by bringing a certain category into the nav menu, etcetera. So yeah, it's super interesting.

Yeah. And another kind of test is: how do we test creative ideas? So, take banner creative.

Structurally the same, the banner is the same width and height, but from a creative perspective the two are separate, so which one works better? That's also a phase of testing I'd like our experience to evolve into, where we can understand creative online, because it helps the brand inform creative decisions, not just from the e-commerce point of view but across all the channels that support the online platform: email, WhatsApp communications, our social media channels. The A/B tests are for the online e-commerce shopping experience, but they inform much bigger brand decisions, helping the other stakeholders involved tailor their focus on online so they also know what works and what doesn't.

Exactly. And that's actually a nice intro to the next discussion point, thinking again about AI and various tools.

Essentially, how we can accelerate design variant creation by using tools. Currently we're very hands-on with manual design. But when we start bringing in the creative testing you just mentioned, testing different color backgrounds for example, we can potentially start looking at an AI tool and incorporating it into designs simple enough to allow for it, I think.

But for right now, and you can touch more on this, I think you're pretty much designing everything manually at the moment, Jandro.

Yeah, that's right. There's quite a manual approach to the way we create design. Just for everyone on the call: in the organization, creative largely sits within the marketing team and funnels towards online. So I only provide guidance on creative execution, in terms of what could potentially work based on what's being created by brand.

With regards to actually designing within the platform, there's a design library I keep updated, which brings efficiency to the way all the different components function. It's an ongoing library that needs revisiting: creating, deleting, replacing, and adding, because everything is evolving.

Since we're only a year and a half into our e-commerce platform, the design library is still growing, and it still requires more adoption from our marketing team on the creative side: how to position certain creative that works one way but not another, how it should be placed. We're still on that journey of establishing it, but at the current stage it's quite manual.

Nice, Jandro. Thanks. Okay, now it would be interesting to unpack a test that we got really nice learnings out of but that maybe failed in terms of end-of-funnel conversion.

Any learning or takeaway that you took into a redesign or a strategy going forward, based on something you learned in a test?

Do you have any scenario like that at all?

There are a few. One that stood out recently was our sale label: we implemented dynamic labels that told the customer the percentage they would get off, instead of just showing the word "sale" on a red label. It wasn't necessarily a failure; the test was whether seeing a percentage value as the discount helped users convert or add the product to cart. It was interesting to see that they wouldn't necessarily focus on the terminology or the percentage.
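The dynamic label variant described here is straightforward to express; a minimal sketch, where the prices, rounding, and the `sale_label` name are assumptions for illustration rather than Ackermans' actual implementation:

```python
# Derive a dynamic "% off" badge from list vs. sale price, in place of a
# static "sale" label (the variant tested above).
def sale_label(list_price: float, sale_price: float) -> str:
    if sale_price >= list_price:
        return ""  # not discounted, no label
    pct_off = round((list_price - sale_price) / list_price * 100)
    return f"{pct_off}% off"

print(sale_label(199.99, 149.99))  # "25% off"
print(repr(sale_label(199.99, 199.99)))  # not discounted -> ''
```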

If something was on sale, they would still shop it regardless, or not at all. So that one really stood out recently. I'm just trying to gather my thoughts here, because we've tested quite a few.

Jandro, you know what's interesting? I was thinking of the inverse of that: learnings from positive wins. What's been quite fascinating is that we're busy testing, at the moment on the campaign landing pages for women, having category bubbles above the fold showing the various categories. We'd tested a feature like that at various other touchpoints on the site.

And that was successful, so now we've taken it into the design of different touchpoints, which has been interesting. It's always worth redeploying something that works and seeing whether it works again across the entire funnel. That's been quite interesting for us to see.

Yeah, that one was super surprising, but also not, at the same time. I was surprised how much it grew from where it was positioned before.

For context: on the home page, those bubbles lived at the bottom of the page, listing all the SBUs. It was still a highly clicked area even though it sat right at the bottom. So the theory was: if we move it to the top, how would it convert customers? And we saw such a spike by moving it to the top, which again speaks to user behavior. Customers want to get into the PLP and PDP journeys quicker so they can see the product.

So that was really interesting. And it's similar to the homepage, which was again such a successful test that we're now in the process of implementing it.

And another test we wanted to run, which we didn't necessarily do as an A/B test, was moving our checkout flow from a multi-step process to the single-page checkout we have on Shopify, because there were quite a few steps needed to get the customer to the end. Shortening that helped us get the customer over the line much faster.

There are some takeaways there for everyone listening. For some solutions there's a genuine need for testing, but for others you just know, from experience with the journey, that a choice will benefit the end result. So it's always using some intuition, but also using data: combining the two.

And the same goes for an A/B test or testing ideas. When we start on an idea, it's: what is your gut feeling, and what is the data showing? And then together as a team we bring those two elements together to test the right hypotheses for the different metrics we want to move.

So, yeah, I think most of our tests, ninety-five percent of them, were a success. And where they failed in any shape or form, they didn't fail significantly; they just didn't deliver the uplift we were aiming for. So it's been quite interesting, that kind of feedback we've had so far.

Completely, absolutely. Okay, Jandro, I'm going to come to one more question for you, a bit of advice for the audience listening today: anyone in the UX/UI space, the design space, the marketing space, any product space. When you're designing for, in your case, a retail website like Ackermans, how do you balance moving towards an optimization approach, trying to encourage and motivate the user to purchase, while still maintaining a very brand-centric, educational approach?

How do you think about that when you're designing, so you don't over-push FOMO-engineering strategies and that sort of thing to get them to purchase, but keep it a little subtle and premium?

That’s a good question.

I wish the other stakeholders were in the room as well.

But from my side, as I mentioned, I focus on who the customer is and what they're coming to Ackermans to do. With that as the opening statement, I try to remind everyone else, because within Ackermans it's myself and other stakeholders making the decisions you see as the end product. So internally, as a team, a lot of my design can't stand on its own; it needs to speak through the other channels and other forms of understanding, and vice versa. Take the idea of using gimmicks to get customers over the line: how do we find the balance between that and still providing gamification, for example, while making sure we're not taking the customer away from actually shopping? Or, if we introduce gamification, what impact does it have on site speed?

Those are all considerations we need to bring in, because no idea should be left off the table; it should always be considered. It might not work now; it might work later, maybe two years from now. Then, taking into account: does this meet the goal of our customer? And thirdly, what are the challenges that impact the success of this design feature we want to implement?

So I use that kind of approach to help others in my team make the choice, but they also inform me: this is something we need to do. And we're an online and brick-and-mortar company. If you're online only, your experience has its own challenges, but you're tailoring purely to the online experience, and every effort goes there, which can really push the needle. As a brick-and-mortar business as well, we need to think about what's happening in that space so that it translates online too. So for me, I focus on the design and also inform others: maybe this idea won't work, or this is an excellent idea.

This is how I'm thinking of going about it, and what are your thoughts? And then getting feedback from you guys at Hype as well, just to help brainstorm how to do better. Maybe we haven't started yet and I'm not too sure.

Or I am sure. Let’s go for this and test it.

So, yeah, it's a blend of knowing that we're trying new things because we don't have anything to fall back on within our own library of experience. At Ackermans, we're trialing and experimenting so that we build that library, and can go back to it and say: we've done this, this is good.

Let’s bring it back in or this is why we have what we have.

And our user tools provide information that flows up to stakeholder and executive level, where they can see how a piece of creative is changing revenue at the bottom line, or how a technical change has contributed to the overall experience. Sometimes it's also useful to show them where things are not so great, because of the challenges we're having, so that it helps them help us, whether that's a decision we need made or more financial funding. There are so many parts to it.

Sure, yeah, thank you, that was so well answered. It's been lovely chatting to you, Jandro. Thank you again.

Thank you, Emily. I appreciate it. It was so lovely chatting, and best of luck to everyone with everything. Have a wonderful rest of 2025.

Speakers

Emily Isted

Director of CRO, Hype Digital

Jandro Saayman

Head of User Experience Design, Ackermans

