Risks of Rushed Tests and Relying on Limited Data – Insights from Jared Brown
Welcome to the third edition of our “CRO Perspectives: Conversations with Industry Experts” VWO blog series. In this series, we invite some of the brightest minds in conversion rate optimization to share their insights, experiences, and best practices. We aim to offer a wealth of insights for broad learning and application through this initiative. Whether you’re new to CRO or an experienced professional, there’s something here for everyone to learn and apply.
Leader: Jared Brown
Role: CEO at Hubstaff
Location: Atlanta, GA
Speaks about: Usability design • High-performance computing • Rich web applications
Why should you read this interview?
Jared Brown, the CEO of Hubstaff, is a driving force behind the company’s mission to revolutionize workforce analytics and time tracking for distributed teams. With a rich background in development and product design as the former CTO, Jared instills a mission at Hubstaff: to empower companies with comprehensive data on their internal operations to maximize productivity. Leveraging his extensive testing experience, Jared shares invaluable insights on CRO, showcasing how Hubstaff exemplifies the integration of testing to consistently enhance its product offerings. Hubstaff’s story shows how testing can transform the development of SaaS products.
Did you know that Hubstaff found remote employees have deeper focus and fewer interruptions than those working in the office? Their study showed that remote workers spend more time on focused tasks, with 59.48% of their work week dedicated to “focus time”—a block of 30 minutes or more of productive work with minimal distractions. In contrast, in-office teams dedicate only 48.5% of their week to such focused work. These findings suggest that remote work can lead to greater productivity and efficiency. If you’re interested in exploring this topic further, you can check out their detailed report.
Driving force in your career journey
I was fascinated by the history and impact of major software companies like Apple, Microsoft, IBM, and Oracle. This fascination drove me to teach myself programming and eventually led to the foundation of Hubstaff, a time and productivity solution for global teams. What keeps me engaged is the continuous challenge and opportunity to innovate and significantly impact how companies manage their remote, hybrid, and in-house staff effectively.
Memorable A/B test and its outcome
We ran A/B tests in two different scenarios, both aimed at improving conversions on high-traffic pages in key areas.
When implementing dramatic design changes, we needed to make sure we weren’t taking a step backward by regressing our conversions. So the new design had to perform at least as well as the old one.
A/B testing pitfalls to avoid
A common mistake I’ve observed is not giving tests enough time to run or basing decisions on insufficient data. This can lead to premature conclusions that aren’t statistically significant, potentially steering strategies in the wrong direction. Another pitfall is testing too many variables at once, which can make it difficult to identify which change impacted the results. Businesses should focus on a controlled approach, changing one element at a time for clearer insights.
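The “insufficient data” pitfall above is easy to see with numbers. Below is a minimal sketch (not part of the interview or any specific tool; the function name and sample figures are illustrative) of a standard two-proportion z-test, showing how the same observed lift can be statistical noise at low traffic yet clearly significant once the test has run long enough:

```python
import math

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a / conv_b: conversion counts; n_a / n_b: visitor counts.
    Returns (z statistic, p-value). Illustrative example only.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# A test stopped early: 6% vs 10% conversion on only 200 visitors per arm.
z_early, p_early = z_test_two_proportions(12, 200, 20, 200)

# The identical observed rates after 10x the traffic.
z_late, p_late = z_test_two_proportions(120, 2000, 200, 2000)
```

With 200 visitors per arm the apparent winner is not significant at the conventional 0.05 level, while the same rates at 2,000 visitors per arm are — which is exactly why calling a test early can steer a strategy in the wrong direction.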
The future of A/B testing in the age of emerging technologies
Looking ahead, I envision the future of A/B testing to be greatly influenced by AI and Machine Learning. These technologies can process vast amounts of data more efficiently, predict outcomes with higher accuracy, potentially automate parts of the testing process, and allow teams to take action (based on test results) with incredible speed. Imagine getting A/B test results back and implementing changes to an entire campaign on the same day. That level of speed will be essential as consumer behavior accelerates in parallel with the increasing speed of trend cycles.
Advice for early CRO adopters
My advice is to start small but think big. Focus on understanding your audience deeply and choose one or two key metrics you want to improve. Design your experiments around these metrics and learn from each test, whether it succeeds or fails. Remember, the goal of A/B testing is not just to increase short-term metrics but to gather insights that can drive long-term strategies and innovation. Be patient, be methodical, and let data guide your decisions.
Most liked features of VWO
For us, the platform needed to be easy to integrate with. VWO had clear and concise documentation, and we didn’t have to sit through sales calls to get up and running. It also had transparent pricing, so we knew exactly how much we would pay once we confirmed that VWO suited our needs. They even offered a trial, so we could test the product and see if it fit. We only used redirect A/B tests, where the variant had the same URL as the control with a custom ab parameter. To track the A/B test results, we used targeted page views and triggered VWO custom goals. Besides this, we also checked the results in our internal analytics tool, because VWO gave us enough flexibility to use the product according to our needs.
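To make the redirect setup above concrete, here is a minimal sketch of how a split like the one described might be wired up server-side. Only the `ab` parameter name comes from the interview; the function names, bucket names, and weights are illustrative assumptions, and sticky per-visitor assignment (cookies, etc.) is out of scope:

```python
import random
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def assign_variant(weights=(("control", 0.5), ("variant", 0.5)), rng=random.random):
    """Bucket a new visitor by a weighted coin flip (illustrative only)."""
    roll, cumulative = rng(), 0.0
    for name, weight in weights:
        cumulative += weight
        if roll < cumulative:
            return name
    return name  # guard against float-rounding edge cases

def redirect_url(url, bucket):
    """The variant keeps the control's URL, tagged with a custom `ab`
    query parameter, matching the redirect-test setup described above."""
    if bucket == "control":
        return url
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["ab"] = bucket
    return urlunparse(parts._replace(query=urlencode(query)))

# redirect_url("https://example.com/pricing", "variant")
#   -> "https://example.com/pricing?ab=variant"
```

Keeping the variant on the same URL plus a query parameter is what lets both the testing tool’s custom goals and an internal analytics tool segment the same traffic independently, as described above.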