The Art of Being Stupid – Why Testing Matters More Than Everything Else
Note: This is a guest article written by Tyler Hakes, the strategy director at Optimist, a full-service content marketing agency. He’s spent nearly 10 years helping agencies, startups, and corporate clients achieve sustainable growth through strategic content marketing and SEO. Any and all opinions expressed in the post are Tyler’s.
Almost 10 years ago, I got my first job in marketing.
I was right out of college, and I was eager to prove myself and light the world on fire.
Like most people in their early 20s, I was convinced that I knew everything. I thought I had all of the solutions to every problem. I was a marketing mastermind, of course, because I had managed to get a few hundred people to follow me on Twitter.
It didn’t take me long to learn that I didn’t quite have all of the answers. In fact, I had a lot to learn. And it became more important for me to understand what I didn’t know and to keep learning than to feel like I already had the answers.
Since then, I’ve worked for agencies, corporations, and startups. As a freelancer and agency owner, I’ve done marketing for every kind of company imaginable—from custom hats to apartment rentals. I’ve put together dozens of content marketing strategies and written/published thousands of articles, ebooks, and landing pages.
In all that time, I’ve come to realize something really, really important.
I don’t know anything.
Sure, I have accumulated a lot of knowledge and skills in the digital marketing space. I understand, at a high level, how things work. And I know, directionally, what the best practices are for achieving results.
But when it comes to executing any particular tactic, writing a particular type of content, or advertising to a particular market, each scenario is a little different. What I think will work best is usually wrong.
With this realization in mind, I’ve developed a kind of manifesto. It’s a way to remind myself that it’s okay to not have all the answers. It’s okay to be wrong, as long as you commit to finding the right answer eventually. Embrace a testing mentality.
Assume You’re Wrong
The biggest challenge with having a testing mentality is accepting that you are almost always wrong.
Let me say this again: You’re wrong.
It can be difficult to swallow. But don’t take it personally. Don’t link your personal worth to your ability to guess which messaging will get the most clicks or which blog post will drive the most social engagement. That’s just silly.
This isn’t Mad Men. You’re not Don Draper. So, don’t spend a million bucks trying to come up with the best idea. We live in a digital age of data. We’re able to track, measure, and test anything and everything that we do in business. There should be no more guesswork.
And what passes for “conventional wisdom” about optimization best practices is also generally wrong. (That’s why it’s called “conventional wisdom,” after all.)
So, just assume that whatever you think is “best” is probably wrong and that you’ll need to validate any idea you have against cold, hard data.
Rather than fight this, I’ve come to embrace it.
It’s become a driving force for my work and my business. I assume that I know nothing and that everything—anything—is open for testing. Test, fail, and learn. In that order.
And instead of taking it personally, I just accept that it’s impossible for someone to know the right answer 100% of the time.
As such, it makes way more sense to defer to the data whenever possible.
Unfortunately, you can’t possibly test every single variable to determine the single best approach, messaging, targeting, or design.
But you can get a head start.
Begin any testing cycle by looking at companies that test and optimize regularly. Then, steal their findings. Rather than starting from square one, begin your own testing with their current best case—the design, ad, or content that they’ve found to be most successful.
You can do this in a number of ways.
- Look at crowd-sourced A/B or multivariate test communities like Behave.org.
- Find and read success stories on testing outcomes.
- Visit competitors’ websites and emulate what they’ve done.
- Use social media to uncover specific messaging/positioning/CTAs used by competitors.
For our work on content marketing, we begin any client engagement with an extensive research and competitive analysis process. It’s the foundation of our content marketing strategy: what is already working for competitors and other companies in the space?
We’re able to gain years (or decades) of knowledge in a matter of weeks. We avoid expensive, time-consuming, and frustrating trial and error by just stealing what works and iterating on it from there.
Prove Yourself Right (Or Wrong)
Once you have learned to not internalize the results and found a base to start with, it’s time to test.
Depending on what it is you’re testing, you’ll want to generate dozens—or hundreds—of variations. Try different colors, placements, layouts, or strategies.
Of course, a tool like VWO will help you execute these tests quickly and measure the results.
Create an experiment sheet that allows you to track each experiment and its outcome. Remember to constantly challenge your own assumptions: assume you’re wrong and that you can come up with a variation that works better.
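When you do compare variations, it helps to check whether the difference is statistically meaningful rather than noise. Here's a minimal sketch of that idea using a two-proportion z-test in Python; the function name and the example numbers are illustrative, not from any real campaign:

```python
import math

def ab_test_significance(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from control A's?"""
    p_a = conv_a / total_a
    p_b = conv_b / total_b
    # Pooled rate under the null hypothesis that A and B perform the same
    p_pool = (conv_a + conv_b) / (total_a + total_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical example: control converted 120/2400, variant 156/2400
p_a, p_b, z = ab_test_significance(120, 2400, 156, 2400)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")
# A |z| above roughly 1.96 is significant at the 95% confidence level
```

A check like this is one more guard against declaring a "winner" too early—exactly the kind of wishful thinking a testing mentality is supposed to stamp out.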
This kind of data-driven testing mentality applies not only to tactical tweaks or changes. You can assume a similar mentality for your entire strategy.
When we work with a new client on content marketing, we make a whole bunch of new assumptions.
Each piece of content that we create serves a strategic purpose within our larger framework. Because of this, we have a specific goal for that piece—to generate search traffic, to earn links, to generate social shares, and so on. And this is the benchmark that we use to measure our effectiveness.
So, we may begin with an idea about which kinds of content will best accomplish those goals.
But, in most cases, we have never created content in this particular market. We have never tried to build relationships within this particular community. We’re just guessing (per our past experience with other clients and other industries).
This means that what we really want to do is try what we think will work, get the results, and then incorporate that data to help us improve in the future. A lot of times, we’re wrong. If we didn’t adopt a testing mentality, then we would just carry on being wrong.
Obviously, this is not ideal. It’s better to be wrong and to learn from that mistake than to be blind to your mistakes. This is why we apply a testing model to everything from our overall strategy to specific, tactical implementation—content flow, calls to action, outreach emails, and so on.
We want to achieve the best results we can, even if it means that we admit we were wrong.
Do It All Over Again
Think you’ve found the right answer? You’re probably wrong—again.
Any test is only as good as the variations you put into it. So, while you may have identified a clear winner among the variations you tested, that doesn’t mean you’ve objectively identified the best possible solution.
Whatever is working best now might perform only half as well as the true best case. And it’s only through continued testing that you’ll eventually hit that variation.
It’s the pursuit of continuous improvement. It’s relentless.
This is the foundational idea behind “growth hacking,” which is really just a data-driven, experimental approach to growth. It takes trial and error—over and over again—ad infinitum.
It’s why many software teams have embraced agile development: it allows for iterative progress and improvement rather than investing all of your time and resources into a single window of opportunity.
Testing isn’t just about making small tweaks. It’s about embracing a culture of continuous learning and improvement. It’s about the pursuit of truth, even when it makes you feel stupid.
And it all starts by admitting that you don’t have all the answers.