Management | Nov 11
Databox on December 21, 2015 (last modified on March 11, 2016) • 7-minute read
Conventional wisdom says the data always wins. People have biases; numbers are neutral. Ignore the facts and you’ll sink your business, one delusional decision at a time.
Personally, I’m more #teamdata than #teamgut (partly because “teamgut” sounds like a parasitic infection). I like numbers. I not-so-secretly think Excel is fun. I’ve built forecast models for more than one company. I update dashboards to relax. I do basic math calculations in my head to distract myself on the treadmill.
You get the picture.
But behind every data point, there's a human being who tracked, interpreted, or selected the metric you're relying on to make your point. And that's where it gets complicated. Because the truth is, the numbers are not always right. Sometimes the picture the data offers is incomplete, incorrect, or ultimately short-sighted. The key is knowing when you need to pay attention to something — and when you need to dig further, or ignore it altogether.
I remember when my team ran a test a couple years ago around the call to action on our website. The test version produced far more clicks than our control, and we were ready to celebrate a win. But when we looked at how many people actually started a trial with us after clicking through our CTA, we found we got better results with the control. The initial results were misleading; it wasn’t until we looked deeper that we got the full picture.
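The reversal we saw is easy to reproduce with made-up numbers (the figures below are hypothetical, purely for illustration — not our actual campaign data): a variant can win decisively on clicks while losing on the metric that actually matters, trial starts.

```python
# Hypothetical funnel numbers for two CTA variants -- illustration only.
# Each variant: site visitors -> CTA clicks -> trial signups.
variants = {
    "control": {"visitors": 10_000, "clicks": 400, "trials": 60},
    "test":    {"visitors": 10_000, "clicks": 700, "trials": 42},
}

for name, v in variants.items():
    ctr = v["clicks"] / v["visitors"]         # click-through rate: looks great for "test"
    trial_rate = v["trials"] / v["visitors"]  # the outcome we actually cared about
    print(f"{name}: CTR={ctr:.1%}, trial rate={trial_rate:.2%}")

# "test" wins on clicks (7.0% vs 4.0%) but loses on trials (0.42% vs 0.60%).
```

Judged by click-through rate alone, the test variant is the obvious winner; judged by trials started, it's a regression.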
Lesson learned: If you’re using data to make a call, make sure it directly supports your desired outcome. Don’t make assumptions.
And of course, there are times when the data is just plain wrong. I recall a time, about eight years ago, when I was ready to completely change the way we processed subscription renewals at my company, based on data that clearly showed our method wasn't as effective as we thought. Fortunately, the "facts" didn't ring true with the VP of engineering, and he uncovered an issue with the reporting at the last minute that saved me from making a costly mistake.
Lesson learned: Always make sure you know where your data comes from. How it’s collected will have a dramatic impact on accuracy.
Then there’s the data-driven yet short-sighted trap. Take this perspective, from a 2001 Bloomberg article titled “Sorry, Steve: Here’s Why Apple Stores Won’t Work”:
Given the decision to set up shop in high-rent districts in Manhattan, Boston, Chicago, and Jobs’s hometown of Palo Alto, Calif., the leases for Apple’s stores could cost $1.2 million a year each, says David A. Goldstein, president of researcher Channel Marketing Corp. Since PC retailing gross margins are normally 10% or less, Apple would have to sell $12 million a year per store to pay for the space. Gateway does about $8 million annually at each of its Country Stores. Then there’s the cost of construction, hiring experienced staff. “I give them two years before they’re turning out the lights on a very painful and expensive mistake,” says Goldstein.
Instead of going out of business in two years, Apple Stores are leading all US retail stores in sales per square foot. So much for Goldstein’s predictions.
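For what it's worth, Goldstein's breakeven arithmetic itself checks out; the model, not the math, was what failed. A quick sketch of the figures from the quote:

```python
# Goldstein's breakeven arithmetic from the 2001 Bloomberg piece.
annual_rent = 1_200_000   # estimated lease cost per Apple store, per year
gross_margin = 0.10       # typical PC-retail gross margin at the time

# Revenue per store needed for gross margin to cover the rent alone:
breakeven_revenue = annual_rent / gross_margin
print(breakeven_revenue)  # the $12M/year figure cited in the quote

# Gateway's Country Stores, the obvious comparable, did $8M/year each.
gateway_revenue = 8_000_000
print(breakeven_revenue > gateway_revenue)  # on paper, Apple looked doomed
```

The arithmetic was sound; the assumption that Apple Stores would perform like existing PC retail was not.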
Lesson learned: To quote a great blog post from Intercom, “sometimes you need to be willing to say ‘we believe in this, fuck what the data says’.”
Let’s be clear: I’m not advocating that you ignore data. I’m just suggesting that it should be one input into your decision-making process, and that you look at intuition as an equally viable input. That doesn’t mean “because I want to” or “because it’s easier” are valid reasons to make a business decision. Don’t confuse arrogance or laziness with intuition. But when it’s genuine, intuition can be more powerful than analytical thinking.
That’s because the type of intuition that drives business success isn’t completely random. Experience is typically touted as a key ingredient, and I won’t argue that experience plays a role. But I believe there are two characteristics that are even more critical to developing powerful (and successful) intuition as a business leader: deep empathy for your customers, and the resolve to act on what you know.
Steve Jobs launched the Apple Store because he was disgusted with the retail experience for buying a computer. He made a call that flew in the face of empirical wisdom because he understood what his customers were going through. He didn’t need surveys or consultants to convince him the status quo was flawed. He knew it, so he resolved to make it better and had the strength of will to follow through.
Of course, experience can make that type of market understanding and tenacity easier to come by – but it’s not enough to simply have been doing something for a long period of time. You really need to cultivate empathy for your customers and build (or rely on) nerves of steel in order to make successful business decisions based on intuition. That resolve is incredibly important – because it’s the only thing that will motivate you to stay the course when your decision contradicts cold, hard data.
You may believe in the power of intuition but still be reluctant to apply it to business decisions. Relying on data feels so much easier than trusting your gut. After all, making a call based on instinct requires bravery… or is it the aforementioned delusion? It’s an act of confidence… or hubris? You’re going against the grain – or wait, maybe you’re simply an egotistical maniac?!?!
Going back and forth between those thoughts is not an easy dynamic to deal with. The bottom line is, you’re either going to be right or you’re doing something incredibly stupid that you’ll later regret because everyone told you it was wrong.
Yet if you’re in management, your job comes down to making decisions. The numbers may give you a false sense of security, but when it comes down to it, constantly delegating the decision-making to the data means you’ll miss opportunities to do great things.
I’ll leave you with one more personal anecdote: in 2014, I was getting ready to launch the first display advertising campaign my company had ever run. My design team delivered banner creative that broke every single best practice on the books: it used simple text instead of visuals, didn’t have our logo or company name on every frame, and rotated extremely slowly. I had more than one advertising rep go out of their way to tell me they thought the campaign would fail. They sent me anonymous stats of other campaigns that had bombed because of similar “issues” with their creative.
I’m not proud of this, but I freaked out. All signs pointed to this ending in disaster, and I worried that if the results were abysmal any support for future campaigns would go out the window.
So I ended up having a heated debate with my creative director. He talked me off the ledge (though he probably wanted to push me off it) by pointing out that if every other advertiser was adhering to the same set of “best practices,” our campaign would stand out. I had worked with this creative director and his insanely talented design team for more than a year, and I knew they understood and cared deeply about our audience. Running this creative may have been risky, but I had every reason in the world to trust them. So I took a deep breath, disregarded the well-meaning advice and launched our ads.
And in the end, the ads far exceeded the average performance for every single placement we ran. In fact, those damn banners did better than any display campaign I’d ever managed or read about.
Lesson learned: Sometimes delivering better results means ignoring the data.