Michael Brown on August 22, 2017 • 7 minute read
If you’re like most content marketers, you cringed upon reading that quote.
That’s because you probably assume these two KPIs always reflect the engagement, quality and relevancy of your content. Your marketing instinct is telling you to get those numbers up!
In previous work lives, that would have been my reaction as well. But when I shared this stat with the nDash team a few months back, I was thrilled. We all were. Why? Because the decrease came with a two-fold increase in sign-up conversion rates to our content creation platform—just as we had hoped.
Months prior, as the sign-up rate slowed, our hypothesis was that we were over-explaining our service. Too much content on too many pages. In the early days of startups, where the founders want people to know everything about the product, this is often the case. So we decided to curtail our content (more on this below) with the hope that people spend less time on the site, and more time inside the platform, which they could only do by signing up.
We were right. After making the changes, we saw that visitors were no longer wandering from page to page, never to return. They were finding what they needed, making sense of it, taking action, and returning in greater numbers. To recap: pages per session and session duration went down, and sign-up conversions doubled.
If I had held onto my assumption that these metrics always had to maintain a certain level, we would have spent our time adding more web pages, writing longer copy, rethinking our content strategy, testing new keywords, redesigning our homepage and so forth. Not only would those tasks have been colossal wastes of time (in this context) they would have prevented us from reaching our goals. The same is true for any false marketing assumption: it will divert, distract and detract from your success.
Instead, we achieved our goals by challenging our assumptions.
Maybe it’s time for your brand to do the same?
Like us, you’re probably drawing at least a few inaccurate conclusions from accurate data points. So in this post (using ourselves as an example) I’d like to dive deeper into why we sometimes make these wrong assumptions, the damage they cause, and how to avoid them.
Before nDash, I worked in the marketing department of a large software company. We targeted enterprise IT departments for six-figure deals. Our 70+ inside sales reps were hungry for leads, so it was marketing’s job to generate them via whitepapers, demo requests, pricing inquiries and so forth. The more pages a visitor viewed, and the longer they stayed, the more likely they were to convert with one of our offers.
We were quite successful with this approach, and because of this, I carried this mindset with me to nDash. We got started using all the same tactics, measuring the same KPIs, and the numbers looked healthy. But it wasn’t effective, for a reason that now seems so obvious: nDash is a self-serve platform with no sales team, and with an average revenue per customer far less than six figures.
So we asked ourselves, “What’s the point of having visitors view more web pages, for longer periods of time, when we just want them to sign up? What’s the point of generating leads when it’s easier to sign up? What’s the point of generating leads without a sales team to follow up on them?”
Once we realized our assumptions were wrong, and that high pages per session and long session durations were actually a detriment to our business model, we acted accordingly.
We’re still seeing good results from this shift, but I often think we can and should go further. For instance, look at the first-time visitor experience of platforms like Zapier, UpWork and 99Designs. You’ll find they have comparatively few web pages, and almost all of them are designed to get you to become a user. Sorry, there isn’t a single whitepaper to be downloaded. I checked 😊.
The point here is that the importance and relevance of any data point can change depending on the business model. Don’t assume that what worked well at a previous company will work well at the next.
Now don’t get me wrong: Session duration and pages per session are incredibly valuable metrics to us, but only in specific instances, which leads to my next point…
In the beginning, we tracked metrics across all our web properties and assumed (wrongly) that they were telling the same story and carried equal importance. Huge mistake. When it comes to marketing data, context is everything. We soon discovered that the metrics told completely different stories based on the user experience, so we separated our marketing analytics into three buckets: website, blog and platform.
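To make that bucketing concrete, here’s a minimal Python sketch of the idea. The session records and field names below are hypothetical stand-ins for an analytics export — they’re not a real Databox or Google Analytics API — but they show the core move: compute the same two metrics separately for each property instead of blending them into one number.

```python
from statistics import mean

# Hypothetical session records; in practice these would come from an
# analytics export. Field names are illustrative, not a real schema.
sessions = [
    {"property": "website",  "pages": 2, "duration_sec": 45},
    {"property": "website",  "pages": 1, "duration_sec": 20},
    {"property": "blog",     "pages": 4, "duration_sec": 310},
    {"property": "platform", "pages": 9, "duration_sec": 600},
]

def metrics_by_bucket(sessions):
    """Average pages per session and session duration, per property."""
    buckets = {}
    for s in sessions:
        buckets.setdefault(s["property"], []).append(s)
    return {
        prop: {
            "pages_per_session": mean(s["pages"] for s in group),
            "avg_duration_sec": mean(s["duration_sec"] for s in group),
        }
        for prop, group in buckets.items()
    }
```

Run over the sample data, the website bucket averages 1.5 pages per session while the platform bucket averages 9 — the same metric, telling opposite stories depending on where the session happened.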
Session duration and pages per session tell a very different story in each of these three buckets.
It’s kind of embarrassing to look back and realize how misinformed we were on the data. Without accounting for the user experience, we could have tinkered all day and moved the needle in one area, but at the expense of another. If your brand offers a unique experience on multiple platforms, you’d be wise to consider the same.
You can use the Blog Quality Metrics dashboard to determine if your popular blog posts are actually leading to conversions.
Data-driven marketers love to conduct experiments and monitor the results. But no matter how ingenious and well-executed those experiments might be, if they aren’t paired with a willingness to revisit your assumptions every now and again, their effectiveness will be negated. Garbage in, garbage out.
As mentioned earlier, the key for us wasn’t about finding new ways to boost pages per session and session duration. The key was discovering that we were looking at those metrics the wrong way in the first place.
I’ve only focused on two metrics in this post, but truthfully, there have been dozens which have led us to rethink some of our strongly held assumptions on what works, what doesn’t, and why. We’re discovering more every day, thanks to Databox.
As a strong proponent of data-driven marketing, I’m curious to hear which metrics have led you down similar paths. Please share your experiences in the comment section below.