Although analytics and reporting may sound the same, they aren’t. Dive into the difference between the two to learn which one you need.
Melissa King on August 18, 2021 (last modified on August 16, 2021) • 11-minute read
If you make mistakes when analyzing your marketing data, you’re not alone.
We surveyed marketing data professionals about their past mistakes, and more than 85% reported that they’ve been unsuccessful with analysis in the past.
So, if you’ve made a bad guess or didn’t know what to do with your data, think of it as an opportunity to improve. With some expert guidance, you’ll get on the right track.
And that’s why we’re here today. This blog post will cover best practices for doing your best data analysis work, along with common mistakes to avoid, so you can move forward with confidence.
Let’s dig in.
If you want to learn how to avoid data analysis mistakes, it helps to understand some of the most common pitfalls. Before we asked marketers about the mistakes they advise avoiding, we identified four of our own: waiting too long to act, striving for perfection instead of action, not asking questions, and failing to draw actionable conclusions.
It turned out that respondents could relate to these pitfalls: at least 20% of them identified with each one, and the most common mistake was striving for perfection instead of action (39.29%).
So, if these four mistakes are common in data analysis, you can start the path to better habits with these proactive strategies:
It takes practice and first-hand experience to develop these kinds of habits, so don’t feel discouraged if they don’t come naturally to you. Always keep trying, and remember that every little insight helps.
The data professionals and marketers we consulted mentioned eight analysis mistakes they try to avoid:
Your data analysis can’t help you if you don’t act on it. Unfortunately, many professionals who analyze data wait too long to act, reducing their analysis’s impact. Why do so many data professionals hesitate?
According to Milkwhale’s Andre Oentoro, this pitfall happens to marketers because they wait for data to change on its own. “Marketers tend to rely on data and change — hoping that bad results will turn into good results. However, if you keep waiting for those weak numbers to turn into better ones, it can take forever. In the end, you would have achieved little to nothing because you wasted time waiting for your data to change without actually making any changes,” Oentoro explains.
Miranda Yan from VinPit thinks the issue comes from the urge to collect lots of data to analyze at once. Yan says, “As many marketers keep on collecting data for a long time to make the sample big and then plan to analyze them for perfect action. But the data collection usually takes much more time which doesn’t allow them to go for the right action at the right time. Also, many marketers are usually confused in making the sample size and try to make it as big as possible whereas it should be product dependent & shouldn’t be too big or too small.”
Zeotap’s Zulay Regalado attributes the problem to a resource gap, telling us, “Many marketers are ‘data-rich and insight poor,’ meaning they struggle with the gap between having customer data and being able to act on it. Getting access to data science resources to bridge this gap is critical to executing data-driven marketing… Marketing success hinges more and more on an ability to use data to not just target, but model.”
So, what can you do to make sure you act on your data on time? Since the answer depends on your product and market, consult marketers in your industry with more data experience. They’ll help you set sample size expectations and share resources for acting on your data.
If you’re worried about making too drastic a change too soon, remember that you always have A/B testing as an option. A/B testing changes one variable at a time, so you won’t have to worry about changing too much at once.
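To make that concrete, here's a minimal sketch of how you might check whether an A/B test's variant genuinely beats the control. The conversion counts are made-up figures, and the two-proportion z-test shown is just one common approach, not the only way to evaluate a test:

```python
# Minimal two-proportion z-test sketch for comparing an A/B test's
# control and variant conversion rates. All numbers are hypothetical.
from statistics import NormalDist

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Control: 200 conversions out of 4,000 visitors.
# Variant: 260 conversions out of 4,000 visitors.
p = ab_test_p_value(200, 4000, 260, 4000)
print(f"p-value: {p:.4f}")  # well below 0.05, so the lift is unlikely to be chance
```

A low p-value suggests the variant's lift isn't just noise, which is exactly the kind of evidence that lets you act on one change at a time without guessing.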
As marketers, it’s easy for us to fall back on data and forget that marketing has a human element. All of your data comes from people, so you should view it in a human context.
Greg Gillman of Mutesix urges you to listen to your gut when necessary: “you don’t want to fall into the trap of continually interpreting data while never using your gut instincts. Whether you’ve done this for a while or are just feeling especially innovative, there comes a time when you may be tempted to take an educated risk in a particular situation. Analyze your collected data, but then bring in a little human emotion and instinct. You may just find that you have the best of both worlds.”
According to Brent Sirvio from Impressa Solutions, you should also remember that your data is only one part of the full story. “Numbers only tell part of a story, and they’re the part of the story that can be most easily manipulated. KPIs and benchmarks are useful, but they can also allow companies to paper over strategic or logistical issues. Metrics should be aligned against deliverables and team execution to tell a story that not only has depth but reflects the reality that numbers often will lie when devoid of context,” Sirvio advises.
If you want to get results from your data, you’ll need to learn how to identify your key performance indicators (KPIs). KPIs are the metrics that indicate your progress toward your business goals.
For Amplitude Digital’s Jeff Ferguson, one of the biggest data analysis mistakes is “Not knowing the difference between a real key performance indicator (KPI) and a diagnostic metric.”
Ferguson elaborates, “I see it all the time – bloated dashboards filled with metrics that do nothing but confuse bosses and clients by obscuring what they really want to know: Am I making any money from this campaign and how much is it costing me? Look, tracking clicks, CPC, and all the rest are important, but they are rarely the end game. Follow the money.”
How do you develop KPIs that matter? Some of the tips experts shared with us include using a SMART goal framework, consulting your team members, and matching one KPI to one goal.
Editor’s note: A KPI dashboard can help you stay laser-focused on the metrics that truly matter for your success. Try using a template like the HubSpot (KPI Trends) Dashboard Template for Databox to prioritize the right info.
Does the data in your analysis match your audience and business goals? If you don’t pay attention to your sample, you could fall into the trap of sampling bias.
As Because Market’s Heidi Robinson puts it, “It’s really important to pick a diverse group to study in data analysis and verify these things beforehand. It’ll save you a lot of potential headaches later and ensure you don’t have any bias in your data later.”
It’s important to point out that this diversity should align with your marketing goals. In marketing, your data’s relevance affects its quality. For example, if your sample includes data from a group you don’t want to market to, don’t be afraid to filter it out.
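Here's a small sketch of that kind of filtering. The survey rows, segment names, and NPS scores are all hypothetical, just to show how an off-target segment can be dropped before you compute an average:

```python
# Hypothetical survey rows: drop respondents outside the target market
# before computing averages, so an irrelevant segment doesn't skew results.
responses = [
    {"segment": "smb",        "nps": 9},
    {"segment": "enterprise", "nps": 3},
    {"segment": "smb",        "nps": 8},
    {"segment": "student",    "nps": 10},  # not a market this business sells to
]

TARGET_SEGMENTS = {"smb", "enterprise"}
target = [r for r in responses if r["segment"] in TARGET_SEGMENTS]

avg_nps = sum(r["nps"] for r in target) / len(target)
print(f"avg NPS (target segments only): {avg_nps:.1f}")
```

Including the off-target "student" row would have inflated the average; filtering keeps the analysis aligned with the audience you actually market to.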
As the saying goes, correlation is not causation. But, it can be easy to fall into the trap of seeing similar data trends and assuming they’re related.
“What happens is we see certain data changes after a small change, but we cannot simply assume causation if two events are occurring simultaneously. Our observations would be purely anecdotal. Second, there are so many other possibilities for why such an association is happening,” says Custify’s Victor Antiu.
Antiu shares an example: “For example, you just launched a new feature and assume churn is linked to that missing feature. After 3 months, you look at gross data and see MRR churn is down 20%. You can assume the new feature helped, but if you look at its adoption, you notice only 10% of your users have activated it. If you split them into cohorts, you see churn is indeed down in the “new feature” segment, but not enough to justify the global improvement.”
As you can see in Antiu’s example, looking at contextual data can help you determine whether you have an actual connection. Examine every relevant data point before you draw conclusions.
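A cohort check like the one Antiu describes can be sketched in a few lines. The user records and churn rates below are entirely made up, but they show the logic: compare churn among feature adopters against everyone else before crediting the feature for a company-wide improvement:

```python
# Made-up user records: did each user adopt the new feature, and did they churn?
users = (
    [{"adopted": True,  "churned": False}] * 95
    + [{"adopted": True,  "churned": True}] * 5      # 5% churn among adopters
    + [{"adopted": False, "churned": True}] * 180
    + [{"adopted": False, "churned": False}] * 720   # 20% churn among the rest
)

def churn_rate(group):
    return sum(u["churned"] for u in group) / len(group)

adopters = [u for u in users if u["adopted"]]
others   = [u for u in users if not u["adopted"]]

print(f"adopters: {churn_rate(adopters):.0%} churn")  # 5%
print(f"others:   {churn_rate(others):.0%} churn")    # 20%
# Only 100 of 1,000 users adopted the feature, so the adopter cohort
# alone can't explain a large company-wide churn improvement.
```

The point isn't the specific numbers but the habit: split the data into cohorts before assuming that the thing you shipped caused the trend you're seeing.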
Editor’s note: One way to get more context for your data is to look at similar stats across platforms. Check out how the Awareness Databoard for Databox examines brand awareness on three different apps.
Not all data findings are built the same. Don’t get tunnel vision on a single metric; make sure to watch out for unexpected insights.
For Tomas Sugar of Smartlook, a common data analysis pitfall is “the over-privileging of significance. This is when the marketing team or data analyst becomes fixated on a statistically significant result and spends all their time trying to find more and more ways to confirm it. Or the opposite may happen – where they downplay interesting, but not statistically significant findings, which could lead an entire team down a false path. Either way, this can have dangerous consequences for the business and its goals.”
Remember to set clear marketing goals and KPIs before digging into your analytics data. These priorities will help you decide which statistics are worth pursuing and which are red herrings.
We discussed the issues with waiting too long to act on data earlier. But you can also set back your data analysis by not waiting long enough to gather sufficient data.
“A common data analysis trap marketers often do is to make decisions with incomplete or very little data. If you launch a marketing campaign, a new feature, or a new A/B test, you need to gather enough data before analyzing it and acting on it,” Jonathan Aufray from Growth Hackers says.
Aufray continues, “I see too many marketers and entrepreneurs that pivot twice a week (Yes, you’ve heard it well) because they try something new, and because after a couple of days it didn’t bring the expected results, they change strategy or work on something else. It’s key to focus well and gather enough data before making any decisions.”
The best frequency for checking marketing data depends on the type of data. This subject could fill up a blog post on its own, but we recommend checking out HubSpot’s guide on the subject.
Data analysis doesn’t stop with you or your department. You need feedback from everyone affected by your data to get comprehensive insights from it.
Brad Wright of Greenspeed explains, “The most common trap is to think that analysis stops once you have a report in front of you. I think that most marketers are trained to produce reports and then they get excited about the results of the analysis and want to tell their bosses, but data analysis is never complete until the last person who needs it has had an opportunity to digest it.”
We recommend thoroughly preparing for your reporting meetings to get the best results from your data report. Go over your data multiple times so you’re confident in your conclusions, and make your report as clear as possible for others to understand.