Kevin Kononenko on October 25, 2018 • 10 minute read
“Prior to Databox, we were inconsistent with tracking data performance. It would take a ton of time to develop and, in a lot of cases, it was very hacked together.
We were using Google Sheets, Keynote presentations and different types of PowerPoint decks. There was no consistency across accounts. We were reporting different numbers via different tactics. And then analyzing the data and producing those reports was a huge time suck.”
-Keith Moehring, VP of Growth at PR 20/20
Moehring: We position ourselves as a performance-driven agency. So every month we’re delivering monthly scorecards and we’re trying to analyze what the data is telling us.
Prior to Databox, we were inconsistent with tracking data performance. It would take a ton of time to develop and, in a lot of cases, it was very hacked together.
We were using Google Sheets, Keynote presentations and different types of PowerPoint decks. There was no consistency across accounts. We were reporting different numbers via different tactics. And then analyzing the data and producing those reports was a huge time suck.
We didn’t have the systems in place to efficiently produce those reports. From a management team perspective, we had very limited visibility into account performance and where teams were winning or where teams weren’t hitting the goals that we needed to hit. We needed a way to strategically step in and offer some guidance or some senior level support if necessary.
As an agency, we have always emphasized using data to guide strategy. In our world, data is intelligence. You take that intelligence and you put it into action and that action hopefully turns into an outcome.
On a month-to-month basis, we’re reviewing the data. We’re assessing performance overall and evolving strategies as needed. So with Databox, the whole process became that much easier to manage. It’s less time intensive to put together those reports and create those visualizations. It allows us to more effectively communicate the data that we’re seeing with the client.
We consolidated everything from a number of different reporting interfaces into one. We cut down on the time it actually takes to create those reports and deliverables for clients. In some cases, we weren’t delivering monthly scorecards until the 15th of the month. At that point, it’s way too late to really be doing anything with that.
Now we’ve cut it down to about five days after the start of the month. We’ve given everybody greater visibility into the performance of the clients, accounts and campaigns so the account teams can easily see what’s going on. The clients have real-time visibility into all of it. It creates a level of transparency and accountability on our end. Our management team can check in whenever they need to see how things are going. My favorite part is that it has actually created some consistency across accounts. We’re not reinventing the wheel with every new client that comes in.
We’ll do weekly reporting internally to get a gauge on what’s going on. We’re not delivering full scorecards to clients week to week largely because our campaigns take some time to develop and launch. We have weekly calls with the clients and, as part of the agenda, we’ll touch on some of the performance metrics and the numbers that we’re seeing.
It’s right on our homepage. That’s the first thing you see: “performance-driven agency.” If we’re talking with a client, the first conversation we’re having as part of the business development process is about their goals. We’ll take a step back and ask them to give us access to Google Analytics, HubSpot, or any of the other tracking and reporting tools that they’re using. We want to see what the numbers say.
In a lot of cases, there’s some data cleanup that needs to happen, like filtering out internal traffic and turning on bot filters.
We work with a number of tech companies and they often have login buttons at the very top. So, anybody who’s coming to the site and clicking on that login button probably isn’t a marketing prospect for them. We need some way to get them out of the picture. Once we do that we can come back to them and say, “Here’s a realistic look at the numbers you’re seeing and here’s where we think we can go with it.”
Then, as part of that business development process, we ask them to go through our marketing score assessment. Essentially, it’s a series of questions across 10 core areas of the business. It helps us get a sense of where they’re at from a marketing perspective.
Before we sign any sort of contract, we generally agree on where they’re going, what their goals are and what the expectations are. I’d say probably 50 to 60 percent of clients have some sort of measurable goal in place when we kick off. If they don’t, they often require some level of foundation work to get that platform in place. They need help instituting and installing the tracking and performance technology to get that measurement in place.
In cases where there is no benchmark data, we actually hold off on setting any sort of client goals until we have benchmarks to base them on. We don’t want to set ourselves up to fail based on client expectations that may or may not be realistic.
It’s definitely a mix. Some have measurable goals that they’re looking to achieve. They can state those right up front. In most cases, though, they don’t have anything formally defined and we’re helping them to define that.
As part of the initial conversation we’ll ask, “In 12 months, how are you going to evaluate our performance in our partnership?” After that question, we can help them define their metric, whether that’s traffic, qualified traffic, marketing qualified leads, sales qualified leads, or subscribers. And we’re showcasing that metric throughout the entire partnership.
We’ll probably have an overall marketing goal of what we’re trying to achieve within the next quarter. As soon as a client comes on, we host a “marketing growth hackathon”. It’s an interactive planning session where we bring together key stakeholders like marketing, sales, customer service, and other leaders to solve a specific growth challenge. We’re trying to establish a model where we can experiment more efficiently and adapt consistently to accelerate success. These can last anywhere from an hour to a full day. We start off the hackathon by defining one SMART goal.
We really emphasize the fact that it’s got to be specific, it’s got to be measurable, it’s got to be attainable, and it’s got to be actually relevant to the overall business goals. And then there’s got to be some sort of time element to it. Typically it’s between one and two quarters, like 90 days. From there, we’ll take a look at what assets they’ve got in place, milestones coming up, our target audience and what audiences we can tap into, like email databases, social reach, etc. It probably takes about an hour or two to get through that whole process as a team and as a group, but at that point, everyone’s got a very clear focus on what we’re here to achieve and what we’re looking to do.
From there, the group starts brainstorming activities. Depending on the size, we may break it up into smaller groups. Team members throw out, “Well, what if we did this? What if we did that?”
We log all those and at the end of the day, what we’re trying to get out of this hackathon is a minimum of three ideas that we can activate in the next 30 days and then execute in the next 90. Three is often a really low number because we’ve walked away from a client hackathon with 50 to 60 campaign ideas, in some cases. We’ll take those ideas and then rate each one on a scale of one to five against two criteria: potential impact on the KPIs and ease of execution. With every campaign rated, we have a game plan for the next 30 days minimum, but often for the next six to nine months of all the campaigns that we can execute on. And everybody who is in that group is bought into the process. They all know how those ideas came up. So we already have buy-in from leadership or from sales. Now we’re off and running.
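To make the two-score rating concrete, here is a minimal sketch of how a hackathon’s idea list could be tallied and ranked. The idea names, scores, and the simple impact-plus-ease weighting are all hypothetical illustrations, not PR 20/20’s actual scoring formula.

```python
# Hypothetical hackathon output: each campaign idea gets a 1-5 score
# for potential KPI impact and a 1-5 score for ease of execution.
ideas = [
    {"name": "Webinar series", "impact": 4, "ease": 2},
    {"name": "Landing-page CTA test", "impact": 3, "ease": 5},
    {"name": "Email re-engagement campaign", "impact": 5, "ease": 4},
]

# Rank by combined score, highest first (illustrative weighting only).
ranked = sorted(ideas, key=lambda i: i["impact"] + i["ease"], reverse=True)

for idea in ranked:
    print(f'{idea["name"]}: impact {idea["impact"]}, ease {idea["ease"]}')
```

The top of the ranked list becomes the candidate set for the first 30 days; lower-scoring ideas feed the six-to-nine-month backlog.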
That’s where the scorecards, and where Databox, really come into play. Every month, we’ll deliver a monthly scorecard to the client. This is what we used to do in Keynote or Google Sheets.
The scorecard includes high-level metrics like overall traffic and blog performance. Then we’ll report on the campaigns we’re running. We will also deliver a written assessment that essentially tells the stories that the numbers and the dashboard are showing and offers some context. For example, “We published this blog post on this date, shared it here on LinkedIn, and actually sent an email about it on this date, and there was a huge spike in traffic.”
That’s where the Databox annotations really come in handy so we can start painting that picture. It takes less time for us to go back and fill in those details. As part of that assessment, we’ll call out some of the specific recommendations that the data is telling us.
There are two parts of an account team: the account manager and then there’s support. The support team essentially owns a project or a campaign. They’re in charge of making sure that it really hits the mark. And so they’re hopefully looking at the data and reporting back to the account manager. They can bring in senior-level support to offer some additional context or ideas and we may even hold little mini hackathons internally to brainstorm a bunch of ideas. From there, the account managers use reporting to keep the client up to date on a week to week basis. Then we will do the full data dump every month.
Want to get a firsthand look at how agencies like PR 20/20 use Databox to automate their client reports, set client goals, and much more? Sign up for one of our live agency demos.
We host them live every day.