KPIs That Move the Business: A Marketer’s Shortlist
The most important KPIs in marketing are the ones that connect directly to business outcomes: revenue, margin, customer acquisition cost, lifetime value, and conversion rate. Everything else is supporting data. The problem is that most marketing dashboards are built the other way around, piling up activity metrics until the signal gets buried in noise.
After two decades running agencies and managing performance marketing across 30 industries, I’ve seen the same pattern repeat itself. Teams track what’s easy to measure, then work backwards to justify why it matters. The result is a reporting culture that looks rigorous but rarely drives decisions.
Key Takeaways
- Most marketing dashboards measure activity, not outcomes. The fix starts with choosing KPIs that connect to revenue, not just reach.
- Customer acquisition cost and lifetime value belong together. One without the other tells you almost nothing useful.
- Conversion rate is the most underrated KPI in most marketing stacks because it forces you to look at what happens after the click.
- Vanity metrics are not harmless. They actively distort decision-making by creating the impression of progress where none exists.
- A short, honest dashboard with five well-chosen KPIs outperforms a bloated one with twenty metrics nobody acts on.
In This Article
- Why Most Marketing KPI Lists Are Wrong Before They Start
- The Core KPIs That Belong in Every Marketing Stack
- The Vanity Metrics That Crowd Out the Useful Ones
- Channel-Specific KPIs Worth Tracking
- How to Build a KPI Framework That People Actually Use
- The Measurement Traps That Catch Even Experienced Teams
- What Good KPI Reporting Looks Like in Practice
Why Most Marketing KPI Lists Are Wrong Before They Start
When I joined iProspect as Managing Director, one of the first things I did was look at how we were reporting to clients. The dashboards were impressive to look at: lots of numbers, lots of colour, plenty of movement week on week. But when I asked the account teams a simple question: “Which of these numbers would you act on if it changed by 20%?”, most of them couldn’t answer quickly. Some couldn’t answer at all.
That’s the test. If a metric changes significantly and nobody knows what to do about it, it’s not a KPI. It’s a data point. There’s nothing wrong with data points, but they shouldn’t be sitting at the top of your reporting stack pretending to be strategic.
The distinction matters because KPIs shape behaviour. If your team is measured on impressions, they’ll optimise for impressions. If they’re measured on cost per acquisition, they’ll think differently about every channel decision. The metrics you choose are, in effect, a management tool. Choose them carelessly and you’ll get careless outcomes.
If you want a broader grounding in how to build measurement frameworks that actually hold up, the Marketing Analytics hub covers everything from attribution to GA4 to the tools worth knowing about.
The Core KPIs That Belong in Every Marketing Stack
There is no universal list that works for every business. Anyone who tells you otherwise is selling you a template, not thinking about your situation. But there is a set of metrics that appear consistently in high-performing marketing operations, and for good reason. They’re the ones that connect marketing activity to commercial reality.
Customer Acquisition Cost
Customer acquisition cost (CAC) is the total spend required to acquire one new customer. That means all marketing costs, not just paid media. When I was turning around a loss-making agency, one of the first things I noticed was that the business had no idea what it cost to win a new client. It tracked pitching costs loosely, but salary time, proposal production, and leadership involvement were invisible. The real CAC was roughly three times what anyone had estimated. No wonder the margins were broken.
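The fully loaded CAC arithmetic described above can be sketched in a few lines. The cost categories and figures here are hypothetical, chosen to mirror the agency anecdote: counting only the obvious paid costs understates the real number several times over.

```python
# Illustrative fully loaded CAC: all acquisition costs, not just paid
# media, divided by new customers won in the period. Figures are invented.
costs = {
    "paid_media": 40_000,
    "salary_time": 55_000,        # often the invisible component
    "proposal_production": 12_000,
    "leadership_time": 13_000,
}
new_customers = 20

cac = sum(costs.values()) / new_customers
print(f"Fully loaded CAC: {cac:,.0f}")  # 6,000 per customer

paid_only_cac = costs["paid_media"] / new_customers
print(f"Paid-media-only CAC: {paid_only_cac:,.0f}")  # 2,000, a 3x understatement
```

The point of the sketch is the gap between the two numbers, not the numbers themselves: until the hidden costs are counted, the "real" CAC stays invisible.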
CAC on its own is incomplete. You need to know what you’re getting in return for that acquisition spend, which is why it should always be read alongside lifetime value.
Customer Lifetime Value
Customer lifetime value (CLV or LTV) tells you how much revenue a customer generates over the full course of their relationship with the business. The ratio between LTV and CAC is one of the most useful single numbers in marketing. A healthy LTV:CAC ratio varies by sector, but if you’re spending more to acquire a customer than they’ll ever return in margin, the business model has a problem that no amount of creative excellence will fix.
Most marketing teams underinvest in understanding LTV because it requires joining up data from marketing, sales, and finance. That cross-functional work is harder than pulling a channel report. But it’s also where the real strategic insight lives.
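As a rough sketch of the LTV:CAC check, here is one common margin-based way to estimate lifetime value. The formula and all inputs are illustrative assumptions; a real calculation would pull these from finance and CRM data, which is exactly the cross-functional work described above.

```python
# Hypothetical margin-based LTV estimate and LTV:CAC ratio check.
def ltv(avg_order_value: float, orders_per_year: float,
        gross_margin: float, retention_years: float) -> float:
    """Lifetime value as annual margin contribution times customer tenure."""
    return avg_order_value * orders_per_year * gross_margin * retention_years

customer_ltv = ltv(avg_order_value=120, orders_per_year=4,
                   gross_margin=0.55, retention_years=3)
cac = 250  # fully loaded acquisition cost, hypothetical

ratio = customer_ltv / cac
print(f"LTV:CAC = {ratio:.1f}:1")
# A ratio below 1 means each acquired customer destroys value,
# regardless of how good the creative is.
```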
Conversion Rate
Conversion rate is the percentage of visitors, leads, or prospects who take a defined action. It’s one of the most revealing metrics in any stack because it forces you to look at what happens after the click, not just before it. I’ve sat in too many channel reviews where teams celebrated strong click-through rates while ignoring the fact that the landing page was converting at under 1%. The traffic was fine. The funnel was broken.
Conversion rate should be tracked at multiple stages: from visit to lead, from lead to qualified lead, from qualified lead to customer. Each stage tells you something different about where the friction is. HubSpot’s breakdown of marketing analytics versus web analytics is worth reading if you want a clearer framework for thinking about funnel measurement versus traffic measurement.
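The staged tracking described above can be sketched with a few lines of arithmetic. The funnel counts here are invented for illustration; the useful output is the stage-by-stage rate, which localises the friction.

```python
# Staged funnel conversion rates (illustrative counts).
# Each adjacent pair of stages gets its own rate.
funnel = [
    ("visits", 50_000),
    ("leads", 1_500),
    ("qualified_leads", 450),
    ("customers", 90),
]

for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    rate = next_count / count
    print(f"{stage} -> {next_stage}: {rate:.1%}")
```

A single end-to-end conversion rate would hide which of the three transitions is the weak one; the staged view makes it obvious where to look first.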
Return on Ad Spend
Return on ad spend (ROAS) measures the revenue generated for every pound or dollar spent on advertising. It’s a useful efficiency metric for paid channels, but it gets misused constantly. A high ROAS on a brand campaign that’s mostly retargeting existing customers looks impressive but tells you very little about whether you’re actually growing the business. ROAS needs context: which audiences, which funnel stage, which attribution window.
When I was managing hundreds of millions in ad spend across performance channels, one of the things I pushed hard on was disaggregating ROAS by new versus returning customers. The blended number was almost always flattering in a way that obscured real acquisition performance. Separating the two significantly changed how budgets were allocated.
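The disaggregation is simple arithmetic once spend and revenue are split by audience. The figures below are hypothetical, but they show the pattern the paragraph describes: a blended ROAS that looks healthy while the new-customer number tells a different story.

```python
# Disaggregating ROAS by new vs returning customers (invented figures).
spend = {"new": 60_000, "returning": 40_000}
revenue = {"new": 90_000, "returning": 280_000}

blended_roas = sum(revenue.values()) / sum(spend.values())
print(f"Blended ROAS: {blended_roas:.1f}")  # looks healthy

new_customer_roas = revenue["new"] / spend["new"]
print(f"New-customer ROAS: {new_customer_roas:.1f}")  # the acquisition reality
```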
Revenue Attribution by Channel
Understanding which channels are contributing to revenue, and how much, is foundational to any serious marketing operation. The challenge is that attribution is genuinely hard, and most attribution models make assumptions that distort the picture in different ways. Last-click attribution overvalues bottom-of-funnel channels. First-click overvalues awareness. Data-driven attribution is better but still imperfect.
The goal isn’t perfect attribution. It’s honest approximation. You need a model that’s directionally correct and consistently applied, so you can make budget decisions with reasonable confidence. Moz’s Whiteboard Friday on GA4 directional reporting is a useful watch if you’re thinking about how to use GA4 for this kind of channel-level analysis without over-indexing on precision.
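The biases of the simple attribution models mentioned above are easy to see on a toy conversion path. This is a deliberately minimal sketch: real models, including GA4's data-driven attribution, are far more involved, and the path and value here are invented.

```python
# Toy comparison of attribution models on one conversion path.
path = ["organic_search", "email", "paid_search"]  # touchpoints, in order
conversion_value = 300.0

def last_click(path, value):
    # All credit to the final touchpoint: overvalues bottom-of-funnel.
    return {path[-1]: value}

def first_click(path, value):
    # All credit to the first touchpoint: overvalues awareness.
    return {path[0]: value}

def linear(path, value):
    # Equal credit to every touchpoint: crude, but at least consistent.
    share = value / len(path)
    return {channel: share for channel in path}

print(last_click(path, conversion_value))
print(first_click(path, conversion_value))
print(linear(path, conversion_value))
```

Whichever model you pick, the argument in the paragraph above holds: apply it consistently so the distortion is at least stable, and treat the output as directional rather than precise.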
The Vanity Metrics That Crowd Out the Useful Ones
Impressions, reach, follower count, page views, social likes. These metrics are not worthless, but they are frequently treated as proxies for business performance when they’re nothing of the sort. The problem isn’t that teams track them. The problem is when they appear at the top of the board report as evidence that marketing is working.
I judged the Effie Awards for several years. The Effies are specifically about marketing effectiveness, so the entries are supposed to demonstrate real business impact. Even there, you’d see submissions that led with reach and awareness numbers, then struggled to connect them to anything measurable at the commercial level. If the industry’s most rigorous effectiveness competition has this problem, it’s everywhere.
Vanity metrics persist because they’re easy to produce, they tend to go up over time, and they look good in presentations. None of those are reasons to let them define your marketing performance. The discipline is in being honest about what each metric actually tells you, and what it doesn’t.
Channel-Specific KPIs Worth Tracking
Beyond the core commercial metrics, there are channel-level KPIs that provide useful diagnostic information. The key word is diagnostic. These metrics help you understand what’s happening inside a channel so you can improve it. They’re not the same as the business-level KPIs that tell you whether marketing is working overall.
Email Marketing
For email, the metrics that matter most are open rate, click-to-open rate, conversion rate, and unsubscribe rate. Open rate tells you whether your subject lines and sender reputation are working. Click-to-open rate tells you whether the content is relevant to the people who opened. Conversion rate tells you whether the email is driving action. Unsubscribe rate is an early warning signal for list fatigue or relevance problems. HubSpot’s guide to email marketing reporting covers how to build a sensible reporting framework around these metrics without getting lost in the noise. Mailchimp’s overview of marketing metrics also provides useful benchmarks if you want to sense-check your performance against industry averages.
SEO and Organic Search
For organic search, the metrics that matter are organic sessions, keyword rankings for commercially relevant terms, click-through rate from search results, and organic conversion rate. Rankings alone are insufficient. A page ranking first for a term that nobody searches, or that attracts traffic with no commercial intent, contributes nothing to the business. Semrush’s guide to using Google Analytics for SEO is a solid resource for connecting GA4 data to organic search performance in a way that actually informs decisions.
Paid Media
For paid channels, cost per click and click-through rate are useful operational metrics, but they should always be read in the context of what happens downstream. Cost per acquisition and ROAS are the metrics that connect paid activity to commercial outcomes. Quality Score in Google Ads is worth monitoring as a proxy for ad relevance and landing page experience, since it affects both performance and cost.
Social Media
Social media measurement is where the vanity metric problem is most acute. Engagement rate, reach, and follower growth are the metrics most social teams lead with, but they’re only meaningful if you can connect them to something further down the funnel. For brands where social is a genuine demand generation channel, cost per lead and conversion rate from social traffic are more useful than any engagement metric. Tools like Sprout Social’s Tableau integration are worth knowing about if you need to pull social data into a broader reporting environment alongside other channels.
How to Build a KPI Framework That People Actually Use
A KPI framework only works if it shapes decisions. That sounds obvious, but most frameworks are built to satisfy reporting requirements rather than to drive action. The result is a dashboard that gets produced every month, reviewed briefly, and then filed without changing anything.
When I was growing a team from 20 to 100 people, one of the things I learned quickly was that more metrics don’t create more clarity. They create more noise. The teams that performed best were the ones with a short list of metrics they genuinely understood and could influence. The teams that struggled were often the ones drowning in data they didn’t know how to interpret.
A workable KPI framework has three layers. The first is business-level KPIs: the commercial outcomes that marketing is ultimately accountable for. The second is channel-level KPIs: the diagnostic metrics that tell you how individual channels are performing. The third is operational metrics: the day-to-day numbers that help teams manage campaigns and spot problems early. Each layer serves a different purpose and should be reported to different audiences at different frequencies.
The critical thinking question to ask about any metric before it goes on a dashboard is: if this number changes by 20% next month, what decision would we make? If you can’t answer that, the metric doesn’t belong in a KPI framework. It might still be worth monitoring, but it shouldn’t be driving the conversation.
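The "what decision would we make?" test can even be applied mechanically when triaging a dashboard. This is a toy sketch with invented metric names: each candidate metric either has a named decision attached to a 20% move, or it drops down to monitoring-only status.

```python
# Sketch of the "20% test" as a filter on candidate dashboard metrics.
# A metric earns KPI status only if a concrete decision attaches to it.
candidates = {
    "cost_per_acquisition": "reallocate budget between channels",
    "new_customer_roas": "revise bids and audience targeting",
    "impressions": None,      # no decision attaches: monitor, don't report up
    "follower_count": None,
}

kpis = [metric for metric, decision in candidates.items() if decision]
monitoring_only = [metric for metric in candidates if metric not in kpis]

print("Dashboard KPIs:", kpis)
print("Monitoring only:", monitoring_only)
```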
The Measurement Traps That Catch Even Experienced Teams
The first trap is mistaking correlation for causation. A channel shows strong ROAS in the same quarter that the business runs a major above-the-line campaign. The paid channel looks like it’s performing brilliantly, but the real driver might be the brand awareness that was built above it. Disaggregating these effects is genuinely difficult, and most teams don’t do it rigorously enough.
The second trap is optimising for the metric rather than the outcome. If a team is measured on cost per lead, they’ll find ways to reduce cost per lead. Sometimes that means better targeting and more relevant messaging. Sometimes it means loosening the definition of a lead until the numbers look good but the sales team is drowning in unqualified enquiries. The metric moved in the right direction. The business outcome didn’t.
The third trap is treating the reporting period as the unit of analysis. Monthly reporting creates pressure to show month-on-month improvement, which often leads to short-term optimisation at the expense of longer-term brand building. Some of the most important marketing investments take six to twelve months to show up in commercial metrics. A KPI framework that only looks backwards thirty days will consistently undervalue them.
The fourth trap is dashboard proliferation. I’ve worked with businesses that had four or five different dashboards pulling from different data sources, each telling a slightly different story. Nobody knew which one to trust. The fix isn’t a better dashboard. It’s agreeing on a single source of truth and being disciplined about what goes into it. Moz’s overview of GA4 and Moz Pro integration is a useful example of how to connect tools without creating reporting fragmentation.
If you’re building or rebuilding your measurement approach from the ground up, the broader Marketing Analytics section on The Marketing Juice covers the full range of topics, from attribution models to GA4 implementation to the tools that are actually worth your time.
What Good KPI Reporting Looks Like in Practice
The best marketing reports I’ve seen have a few things in common. They’re short. They lead with the commercial metrics. They explain what changed and why, not just what the numbers were. And they end with a clear view of what the team is going to do differently as a result.
The worst reports are the opposite: long, channel-by-channel, heavy on numbers and light on interpretation, with no clear point of view on what the data means or what should happen next. They’re produced to demonstrate activity, not to drive decisions. That’s a cultural problem as much as a technical one, and it won’t be fixed by a better analytics tool.
The discipline of good KPI reporting is really the discipline of critical thinking applied to data. It means asking whether the numbers you’re looking at are actually telling you what you think they’re telling you. It means being willing to say “I don’t know” when the data is ambiguous rather than constructing a narrative that fits. And it means being honest about what your measurement framework can and can’t capture, because no framework captures everything.
If I had to teach one thing to every junior marketer in their first month, it wouldn’t be how to use GA4 or how to build a dashboard. It would be how to look at a number and ask the right questions about it. That skill compounds over a career in a way that platform knowledge doesn’t.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
