Measure Everything. Most of It Won’t Matter. Do It Anyway.
Value-oriented marketers measure constantly because measurement is how you separate activity from impact. Without it, you are spending money on faith, attributing outcomes to the last thing that touched a conversion, and optimising for metrics that feel productive but move nothing that matters to the business.
The discipline is not about dashboards or data stacks. It is about the habit of asking whether what you are doing is working, and being honest enough to act on the answer even when it is uncomfortable.
Key Takeaways
- Constant measurement is a habit and a mindset, not a technology investment. The tools are secondary to the questions you ask of them.
- Most marketing activity, if measured honestly against business outcomes, would show weaker results than the team believes. That is not a reason to stop measuring. It is the reason to start.
- Lower-funnel performance metrics are often measuring captured intent, not created demand. Optimising for them exclusively is a slow way to stop growing.
- The goal of measurement is honest approximation, not perfect attribution. Waiting for perfect data is a way of avoiding accountability.
- Marketers who measure consistently make better budget decisions, earn more internal credibility, and are harder to cut when times get difficult.
In This Article
- Why Measurement Is a Commercial Habit, Not a Reporting Function
- What Most Marketers Are Actually Measuring
- The Lower-Funnel Trap
- What Constant Measurement Actually Looks Like in Practice
- The Honest Approximation Principle
- Why Value-Oriented Marketers Measure More Than Their Peers
- If You Fixed Measurement, Most of Marketing Would Fix Itself
Why Measurement Is a Commercial Habit, Not a Reporting Function
There is a version of measurement that exists purely for reporting purposes. Someone builds a dashboard, it gets presented in a monthly meeting, and then nobody changes anything. I have sat in those meetings. I have run agencies where those meetings happened. The data was there, the slides were clean, and nothing moved as a result because nobody was asking the hard question underneath the numbers: is this actually working?
Value-oriented measurement is different. It starts from the position that marketing exists to drive business outcomes, and that every pound or dollar spent should be defensible against that standard. That does not mean every campaign needs a perfect ROI calculation. It means you are consistently asking whether your activity is contributing to growth, and whether you have any credible evidence that it is.
When I was building out the analytics function at a performance agency, the instinct was always to measure what was easy to measure. Click-through rates, cost per click, conversion rates at the bottom of the funnel. Those numbers were clean, fast, and they made the team look good. What they did not tell you was whether the business was actually growing, or whether you were just getting better at capturing the customers who were already going to convert anyway.
If you want a broader grounding in what marketing analytics actually covers and why the discipline matters beyond campaign reporting, the Marketing Analytics and GA4 hub on this site covers the full landscape.
What Most Marketers Are Actually Measuring
The honest answer is: activity. Impressions, clicks, opens, sessions, bounce rates. These are not meaningless numbers, but they are not business outcomes either. They are leading indicators at best, and vanity metrics at worst.
The problem is not that marketers measure these things. The problem is that measuring them becomes a substitute for measuring what matters. A campaign that drives 200,000 impressions and a 4% click-through rate sounds impressive in a slide deck. But if none of those clicks converted, if the traffic was irrelevant to the buying audience, or if the product had a supply problem that quarter anyway, the campaign contributed nothing to the business. You would never know that from the activity metrics.
HubSpot has written clearly about why marketing analytics is not the same as web analytics, and the distinction is worth understanding. Web analytics tells you what happened on your site. Marketing analytics tells you whether your marketing is working. Those are different questions, and confusing them is one of the most common reasons measurement programmes fail to produce useful insight.
A KPI report should be anchored to outcomes, not outputs. Semrush’s breakdown of KPI reporting is a useful reference for thinking about how to structure this, particularly the distinction between metrics that measure effort and metrics that measure impact.
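The effort-versus-impact distinction can be made concrete in a reporting framework. Below is a minimal sketch, in Python, of tagging every metric as either "effort" (an output) or "impact" (a business outcome) and leading the report with impact, so activity numbers never masquerade as results. All metric names and figures here are hypothetical illustrations, not data from any real campaign.

```python
# Sketch of an outcome-anchored KPI report. Every metric is tagged as
# "impact" (business outcome) or "effort" (activity/output), and impact
# metrics are listed first. Figures are invented for illustration.
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    value: float
    kind: str  # "impact" or "effort"

def kpi_report(metrics):
    """Return report lines with impact metrics first, effort metrics last."""
    ordered = sorted(metrics, key=lambda m: m.kind != "impact")
    return [f"[{m.kind.upper()}] {m.name}: {m.value:,.2f}" for m in ordered]

metrics = [
    Metric("Impressions", 200_000, "effort"),
    Metric("Revenue from campaign (GBP)", 18_400, "impact"),
    Metric("Click-through rate (%)", 4.0, "effort"),
    Metric("New customers acquired", 92, "impact"),
]

for line in kpi_report(metrics):
    print(line)
```

The ordering is the point: if the top of the report cannot be filled with impact metrics, that absence is itself the finding.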
The Lower-Funnel Trap
Earlier in my career, I was heavily focused on lower-funnel performance. Paid search, retargeting, conversion rate optimisation. The metrics were tight, the attribution was (apparently) clear, and the results were easy to defend to clients. I thought I was running high-performance marketing.
What I understand now, having managed hundreds of millions in spend across thirty-odd industries, is that much of what lower-funnel performance gets credited for was going to happen anyway. The person who typed a branded search term into Google was already a near-certain buyer. The retargeting ad that followed someone around the internet for a week may have done nothing except remind them of something they had already decided to buy. The click happened. The conversion happened. The attribution model said the campaign worked. But the demand existed before the campaign touched it.
Think about a clothes shop. Someone who tries something on is many times more likely to buy than someone browsing the rails. If you only measure purchase transactions, you might conclude that the fitting rooms are irrelevant because no sale happens there. You would be wrong. The fitting room is where the decision gets made. The till is just where the money changes hands.
Lower-funnel marketing is the till. It is important. But if you only measure what happens at the till, you will never understand what is actually driving your growth, and you will keep cutting the budget for everything that happens before it.
Forrester has written thoughtfully about how measurement frameworks can undermine the buyer’s experience when they over-index on the final touchpoint. It is worth reading if you are responsible for any kind of attribution model.
What Constant Measurement Actually Looks Like in Practice
Constant measurement does not mean checking your dashboard every hour. It means building a rhythm of honest evaluation that is tied to decision-making rather than reporting cycles.
In practice, it looks like this:
- You set clear objectives before a campaign launches, not after.
- You define what success looks like in business terms, not just marketing terms.
- You agree on which metrics you will use to evaluate performance, and you do not change them halfway through because the original ones are not looking good.
- You review performance regularly enough to course-correct, but not so frequently that you are reacting to noise.
- When a campaign ends, you do an honest post-mortem that includes what did not work, not just what did.
I have judged the Effie Awards, which are specifically about marketing effectiveness rather than creative execution. One thing that stands out across the entries that win is the quality of the measurement thinking. The best campaigns do not just show results. They show that the team understood what they were trying to achieve, measured the right things to evaluate whether they achieved it, and were honest about the relationship between their activity and the outcome. That kind of rigour is rare, and it shows.
For teams using GA4, the transition from Universal Analytics has created both challenges and opportunities for more honest measurement. Moz has a useful piece on the GA4 features worth paying attention to, and separately on using GA4 for directional reporting rather than treating every number as precise. That framing, directional rather than definitive, is exactly the right way to think about most analytics data.
The Honest Approximation Principle
Perfect measurement does not exist. Attribution is always a model, not a fact. Customer journeys are non-linear, multi-device, and influenced by things that never appear in your analytics platform. Anyone who tells you their measurement framework gives them a precise picture of marketing’s contribution to revenue is either mistaken or selling you something.
That does not mean measurement is pointless. It means the goal is honest approximation, not false precision. You are trying to build a credible picture of what is working and what is not, good enough to make better decisions than you would make without it. That is a realistic and genuinely valuable objective.
Forrester’s perspective on aligning sales and marketing measurement without making them identical is useful here. Sales and marketing are measuring different things at different points in the customer experience. The mistake is pretending one set of metrics captures the full picture.
When I was turning around a loss-making agency, one of the first things I did was strip back the reporting to the metrics that were genuinely connected to commercial performance. Revenue per client. Margin by account. New business conversion rate. Client retention. Not because the other metrics were wrong, but because the business needed to understand what was actually driving its financial health, and the existing measurement framework was producing a lot of noise and very little signal.
The same principle applies to marketing measurement. If your reporting framework has forty metrics in it, you probably have no framework at all. You have a data dump. Constant measurement means consistently measuring the right things, not everything.
Why Value-Oriented Marketers Measure More Than Their Peers
There is a correlation between measurement discipline and commercial credibility. Marketers who measure consistently, who can speak to business outcomes rather than just activity metrics, who are honest about uncertainty while still making defensible recommendations, tend to earn more trust from the leadership teams they work with.
That matters practically. Marketing budgets are discretionary in the eyes of most finance directors. When times get difficult, the functions that cannot demonstrate their contribution to the business are the ones that get cut first. I have seen this happen repeatedly. Marketing teams that had been producing impressive-looking reports for years found themselves unable to make a credible case for their budget because the reports were full of activity metrics with no clear line to revenue.
Value-oriented marketers measure constantly because they understand that measurement is not just about optimising campaigns. It is about building the evidence base that justifies the function’s existence and earns the right to invest in longer-term, harder-to-measure brand-building activity. You cannot make the case for investing in upper-funnel brand work if you cannot demonstrate that your lower-funnel spend is efficient. The measurement disciplines are connected.
Email is one area where the measurement habit is easy to build and the data is relatively clean. HubSpot’s guide to email marketing reporting covers the metrics worth tracking and, more usefully, how to interpret them in context rather than in isolation. Open rates look very different depending on list quality, send frequency, and audience segment. That kind of contextual reading is what separates useful measurement from number-watching.
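The contextual reading described above can be sketched as a simple comparison against a segment baseline rather than a global average. The segment names and baseline open rates below are invented for illustration; in practice you would derive baselines from your own historical sends.

```python
# Sketch: the same open rate reads differently depending on segment.
# Baselines are hypothetical; real ones come from historical send data.
SEGMENT_BASELINES = {
    "engaged_customers": 0.45,
    "newsletter": 0.28,
    "cold_list": 0.12,
}

def contextual_open_rate(opens, delivered, segment):
    """Return the open rate and a verdict relative to the segment's baseline."""
    rate = opens / delivered
    baseline = SEGMENT_BASELINES[segment]
    verdict = "above baseline" if rate > baseline else "at or below baseline"
    return rate, verdict

rate, verdict = contextual_open_rate(300, 1000, "engaged_customers")
print(f"{rate:.0%} for engaged_customers: {verdict}")  # 30% reads as weak here

rate, verdict = contextual_open_rate(300, 1000, "cold_list")
print(f"{rate:.0%} for cold_list: {verdict}")  # the same 30% reads as strong
```

A 30% open rate is a problem on an engaged-customer list and a strong result on a cold one; the number only means something against its own baseline.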
MarketingProfs has also covered the fundamentals of using web analytics to inform marketing decisions in a way that has aged reasonably well. The tools have changed. The underlying discipline has not.
If You Fixed Measurement, Most of Marketing Would Fix Itself
This is a strong claim, but I believe it. If businesses could genuinely measure the true impact of their marketing activity on business performance, a significant proportion of what they are currently spending would be reduced or redirected. Not because marketing does not work, but because a lot of what gets called marketing is activity that continues regardless of whether it works, since nobody has built the measurement framework to know either way.
Fix measurement, and you create accountability. Accountability creates pressure to focus on what actually works. Pressure to focus on what works creates better briefs, smarter channel choices, and more honest conversations about what marketing can and cannot do for a business. The measurement discipline is the foundation that everything else sits on.
The marketers I have seen build genuinely strong careers, the ones who ended up with real influence over business strategy rather than just marketing execution, were almost always the ones who took measurement seriously early. Not because they were data scientists. Because they were commercially curious enough to want to know whether what they were doing was working, and honest enough to change course when the evidence said it was not.
If you want to build that kind of measurement practice, the Marketing Analytics and GA4 hub is a good place to work through the discipline systematically, from the foundational questions to the practical frameworks for building measurement into how your team operates day to day.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
