Competing on Analytics: Why Most Businesses Are Playing the Wrong Game

Competing on analytics means using data as a structural advantage, not a reporting function. It means building the capability to make faster, better-calibrated decisions than your competitors, consistently, across every channel and every budget cycle. Most businesses say they do this. Very few actually do.

The gap between companies that use analytics and companies that compete on analytics is not a technology gap. It is a thinking gap. The tools are largely the same. What differs is the discipline around how data gets used, challenged, and acted on.

Key Takeaways

  • Analytics is a competitive advantage only when it changes decisions, not when it generates reports that confirm existing ones.
  • Most businesses are data-rich and insight-poor. The problem is rarely collection; it is interpretation and action.
  • GA4 and similar tools give you a perspective on reality, not reality itself. Treating them as ground truth is a structural error.
  • UTM discipline, combined with behavioural data, closes most of the measurement gaps that teams blame on attribution complexity.
  • The companies that win on analytics build a culture of honest approximation rather than chasing false precision.

What Does It Actually Mean to Compete on Analytics?

The phrase gets used loosely. In practice, competing on analytics means your organisation has built a repeatable process for turning data into decisions that improve commercial outcomes. Not dashboards. Not reports. Decisions that change what you spend, where you spend it, what you say, and to whom.

I spent a number of years running agencies where the analytics conversation was almost always framed around capability. Which platform. Which attribution model. Which reporting stack. The organisations that actually outperformed their peers were not the ones with the most sophisticated tooling. They were the ones where someone in the room had the authority and the habit of asking: what does this data tell us to do differently?

That question sounds obvious. It is surprisingly rare in practice. Most analytics conversations in marketing end at description. What happened. How many sessions. What the conversion rate was. The analytical edge comes from the next step: why, and what now.

If you want a broader grounding in the tools and frameworks that support this kind of thinking, the Marketing Analytics hub covers the full landscape, from GA4 setup to measurement strategy.

Why Most Businesses Are Playing the Wrong Game

There is a version of analytics adoption that looks like competitive advantage but is not. It involves implementing GA4, connecting it to a CRM, building a Power BI dashboard, and presenting weekly numbers to a leadership team. This is infrastructure. It is necessary but not sufficient.

The wrong game is optimising for data volume rather than decision quality. It produces organisations that are simultaneously drowning in metrics and genuinely uncertain about what is working. I have sat in boardrooms with clients who had more data than they could read and less clarity than they needed. The analytics stack had grown faster than the analytical culture.

BCG published research on this dynamic in financial services, finding that the institutions that generated the most value from analytics were not those with the most data scientists. They were the ones that embedded analytical thinking into commercial decision-making at every level. The BCG analysis of analytics maturity in financial institutions is worth reading if you want the structural argument made rigorously. The principle applies across industries.

The right game is narrower and harder. It means identifying the three or four decisions in your marketing operation that, if made better, would have the most material impact on revenue. Then building the measurement infrastructure specifically to support those decisions. Not the other way around.

The Measurement Gaps That Undermine Competitive Advantage

Before you can compete on analytics, you need to be honest about the quality of your data. This is uncomfortable territory for most marketing teams, because the instinct is to present numbers with confidence rather than caveat them with uncertainty. But false precision is worse than acknowledged approximation. It sends organisations in the wrong direction with conviction.

GA4 is not a neutral observer. It samples, models, and estimates. It misses sessions, misattributes conversions, and handles cross-device behaviour imperfectly. This is not a criticism of Google. It is a structural feature of how web analytics works. Understanding why GA4 data is inherently imperfect is a prerequisite for using it well, not a reason to distrust it entirely.

The most common measurement gaps I see are:

  • Inconsistent or absent UTM tagging across paid channels, which collapses paid traffic into direct and makes channel-level analysis meaningless
  • No behavioural layer alongside quantitative data, so you know what users did but not why
  • Attribution models chosen for convenience rather than accuracy, producing ROAS figures that flatter last-click channels and undervalue upper-funnel activity
  • Conversion events that measure activity rather than value, such as form submissions counted equally regardless of lead quality

Each of these is fixable. None of them requires enterprise budget. Consistent UTM discipline, for example, costs nothing beyond process. Semrush’s guide to UTM tracking in GA4 is a practical starting point if your team needs to standardise its approach.
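
If it helps to make that process concrete, here is a minimal sketch of a UTM URL builder in TypeScript. The naming convention it enforces, lowercase hyphen-separated values and a fixed list of mediums, is illustrative rather than a standard, and the function and channel names are assumptions; adapt them to whatever convention your team agrees on.

```typescript
// A minimal UTM URL builder that enforces one shared naming convention:
// lowercase, hyphen-separated values, and a closed list of allowed mediums.
const ALLOWED_MEDIUMS = ['cpc', 'email', 'social', 'display'] as const;
type Medium = (typeof ALLOWED_MEDIUMS)[number];

function buildUtmUrl(base: string, source: string, medium: Medium, campaign: string): string {
  // Normalise free-text values so "Spring Sale 2024" and "spring-sale-2024" never diverge.
  const clean = (value: string) => value.trim().toLowerCase().replace(/\s+/g, '-');
  const url = new URL(base);
  url.searchParams.set('utm_source', clean(source));
  url.searchParams.set('utm_medium', medium);
  url.searchParams.set('utm_campaign', clean(campaign));
  return url.toString();
}

// Example:
// buildUtmUrl('https://example.com/offer', 'Google', 'cpc', 'Spring Sale 2024')
// -> "https://example.com/offer?utm_source=google&utm_medium=cpc&utm_campaign=spring-sale-2024"
```

The point of the type on medium is that an invalid channel name fails at build time rather than surfacing months later as an unexplained traffic bucket in GA4.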

How Behavioural Data Changes the Analytical Picture

Quantitative analytics tells you what happened. Behavioural analytics starts to tell you why. The combination is where genuine competitive advantage lives, because it gives you a mechanism for generating hypotheses that are grounded in observed user behaviour rather than assumption.

Early in my career, I worked on a site rebuild where the analytics showed high traffic to a product page and almost no conversions. The quantitative data told us there was a problem. It told us nothing about what the problem was. We ran session recordings and heatmaps and found that users were scrolling past the pricing section without reading it, because the layout buried the price below a long block of product copy. The fix took an afternoon. The conversion rate moved materially within a week.

That is a simple example, but it illustrates the structural point. Quantitative data surfaces the problem. Behavioural data identifies the cause. You need both to make decisions worth making.

Tools like Hotjar sit alongside GA4 to provide this layer. The integration between Hotjar and Google Analytics is worth setting up properly, because it lets you move from aggregate metrics to individual session context without switching tools mid-analysis. Similarly, Hotjar’s broader approach to complementing GA data explains how the two layers interact in practice.
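
As a rough illustration of what that dual layer looks like in code, the sketch below tags the same user action in both tools, assuming the standard Hotjar snippet and gtag.js are already installed on the page. The event name pricing_viewed and the wrapper function are hypothetical.

```typescript
// Ambient declarations for globals installed by the gtag.js and Hotjar snippets.
declare function gtag(...args: unknown[]): void;
declare function hj(...args: unknown[]): void;

// Tag one user action in both layers so the aggregate metric (GA4)
// and the session-level context (Hotjar) can be lined up afterwards.
function trackPricingViewed(): void {
  gtag('event', 'pricing_viewed'); // quantitative layer: GA4 custom event
  hj('event', 'pricing_viewed');   // behavioural layer: filter Hotjar recordings by this event
}
```

With both events sharing a name, a drop-off you spot in GA4 maps directly onto a filtered set of recordings, which is what lets you move from problem to cause without switching mental models.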

Building the Infrastructure That Supports Analytical Competition

Infrastructure here does not mean technology. It means the combination of tools, processes, and habits that allow data to flow reliably from collection to decision. Most organisations have the first element. The process and habit layers are where the gaps are.

When I was growing an agency from around 20 people to closer to 100, one of the hardest transitions was building analytical rigour into a team that had previously operated on instinct and experience. The instinct was good. But at scale, instinct without data produces inconsistent outcomes, because different people’s instincts point in different directions and there is no shared mechanism for resolving the disagreement.

What worked was not mandating more reporting. It was building a small number of shared metrics that everyone agreed were meaningful, and then running every significant decision through the lens of those metrics. Not as a bureaucratic exercise, but as a discipline. When someone proposed a new campaign approach, the first question was: how will we know if this worked? If the answer was vague, the proposal went back for revision.

The infrastructure elements that matter most, in rough order of priority:

  • Clean event tracking in GA4. If your conversion events are not firing correctly and consistently, everything downstream is unreliable. This is the foundation. It is not glamorous work, but it is the most important analytical work a marketing team can do. A minimal sketch follows this list.
  • UTM standardisation across all paid and owned channels. A naming convention that everyone follows, enforced by a shared template or a URL builder tool, removes one of the most common sources of analytical noise.
  • A single source of truth for performance data. Whether that is GA4, a CRM, or a BI tool that aggregates both, the organisation needs one place where the numbers are agreed and where decisions get made. Multiple competing dashboards with different numbers produce paralysis, not insight.
  • A testing process with statistical discipline. A/B testing is only useful if you understand what a meaningful result looks like. Integrating A/B testing with GA4 closes the loop between hypothesis, experiment, and outcome measurement. The second sketch after this list shows the arithmetic.
  • A regular cadence for acting on data. Weekly or fortnightly reviews where someone has the authority to change something based on what the data shows. Not just to note what the data shows.
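
To make the first item concrete, here is a minimal sketch using GA4's gtag.js event call. generate_lead is one of GA4's recommended events; the value figure is an assumption standing in for whatever your CRM says a qualified lead is actually worth.

```typescript
// Ambient declaration for the global installed by the gtag.js snippet.
declare function gtag(...args: unknown[]): void;

// Fire a conversion event that carries value, so conversions reflect
// lead quality rather than counting every form submission equally.
gtag('event', 'generate_lead', {
  currency: 'GBP',
  value: 150, // illustrative: estimated lead value derived from CRM close rates
});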
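And for the testing item, a sketch of the simplest honest significance check, a two-proportion z-test. The visitor and conversion counts are hypothetical.

```typescript
// Two-proportion z-test for an A/B conversion experiment.
// Returns the z-score; |z| > 1.96 is roughly significant at the 95% level (two-sided).
function twoProportionZTest(convA: number, visitorsA: number, convB: number, visitorsB: number): number {
  const rateA = convA / visitorsA;
  const rateB = convB / visitorsB;
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const stdErr = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (rateB - rateA) / stdErr;
}

// Hypothetical test: 120 conversions from 4,000 visitors vs 150 from 4,000.
console.log(twoProportionZTest(120, 4000, 150, 4000).toFixed(2)); // ≈ 1.86: suggestive, not conclusive
```

A variant that "looks better" in the dashboard but scores a z of 1.86 is exactly the case where the discipline matters: you keep the test running rather than declaring a winner.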

The Cultural Dimension Most Analytics Frameworks Miss

Analytics culture is not about hiring data scientists. Most marketing teams do not need a data scientist. They need people who are comfortable with numbers, honest about uncertainty, and willing to be wrong.

The organisations that genuinely compete on analytics share a specific cultural trait: they treat data as a challenge to their assumptions rather than a confirmation of them. This is harder than it sounds. Confirmation bias in analytics is pervasive. Teams run reports until they find a cut of the data that supports the decision they had already made. This is not dishonesty. It is human nature. But it is analytically corrosive.

I judged the Effie Awards for a period, which meant reviewing effectiveness cases from some of the best-resourced marketing organisations in the world. The cases that stood out were not the ones with the most sophisticated measurement frameworks. They were the ones where the team had been willing to report honestly on what had not worked, what had surprised them, and what they had changed as a result. That intellectual honesty is itself a competitive advantage, because it produces faster learning cycles than organisations that only report success.

The MarketingProfs perspective on using web analytics effectively as a marketer touches on this, particularly the point about using data to ask better questions rather than to find better-looking answers.

Where Analytical Advantage Actually Shows Up in Commercial Outcomes

The commercial case for competing on analytics is not abstract. It shows up in specific, measurable ways.

Early in my paid search career, I ran a campaign for a music festival through lastminute.com. The campaign was not sophisticated by current standards. But because we had clean tracking, clear conversion goals, and a daily habit of checking what was working, we were able to shift budget toward the keywords and ad groups that were generating revenue and away from the ones that were not. Within roughly a day, the campaign had generated six figures of revenue. The analytical advantage was not the campaign itself. It was the speed of the feedback loop and the willingness to act on it.

That principle scales. Organisations that have built the infrastructure and culture to act on data faster than their competitors will, over time, allocate budget more efficiently, identify winning creative faster, and exit losing channels sooner. The compounding effect of those marginal improvements is significant over a 12-month budget cycle.
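
The compounding claim is easy to sanity-check with basic arithmetic. Assuming, purely for illustration, a 2 percent improvement in budget efficiency each month:

```typescript
// Illustrative compounding of small monthly efficiency gains over a budget year.
// The 2% monthly figure is an assumption for the sake of the arithmetic.
const monthlyGain = 0.02;
const yearMultiplier = Math.pow(1 + monthlyGain, 12);
console.log(yearMultiplier.toFixed(2)); // ≈ 1.27: roughly 27% more efficient by month 12
```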

The specific areas where analytical advantage tends to be most commercially visible:

  • Budget allocation across channels. Organisations with clean cross-channel data can shift spend toward higher-performing channels faster. Those without it tend to maintain historical allocations regardless of current performance.
  • Creative testing and iteration. Teams that measure creative performance rigorously and retire underperforming assets quickly outperform teams that run creative on gut feel or internal preference.
  • Audience segmentation and targeting. Analytics that connect campaign data to CRM outcomes allow for progressively more refined audience targeting. This compounds over time as the data set grows.
  • Landing page and conversion rate optimisation. The combination of quantitative and behavioural data makes CRO faster and more reliable. Each test produces a learning that informs the next.

The Honest Approximation Principle

Perfect measurement does not exist in marketing. Anyone who tells you otherwise is selling something. Cross-device journeys, offline conversions, view-through attribution, the contribution of brand to performance: these are all genuinely difficult to measure with precision, and the honest answer is that you are always working with approximations.

The competitive advantage comes from having better approximations than your competitors, and from being honest about the limits of those approximations so that you do not make decisions based on false confidence.

I have seen campaigns killed because the attribution model made them look unprofitable, when the reality was that the attribution model was wrong and the campaign was actually driving significant assisted revenue. I have also seen campaigns scaled aggressively because the numbers looked good, when the reality was that the tracking was broken and the numbers were meaningless. Both errors are expensive. Both are avoidable with the right analytical discipline.

The discipline is not about finding perfect data. It is about understanding the quality of the data you have, building decisions that are robust to its imperfections, and maintaining the intellectual honesty to say “we think this is working, and here is the evidence, and here are the limitations of that evidence.”

That is what competing on analytics looks like in practice. Not a dashboard. Not a tool. A habit of thinking that treats data as a serious input to commercial decisions rather than a post-rationalisation of them.

For more on building the measurement foundations that support this kind of thinking, the Marketing Analytics hub covers everything from GA4 configuration to attribution strategy in depth.

About the Author

Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.

Frequently Asked Questions

What does competing on analytics mean for a marketing team?
Competing on analytics means using data as a structural input to commercial decisions rather than a reporting function. It requires clean measurement infrastructure, a process for acting on data quickly, and a culture that treats data as a challenge to assumptions rather than a confirmation of them. The competitive advantage comes from making better-calibrated decisions faster than your competitors, consistently across budget cycles.
How accurate is GA4 data for making marketing decisions?
GA4 data is a useful approximation, not a precise record of reality. It samples, models, and estimates in ways that introduce systematic gaps, particularly around cross-device journeys and direct traffic attribution. The right response is not to distrust it entirely, but to understand its limitations and build decisions that are robust to them. Combining GA4 with UTM tracking and behavioural data tools significantly improves the quality of the analytical picture.
What is the most important thing a marketing team can do to improve its analytics capability?
Fix the foundations before adding complexity. That means ensuring GA4 conversion events are firing correctly and consistently, standardising UTM naming conventions across all paid and owned channels, and establishing a single agreed source of truth for performance data. Most teams that struggle with analytics have a data quality problem, not a data volume problem. Reliable data from a small number of well-chosen metrics is more useful than unreliable data from a comprehensive reporting stack.
How do behavioural analytics tools complement GA4?
GA4 tells you what users did: which pages they visited, where they dropped off, which events fired. Behavioural tools like Hotjar tell you why, through session recordings, heatmaps, and scroll maps that show how users actually interacted with a page. The combination allows you to move from identifying a problem in the quantitative data to diagnosing its cause in the behavioural data, which makes optimisation faster and more reliable than either data source alone.
Can small marketing teams realistically compete on analytics against larger organisations?
Yes, and often more effectively. Analytical advantage is not primarily about budget or headcount. It is about the quality of your data, the speed of your feedback loops, and the discipline with which you act on what the data shows. Smaller teams with clean tracking, consistent UTM discipline, and a habit of acting on data weekly can outperform larger organisations that have more sophisticated tooling but slower decision-making processes. The compounding advantage of faster learning cycles is significant over a 12-month period.
