Google Analytics Goals: What They Can’t Track
Google Analytics Goals, and their GA4 equivalent conversion events, cannot track offline conversions, cross-device user journeys, phone call outcomes, in-store behaviour, or any interaction that happens outside the browser session. They also cannot attribute revenue to the full sequence of touchpoints that led to a conversion, capture what users intended but did not do, or measure the quality of a conversion after it happens.
That is not a criticism of the tool. It is a description of its boundaries, and those boundaries matter enormously if you are using Goals data to make budget decisions.
Key Takeaways
- Google Analytics Goals track on-site behaviour only. Any conversion that begins or ends outside the browser session is invisible to the platform by default.
- Cross-device journeys are systematically undercounted. A user who researches on mobile and converts on desktop will often appear as two separate, unconnected users.
- Goals measure that a conversion happened, not why it happened or what it was worth. Post-conversion quality, customer lifetime value, and churn are outside its scope.
- Referral data loss, bot traffic, and implementation inconsistencies mean Goals figures are directional signals, not precise counts. Treating them as exact numbers is a measurement error in itself.
- GA4 conversion events extend some capabilities but do not resolve the fundamental gaps around offline behaviour, intent, or multi-touch attribution across long sales cycles.
In This Article
- Why the Gaps in Goals Data Are Structural, Not Technical
- Offline Conversions and Phone Call Outcomes
- Cross-Device User Journeys
- In-Store Behaviour and Physical World Interactions
- Post-Conversion Quality and Customer Lifetime Value
- Intent and Untracked Behaviour
- Referral Data Loss and Session Attribution Distortion
- Multi-Touch Attribution Across Long Sales Cycles
- Emerging Channels and Non-Standard Conversion Paths
- What to Do With These Gaps
I have spent a significant part of my career sitting across the table from clients who treat their Google Analytics numbers as ground truth. When I was running agencies, I watched performance teams optimise campaigns toward Goal completions with complete confidence, only to discover months later that the conversions they were chasing bore almost no relationship to actual revenue. The tool was working exactly as designed. The problem was the assumption that it was telling the whole story.
Why the Gaps in Goals Data Are Structural, Not Technical
Most conversations about Google Analytics limitations focus on implementation problems: missing tags, incorrect filters, untracked subdomains. Those are real issues, but they are fixable. The gaps I am describing here are structural. They exist because of what Goals were designed to do, which is measure discrete on-site events within a browser session. That design choice creates hard limits that no amount of correct implementation will resolve.
If you want a broader grounding in how analytics data should be interpreted across the measurement stack, the Marketing Analytics hub covers the full picture, including GA4, attribution modelling, and how to build a measurement framework that does not collapse when the data gets messy.
The structural gaps fall into several categories. Understanding them is the first step toward building a measurement approach that compensates for them, rather than pretending they do not exist.
Offline Conversions and Phone Call Outcomes
Google Analytics Goals cannot track what happens after a user leaves your website. If someone fills in a contact form and then calls your sales team, the form submission is tracked. The phone call, the sales conversation, and whether that conversation resulted in a closed deal are not. If your business model involves any human interaction between initial contact and revenue, you have a measurement gap by default.
This is particularly acute in B2B, professional services, high-ticket retail, and any sector where the sales cycle extends beyond a single session. I managed accounts in financial services and automotive for years where the average time from first website visit to signed contract was measured in weeks. During that period, the prospect would call multiple times, meet with a consultant, receive documents by email, and attend a product demonstration. Google Analytics would record the initial visit and, if configured correctly, the eventual form submission. Everything in between was a black box.
Call tracking platforms can bridge part of this gap by assigning dynamic phone numbers to sessions and passing call data back into GA4. But call tracking only tells you that a call happened and how long it lasted. It does not tell you what was discussed or whether the call converted to a sale. For that, you need CRM integration, and most businesses have not built that connection properly.
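For teams that do build the connection, the mechanics of passing a call outcome back into GA4 can be sketched with the Measurement Protocol. The snippet below is illustrative only: the measurement ID, API secret, event name `call_closed_deal`, and deal values are all hypothetical placeholders, and the payload is built but not actually sent.

```python
import json

# Hypothetical values -- substitute your own GA4 Measurement ID, API
# secret, and the client_id captured when the lead first submitted a form.
MEASUREMENT_ID = "G-XXXXXXX"
API_SECRET = "your-api-secret"


def build_call_outcome_event(client_id: str, deal_value: float,
                             currency: str = "GBP") -> dict:
    """Build a GA4 Measurement Protocol payload for an offline call outcome.

    'call_closed_deal' is a made-up custom event name; the payload shape
    (client_id plus a list of named events) follows the Measurement
    Protocol's documented JSON format.
    """
    return {
        "client_id": client_id,
        "events": [
            {
                "name": "call_closed_deal",
                "params": {"value": deal_value, "currency": currency},
            }
        ],
    }


payload = build_call_outcome_event("123456.7890123", 4200.0)

# To actually send it, POST this JSON to
# https://www.google-analytics.com/mp/collect?measurement_id=...&api_secret=...
print(json.dumps(payload))
```

The hard part is not the payload; it is keeping the `client_id` associated with the lead in your CRM so the offline outcome can be joined back to the original session.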
Cross-Device User Journeys
Cookie-based analytics platforms, including GA4 in its default configuration, assign identifiers to browsers and devices rather than to people. A user who discovers your brand on a mobile device, researches further on a work laptop, and converts on a home desktop will appear in your data as three separate users with three separate sessions. The Goal completion will be attributed to the final session only, and the preceding touchpoints will receive no credit.
GA4 introduced User-ID tracking and Google Signals to partially address this. User-ID works when users are logged in across devices, which requires a login system and user consent. Google Signals uses aggregated, anonymised data from users who have opted into ads personalisation, and it cannot be used for individual-level reporting. In practice, for most businesses, cross-device attribution remains significantly incomplete.
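A toy simulation makes the undercount concrete. The session log, client IDs, and `user-42` identifier below are all invented; the point is simply that device-scoped identifiers count one person as three users, while a consistent User-ID would collapse them to one.

```python
# Hypothetical session log: the same person on three devices. Without a
# User-ID, each browser gets its own client_id, so default reporting
# counts three "users".
sessions = [
    {"client_id": "mobile-abc", "user_id": None},
    {"client_id": "laptop-def", "user_id": None},
    {"client_id": "desktop-ghi", "user_id": "user-42"},
]

users_by_client_id = len({s["client_id"] for s in sessions})

# If every session carried the same logged-in user_id, a User-ID based
# reporting identity could stitch them into a single user instead.
stitched = [dict(s, user_id="user-42") for s in sessions]
users_by_user_id = len({s["user_id"] for s in stitched})

print(users_by_client_id, users_by_user_id)  # 3 users vs 1 user
```

Scale that gap across every multi-device visitor and the distortion in your channel reporting becomes obvious.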
This matters most for brands with long consideration cycles. If you are running top-of-funnel campaigns on mobile and bottom-of-funnel campaigns on desktop, your Goals data will systematically overvalue the desktop campaigns and undervalue the mobile ones. Budget decisions made on that basis will consistently defund the channels that are actually building demand. I have seen this pattern repeated across multiple agency relationships, and it is one of the reasons attribution theory in marketing deserves more serious attention than most teams give it.
In-Store Behaviour and Physical World Interactions
Google Analytics operates entirely within the digital environment. Any interaction that occurs in a physical space is invisible to it. If your website drives footfall to a store, that footfall is not captured in Goals unless you have built a specific mechanism to connect the two, such as a store locator click event, a click-to-map interaction, or an in-store loyalty programme with digital touchpoints.
Even with those mechanisms in place, you are measuring a proxy behaviour rather than the actual outcome. The number of people who clicked your store locator is not the same as the number of people who visited your store. The number of people who visited your store is not the same as the number who made a purchase. Each step in that chain requires a separate data source, and connecting them into a coherent measurement framework is genuinely difficult work that most analytics implementations have not done.
Retailers who invest in proper omnichannel measurement, connecting point-of-sale data to digital campaign data through loyalty programmes or matched panels, get a materially more accurate picture of what their marketing is doing. But that is a significant infrastructure investment, and it sits entirely outside what Google Analytics Goals can provide on their own.
Post-Conversion Quality and Customer Lifetime Value
A Goal completion tells you that a defined event occurred. It does not tell you anything about the quality or value of what followed. A lead form submission is a lead form submission whether the person who submitted it became a six-figure client or unsubscribed from your email list within 48 hours. An e-commerce transaction is recorded as a conversion whether the order was returned the following day or the customer went on to purchase twelve more times over the next three years.
This is not a minor limitation. If you are optimising paid media campaigns toward Goal completions without any signal about post-conversion quality, you are training your algorithms on incomplete data. Smart bidding in Google Ads, for example, will optimise toward whatever conversion signal you provide. If that signal is “form submitted” rather than “qualified lead” or “closed deal”, the algorithm will find the cheapest path to form submissions, which is not necessarily the path to revenue.
Passing enhanced conversion values back into GA4 and your ad platforms, connected to CRM data about actual deal outcomes, is the correct approach. It requires CRM integration, clean data pipelines, and a commitment to maintaining them. Most businesses I have worked with have not built this, which means their optimisation loops are running on a distorted signal. This connects directly to the broader question of which KPIs are most likely to be vanity metrics, because a Goal completion rate that has no relationship to revenue is precisely that.
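As a hedged sketch of that passback logic, the snippet below maps hypothetical CRM outcomes to the value you might send back as a conversion value. The lead records, stage names, and figures are invented, and a real pipeline would push these values via your ad platform's offline conversion import rather than a print statement.

```python
# Hypothetical CRM export: every row is a form submission that GA4
# recorded identically, but the downstream outcomes differ wildly.
crm_leads = [
    {"lead_id": "L1", "stage": "closed_won", "deal_value": 120000.0},
    {"lead_id": "L2", "stage": "unqualified", "deal_value": 0.0},
    {"lead_id": "L3", "stage": "closed_won", "deal_value": 8500.0},
]


def conversion_value(lead: dict) -> float:
    """Value to pass back to the ad platform: real revenue for closed
    deals, zero otherwise, so bidding optimises toward revenue rather
    than raw form submissions."""
    return lead["deal_value"] if lead["stage"] == "closed_won" else 0.0


values = {lead["lead_id"]: conversion_value(lead) for lead in crm_leads}
print(values)  # {'L1': 120000.0, 'L2': 0.0, 'L3': 8500.0}
```

Even this crude mapping gives a bidding algorithm a dramatically better signal than treating all three submissions as equal.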
Intent and Untracked Behaviour
Google Analytics Goals can only track what users do. They cannot track what users intended to do, what prevented them from doing it, or what they were thinking when they made a decision. A user who abandoned a checkout at the payment stage looks identical in Goals data to a user who abandoned because the product was out of stock, a user who abandoned because the delivery cost was too high, and a user who abandoned because they were interrupted and intended to return later.
Understanding why users behave the way they do requires qualitative research, session recording, and on-site surveys. Tools like Hotjar complement Google Analytics precisely because they capture the context behind the numbers. GA4 tells you that 68% of users dropped off at step three of your checkout. Hotjar shows you that most of them were scrolling back and forth on the delivery options section before leaving. Those are two very different pieces of information, and only one of them tells you what to fix.
I have always been sceptical of analytics reviews that consist entirely of numbers without any qualitative layer. When I was growing an agency from 20 to around 100 people, one of the disciplines I tried to instil was the habit of watching users actually interact with client sites before drawing conclusions from the data. The numbers tell you where the problem is. Watching real people tells you what the problem is. Google Analytics Goals, however well configured, only give you the first half of that picture.
Referral Data Loss and Session Attribution Distortion
Even within the on-site sessions that Google Analytics does track, the attribution of those sessions to traffic sources is subject to systematic distortion. Referral data loss occurs when the HTTP referrer header is not passed correctly, which happens regularly with HTTPS-to-HTTP transitions, certain browser privacy settings, link shorteners, and social media apps that open links in in-app browsers. When referral data is lost, traffic is typically classified as direct, which inflates direct traffic figures and deflates the channels that actually drove the visit.
Bot traffic, unless filtered correctly, inflates session counts and can trigger Goal completions artificially. UTM parameter inconsistency, where campaigns are tagged inconsistently or not at all, creates fragmented channel data that makes it impossible to compare performance accurately across time periods or campaigns.
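A small normalisation step, applied before links go live or as part of a tagging audit, removes much of that UTM fragmentation. The URLs below are hypothetical; the sketch lowercases source and medium and unifies separators so that one channel does not split into several reporting rows.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical campaign URLs tagged inconsistently by different teams.
urls = [
    "https://example.com/?utm_source=Facebook&utm_medium=Paid-Social",
    "https://example.com/?utm_source=facebook&utm_medium=paid_social",
]


def normalise_utm(url: str) -> tuple:
    """Lowercase source/medium and unify separators so the same channel
    is not fragmented across several reporting rows."""
    params = parse_qs(urlparse(url).query)
    source = params.get("utm_source", [""])[0].lower()
    medium = params.get("utm_medium", [""])[0].lower().replace("-", "_")
    return (source, medium)


channels = {normalise_utm(u) for u in urls}
print(channels)  # both URLs collapse to {('facebook', 'paid_social')}
```

A shared tagging convention documented once, and enforced with a check like this, is cheaper than untangling a year of fragmented channel data later.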
The cumulative effect of these distortions is that even the data Google Analytics does capture is an approximation. I have held this view for a long time: GA, GA4, Adobe Analytics, and Search Console all provide perspectives on reality, not reality itself. Trends and directional movement are what matter. Treating any specific number as precise is a mistake the tool’s design does not warrant.
This is also why Google Analytics alternatives have continued to attract serious attention. Platforms like Heap approach data collection differently from GA, capturing all interactions by default rather than requiring pre-configured events, which changes the nature of what gets measured and what gets missed.
Multi-Touch Attribution Across Long Sales Cycles
Google Analytics Goals, in their standard configuration, attribute conversions using a last-click or last non-direct click model. This means that whatever touchpoint immediately preceded the Goal completion receives full credit for the conversion. Every other touchpoint in the journey receives nothing.
For short, simple purchase journeys, this is a reasonable approximation. For anything more complex, it is a systematic misrepresentation of how marketing actually works. A user who saw a display ad three weeks ago, read a blog post last week, clicked a paid search ad yesterday, and converted today will be recorded as a paid search conversion. The display campaign and the content that built familiarity and consideration receive no credit and will be defunded accordingly.
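The journey just described can be played out in a few lines. The channel names and the 100-unit conversion value are placeholders; the sketch contrasts last-click credit with a simple linear model to show how much signal the default model discards.

```python
# The journey described above, as an ordered list of touchpoints.
journey = ["display", "organic_blog", "paid_search"]
value = 100.0

# Last-click: the final touchpoint takes everything.
last_click = {channel: 0.0 for channel in journey}
last_click[journey[-1]] = value

# Linear, one simple multi-touch alternative: equal credit per touchpoint.
linear = {channel: value / len(journey) for channel in journey}

print(last_click)  # {'display': 0.0, 'organic_blog': 0.0, 'paid_search': 100.0}
print(linear)      # each touchpoint credited with a third of the value
```

Linear is not the "right" model either, but the comparison makes the budgeting consequence visible: under last-click, display and content report zero return and get cut.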
GA4 offers data-driven attribution as a model, which distributes credit across touchpoints based on observed conversion patterns. This is a genuine improvement. But it still operates only on the touchpoints that GA4 can see, which excludes offline interactions, cross-device sessions where users are not identified, and any channel that does not pass data into the GA4 property. If you are running email campaigns, affiliate programmes, or any form of offline marketing, those channels are likely underrepresented in even the most sophisticated attribution model GA4 can produce.
Measuring affiliate marketing incrementality is a good example of where GA4 Goals fall short in practice. Last-click attribution systematically overstates affiliate contribution because affiliates often operate at the bottom of the funnel, capturing users who would have converted anyway. Incrementality testing requires a different methodology entirely, one that GA4 Goals are not built to support.
Emerging Channels and Non-Standard Conversion Paths
Google Analytics Goals were designed around a relatively conventional web experience: user visits site, user completes an action, action is recorded. As marketing has expanded into new channels and formats, the limitations of that model have become more visible.
AI-generated content in search results, for example, is beginning to intercept queries before users reach a website at all. If a user finds the answer to their question in a generative AI summary and never clicks through to your site, that interaction is invisible to GA4. Measuring the impact of your brand’s presence in these environments requires entirely different approaches. Understanding how to measure the success of generative engine optimisation campaigns is becoming a legitimate measurement challenge that sits completely outside what Goals can address.
Similarly, newer formats like AI avatar marketing create attribution questions that standard Goals configurations are not equipped to handle. If a user interacts with an AI avatar on a third-party platform and then converts on your website days later, the connection between those two events requires custom tracking architecture. The question of how to measure the effectiveness of AI avatars in marketing is a useful illustration of how measurement frameworks need to evolve alongside the channels they are supposed to be measuring.
Custom event tracking in GA4 extends what is possible significantly compared to the older Goals system. GA4 custom event tracking allows teams to capture a much wider range of interactions, and GA4 intelligence events can surface anomalies automatically. But these capabilities still operate within the same structural constraints: they measure what happens on your digital properties, within sessions that GA4 can identify, through channels that pass data correctly. The boundaries of the tool have not changed. The precision within those boundaries has improved.
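Even with that wider scope, GA4 imposes naming constraints on custom events. The validator below is a sketch of the documented limits as I understand them at the time of writing: names start with a letter, use only letters, digits, and underscores, and run to at most 40 characters. Check the current GA4 documentation before relying on it, and note it does not cover Google's reserved event names.

```python
import re

# Approximate GA4 custom event name rules: first character a letter,
# then letters/digits/underscores, 40 characters maximum in total.
EVENT_NAME_RE = re.compile(r"^[A-Za-z][A-Za-z0-9_]{0,39}$")


def is_valid_event_name(name: str) -> bool:
    """Return True if the name looks like a permissible GA4 event name."""
    return bool(EVENT_NAME_RE.fullmatch(name))


print(is_valid_event_name("brochure_download"))  # True
print(is_valid_event_name("2024_promo_click"))   # False: starts with a digit
```

Validating names before events ship is worth the minute it takes: a rejected or silently mangled event name is one more way for conversion data to quietly go missing.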
What to Do With These Gaps
The answer is not to abandon Google Analytics Goals or to treat the data as useless. The answer is to build a measurement framework that uses GA4 data for what it is genuinely good at, which is understanding on-site behaviour, identifying friction points, and tracking directional trends, while supplementing it with data sources that cover the gaps.
CRM integration connects post-conversion quality data back to campaign performance. Call tracking bridges phone-based conversions. Customer surveys and session recording tools capture intent and qualitative context. Incrementality testing and matched market experiments provide causal evidence that attribution models cannot. A properly structured inbound marketing ROI framework, for example, should pull from multiple data sources, not rely on GA Goals as its single source of truth.
The businesses that get measurement right are not the ones with the most sophisticated analytics configuration. They are the ones that are honest about what their data can and cannot tell them, and who build decision-making processes that account for that uncertainty rather than pretending it does not exist.
I spent time judging the Effie Awards, which are specifically about marketing effectiveness, and the entries that stood out were not the ones with the most impressive dashboards. They were the ones where the teams had thought carefully about what they were measuring, why it mattered, and how confident they could reasonably be in their conclusions. That kind of intellectual honesty about measurement is rarer than it should be.
If you are building or reviewing your analytics stack, the Marketing Analytics hub is a useful reference point for thinking about measurement more systematically, covering everything from GA4 implementation to attribution modelling and how to connect digital data to commercial outcomes.
You can also run A/B testing within GA4 to validate hypotheses about user behaviour, which is one of the more practical ways to generate directional insight from the data you do have, even when that data has the limitations described above.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
