The LockPickingLawyer Phrase Every Analyst Needs to Hear
The LockPickingLawyer is a YouTube channel where a man picks locks. He opens padlocks, deadbolts, and high-security mechanisms that are supposed to be impenetrable, usually in under a minute, often in under thirty seconds. His most repeated phrase, delivered in a flat, unhurried voice, is: “As you can see, this lock does exactly what it’s supposed to do. It just doesn’t do what you think it does.” That sentence has been living rent-free in my head for about three years, because it describes GA4 more precisely than anything I’ve read in a marketing analytics article.
GA4 does exactly what it is supposed to do. It tracks events, models conversions, attributes sessions, and surfaces patterns in user behaviour. What it does not do is tell you what is actually happening in your business. That distinction matters enormously, and most teams miss it entirely.
Key Takeaways
- GA4 reports on a model of your data, not your data itself. Sampling, consent-based gaps, and session logic all mean the numbers are an approximation, not a census.
- The most dangerous analytics failure is not missing data. It is confidently acting on data that looks complete but is not.
- Attribution in GA4 is a convention, not a truth. The same conversion can be legitimately claimed by multiple channels depending on which model you choose.
- Most teams spend more time building dashboards than interrogating what those dashboards actually measure. The output looks rigorous. The foundation often is not.
- Good analytics practice is not about finding the perfect tool. It is about knowing precisely what your current tool can and cannot tell you.
I have spent a lot of time inside analytics setups across a wide range of industries. When I was growing the team at iProspect from around twenty people to over a hundred, one of the recurring friction points was the gap between what clients believed their analytics were telling them and what the data could actually support. Not because the tools were broken. Because nobody had clearly defined what the tools were measuring, and more importantly, what they were not.
What Does GA4 Actually Measure?
This sounds like a basic question. It is not. Most people who use GA4 daily could not give you a clean answer to it, and that includes a lot of experienced marketers.
GA4 measures events. Everything is an event: a page view, a scroll, a click, a form submission, a purchase. Sessions are constructed from those events using a set of rules about timing and attribution windows. Users are identified through a combination of device IDs, cookies, and, where available, User-IDs. Conversions are events that you have designated as meaningful. All of this is then filtered through your consent configuration, your tag setup, and Google’s own modelling layer, which fills in gaps where consent has not been given.
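To make the "sessions are constructed from events" point concrete, here is a toy sketch of inactivity-based sessionization. GA4's default session timeout is thirty minutes of inactivity; everything else here, including the timestamps, is illustrative and deliberately simpler than GA4's actual logic.

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)  # GA4's default inactivity window

def sessionize(timestamps):
    """Group one user's event timestamps into sessions.

    Illustrative only: GA4's real session logic handles additional
    edge cases this sketch ignores.
    """
    sessions, current, last_ts = [], [], None
    for ts in sorted(timestamps):
        if last_ts is not None and ts - last_ts > SESSION_TIMEOUT:
            sessions.append(current)  # gap too long: close the session
            current = []
        current.append(ts)
        last_ts = ts
    if current:
        sessions.append(current)
    return sessions

events = [
    datetime(2024, 5, 1, 9, 0),   # morning visit
    datetime(2024, 5, 1, 9, 5),   # five minutes later, same session
    datetime(2024, 5, 1, 13, 0),  # hours later: a new session begins
]
print(len(sessionize(events)))  # 2
```

The point of the sketch is that "sessions" are an output of rules applied to events, not something the user actually did. Change the timeout and the session count changes, with no change in behaviour on the website.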
That modelling layer is important. GA4 does not simply drop non-consenting users from your data and leave a blank space. It estimates their behaviour based on patterns from users who did consent. This is not a flaw. It is a deliberate design choice. But it means that when you look at your GA4 traffic numbers, you are looking at a blend of observed data and modelled data, and the dashboard does not always make that distinction obvious.
If you want to understand the mechanics of this more deeply, and why some teams are moving raw event data into BigQuery to work with it directly, Moz has a useful Whiteboard Friday on exporting GA4 data to BigQuery that covers the practical reasons clearly.
The broader point is this: GA4 is not a window onto your business. It is a model of a subset of your user interactions, constructed using a set of rules and assumptions that you may or may not have examined. That model is useful. It is often very useful. But it is not the same thing as reality.
The Lock That Does Exactly What It Is Supposed To Do
Here is where the LockPickingLawyer phrase earns its keep.
The padlock on your shed is not fraudulent. It deters casual opportunists. It satisfies an insurance requirement. It signals that the contents are not freely available. It does all of that perfectly well. What it does not do is stop someone with a tension wrench and forty seconds of patience. The problem is not the lock. The problem is believing the lock does something it was never designed to do.
GA4 is the same. It does what it is designed to do. It gives you a scalable, free, event-based analytics platform with reasonable attribution modelling and decent integration with Google’s advertising ecosystem. What it does not do is give you a complete, unbiased, consent-independent picture of every interaction your customers have with your brand. Nobody ever said it did. But plenty of teams act as though it does.
I have sat in boardroom conversations where a marketing director has presented GA4 conversion data as though it were financial reporting. The numbers had two decimal places. The slides looked authoritative. Nobody in the room asked about consent rates, about cross-device gaps, about the difference between last-click and data-driven attribution. The lock looked solid. Nobody tried to pick it.
If you are building a broader analytics practice and want to understand how GA4 sits within a wider measurement ecosystem, the Marketing Analytics hub at The Marketing Juice covers the full landscape, from attribution to reporting to the tools that complement or replace GA4 depending on your needs.
Where the Gaps Actually Live
Let me be specific about where GA4’s model diverges from reality, because vague warnings about “data limitations” are not useful. You need to know exactly where the cracks are.
Consent gaps. In markets with GDPR or similar frameworks, a meaningful percentage of your users will decline analytics cookies. Depending on your sector and your consent banner design, that could be anywhere from 15% to well over 40% of sessions. GA4’s modelling fills some of this in, but the model is trained on consenting users, who are not a random sample of your audience. People who decline cookies tend to be more privacy-conscious, often more technically sophisticated, and in some sectors, more likely to be your higher-value customers. You are not just missing data. You are potentially missing a specific type of user.
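The bias problem is easy to show with arithmetic. In this sketch the consent rate and both conversion rates are entirely hypothetical numbers chosen to illustrate the mechanism: if decliners behave differently from consenters, a model trained only on consenters misestimates the blended truth.

```python
# All figures below are made up for illustration.
consent_rate = 0.70       # share of users who accept analytics cookies
cr_consenting = 0.030     # observed conversion rate among consenters
cr_declining = 0.045      # true (unobservable) rate among decliners

# The true blended conversion rate across all users:
true_cr = consent_rate * cr_consenting + (1 - consent_rate) * cr_declining

# A model trained only on consenters implicitly assumes decliners
# behave the same way:
modelled_cr = cr_consenting

print(f"true: {true_cr:.2%}, modelled: {modelled_cr:.2%}")
# The modelled figure understates reality whenever decliners
# convert at a higher rate than consenters, and overstates it
# when the skew runs the other way.
```

The direction and size of the error depend entirely on how non-random the consent decision is, which is exactly the thing the dashboard cannot tell you.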
Cross-device attribution. A user sees your brand on their phone during a commute, researches further on their laptop that evening, and converts on a tablet the next morning. Unless you have a logged-in user experience and a User-ID implementation, GA4 sees three separate users and attributes the conversion to whatever touchpoint happened last on the converting device. The earlier touchpoints disappear. This is not a GA4 problem specifically. It is a measurement problem that affects every cookie-based analytics tool. But it means your channel-level attribution data is structurally biased toward bottom-of-funnel, direct, and branded search, which are the channels closest to conversion.
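Here is the phone-laptop-tablet journey above as a toy counting exercise. The IDs are invented; the mechanism is the real one: without a shared User-ID, each device-scoped identifier looks like a distinct person.

```python
# Three hits from one human on three devices. IDs are made up.
hits = [
    {"device_id": "phone-abc",  "user_id": None},
    {"device_id": "laptop-def", "user_id": None},
    {"device_id": "tablet-ghi", "user_id": None},
]

def count_users(hits):
    # Fall back to the device-scoped ID when no logged-in User-ID exists
    return len({h["user_id"] or h["device_id"] for h in hits})

print(count_users(hits))  # 3 "users" without identity stitching

# With a logged-in experience, all three hits carry one User-ID:
for h in hits:
    h["user_id"] = "customer-42"
print(count_users(hits))  # 1 user
```

One person, three "users", and the conversion credited to whichever touchpoint the converting device saw last. That is the structural bias toward bottom-of-funnel channels in miniature.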
Attribution model selection. GA4 defaults to data-driven attribution, which sounds rigorous but is itself a model with assumptions. Switch to last click and your paid search numbers go up. Switch to first click and your display and awareness channels suddenly look more valuable. The conversion happened once. The model decides who gets credit. Search Engine Land covered the evolution of conversion tracking in Google’s ad products years ago, and the fundamental tension between simplicity and accuracy has never fully resolved. Attribution is a convention your team agrees to, not a truth the data reveals.
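You can see "attribution is a convention" in a few lines. The path and channel names below are hypothetical, and these are the simple rule-based models rather than GA4's data-driven one, but the lesson is the same: one conversion, three models, three different stories.

```python
# One conversion path, ordered from first touch to last.
path = ["display", "organic", "paid_search"]

def credit(path, model):
    """Assign conversion credit under a given rule-based model."""
    if model == "last_click":
        return {path[-1]: 1.0}
    if model == "first_click":
        return {path[0]: 1.0}
    if model == "linear":
        # Assumes distinct channels per touchpoint, for simplicity
        return {ch: 1.0 / len(path) for ch in path}
    raise ValueError(f"unknown model: {model}")

for model in ("last_click", "first_click", "linear"):
    print(model, credit(path, model))
```

The conversion happened exactly once. Which channel "won" is a function of the rule you chose, which is why comparing models side by side is more honest than reporting any single one as the answer.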
Offline conversion gaps. For any business with a physical component, a sales team, or a phone-based close process, GA4 is measuring the top of a funnel it cannot see the bottom of. You can import offline conversions via the API, but most teams do not, and even those that do are working with imperfect matching logic. The gap between a GA4 “lead” and a closed sale is often where the most important commercial information lives, and it is almost entirely invisible in a standard GA4 setup.
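For teams that do close the loop, the usual route is GA4's Measurement Protocol. The sketch below builds the kind of JSON payload that API expects; the endpoint shape is real, but the event name, client ID, and values are placeholders, and the hard part in practice is the matching: the client_id must be the one captured on the website at lead time, or the offline sale never joins up with the online journey.

```python
import json

def offline_conversion_payload(client_id, value, currency="GBP"):
    """Build a Measurement Protocol payload for an offline sale.

    In practice this JSON is POSTed to
    https://www.google-analytics.com/mp/collect with your
    measurement_id and api_secret as query parameters. Everything
    below is a placeholder sketch, not a production implementation.
    """
    return {
        "client_id": client_id,     # must match the _ga value captured at lead time
        "events": [{
            "name": "closed_sale",  # hypothetical custom event name
            "params": {"value": value, "currency": currency},
        }],
    }

payload = offline_conversion_payload("123456.7654321", 4200.0)
print(json.dumps(payload, indent=2))
```

Notice how much of the commercial picture depends on that one `client_id` join key surviving the trip through your CRM. When it does not, the sale simply never existed as far as GA4 is concerned.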
Tag implementation quality. GA4 is only as good as its implementation. Duplicate events, missing parameters, inconsistent event naming, tags that fire on staging environments, conversion events that trigger on page load rather than on actual conversion completion. I have audited setups where the reported conversion rate was roughly three times the actual rate because a purchase event was firing on the order confirmation page load, including for users who had navigated back to it from their email receipt. The numbers looked fine. Nobody had checked.
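The confirmation-page bug above has a standard defence: deduplicate purchases by transaction ID, so a reload or an email revisit cannot count the same order twice. The event data here is invented; the pattern is the general one.

```python
# The bug described above: a purchase event fires every time the
# confirmation page loads, including on revisits. All data made up.
raw_events = [
    {"name": "purchase", "transaction_id": "T1001", "value": 49.0},
    {"name": "purchase", "transaction_id": "T1002", "value": 15.0},
    {"name": "purchase", "transaction_id": "T1001", "value": 49.0},  # page reload
    {"name": "purchase", "transaction_id": "T1001", "value": 49.0},  # email receipt revisit
]

def dedupe_purchases(events):
    """Keep at most one purchase event per transaction ID."""
    seen, clean = set(), []
    for e in events:
        if e["name"] == "purchase":
            if e["transaction_id"] in seen:
                continue  # this order has already been counted
            seen.add(e["transaction_id"])
        clean.append(e)
    return clean

clean = dedupe_purchases(raw_events)
print(len(raw_events), "->", len(clean))  # 4 -> 2
```

In a live setup the equivalent check belongs in the tag itself (fire on the actual purchase completion, carry a transaction ID), but running a check like this over exported data is also how you discover the inflation in the first place.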
Why Teams Stop Asking the Hard Questions
This is the part that interests me most, because it is a human problem, not a technical one.
Early in my career, I asked a managing director for budget to rebuild a website. The answer was no. Rather than accepting that, I taught myself to code and built it anyway. That instinct, to go around the obstacle and understand the thing properly rather than accept the surface-level answer, is exactly what good analytics practice requires. Most teams do not do it, not because they lack the capability, but because the dashboard looks complete and confidence is comfortable.
There is also an organisational dynamic at play. When you present data that shows strong performance, nobody pushes back. When you present data that reveals gaps or uncertainty, you create work for yourself and potentially undermine a narrative that senior stakeholders have already bought into. The incentive structure in most marketing teams rewards confident reporting over honest approximation. That is a management problem more than an analytics problem, but it manifests in the analytics.
I judged the Effie Awards for several years. The entries that impressed me most were not the ones with the most impressive-looking data. They were the ones where the team had clearly thought hard about what they could and could not measure, and had built a case from multiple data sources rather than a single dashboard. That kind of intellectual honesty is rarer than it should be, and it tends to correlate with genuinely effective marketing rather than just well-reported marketing.
The Semrush overview of data-driven marketing makes a point worth noting here: the value of data is not in the volume of it, but in the quality of the questions you ask of it. That is true, but it understates the prior problem, which is knowing what the data can and cannot answer in the first place.
What Honest Analytics Practice Actually Looks Like
I want to be clear that none of this is an argument against using GA4. It is an argument for using it with your eyes open.
When I was managing significant paid search budgets across multiple markets, the discipline that mattered most was not the sophistication of the reporting. It was the habit of asking, before acting on any number, “what would have to be true for this number to be wrong?” That question is uncomfortable. It slows things down. It occasionally reveals that a campaign you thought was performing well is actually benefiting from a tracking error. But it is the question that separates analytics as a business function from analytics as a comfort blanket.
Practically, honest analytics practice involves a few specific habits.
Triangulate across sources. GA4 should not be your only data source. Cross-reference it with your CRM, your ad platform data, your email platform reporting, and where possible your server-side logs. HubSpot’s email marketing reporting guide is a reasonable starting point for understanding how email attribution works alongside web analytics, and the same principle applies across channels. When sources disagree, that disagreement is information. Do not average it away.
Document your implementation assumptions. Every GA4 setup makes choices: what counts as a conversion, what attribution window applies, how sessions are defined, which events are tracked and which are not. Those choices should be written down and reviewed periodically. When you change something, note when you changed it and why. This sounds basic. Most teams do not do it, and six months later nobody can explain why the numbers look different from last year.
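The documentation does not need to be elaborate. Something as small as the sketch below, kept in version control next to your tagging code, answers the "why do the numbers look different from last year" question. Every field name and value here is illustrative, not a standard schema.

```python
# A minimal, machine-readable record of measurement assumptions.
# Field names and values are illustrative, not any official format.
measurement_assumptions = {
    "conversion_events": ["purchase", "demo_request"],
    "attribution_model": "data_driven",
    "attribution_window_days": 90,
    "session_timeout_minutes": 30,
    "last_reviewed": "2024-05-01",
    "changes": [
        {
            "date": "2024-03-12",
            "what": "added demo_request as a conversion event",
            "why": "sales team now treats demo requests as qualified leads",
        },
    ],
}
```

The value is the `changes` log as much as the current state: when a trend line bends, the first question should be whether the measurement changed before concluding that the business did.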
Know your consent rate. If you are operating under GDPR or equivalent, you should know what percentage of your users are consenting to analytics tracking. If you do not know this number, you do not know how representative your GA4 data is. This is not optional information.
Treat attribution as a lens, not a verdict. Run multiple attribution models and look at how the story changes. If your paid social looks terrible under last-click but reasonable under data-driven, that is worth understanding rather than resolving by picking the model that tells the best story. The variation between models is often more informative than any single model’s output.
Audit your implementation regularly. At minimum, once a year, go through your GA4 setup and verify that events are firing correctly, that conversion definitions still match your business objectives, and that there are no duplicate or erroneous events inflating your numbers. This is unglamorous work. It is also some of the highest-value work an analytics team can do.
The Alternatives Question
Some teams, particularly those with strong privacy commitments or those operating in markets where consent rates are very low, have moved away from GA4 entirely or use it alongside a privacy-first alternative. Moz has a useful roundup of GA4 alternatives that covers the main options without overselling any of them.
My view is that the tool matters less than the discipline. A team that uses GA4 rigorously, understands its limitations, and triangulates its findings will produce better commercial insight than a team that switches to a premium analytics platform but applies the same uncritical approach to a different dashboard. The problem is not usually the tool. It is the habit of treating the tool’s output as ground truth.
That said, there are legitimate reasons to consider alternatives. If your consent rates are very low, the modelling layer in GA4 is doing a lot of heavy lifting and you may be better served by a server-side or cookieless solution. If you need to do complex cohort analysis or attribution modelling that goes beyond GA4’s native capabilities, exporting to BigQuery or using a dedicated analytics platform may give you more flexibility. If data residency is a concern, there are EU-hosted alternatives worth evaluating.
For teams that use visualisation tools to surface GA4 data alongside other sources, Sprout Social’s overview of Tableau integrations gives a reasonable picture of how social data can be brought into a unified reporting view, which is the direction most mature analytics setups are moving in.
The Phrase Worth Keeping
I keep coming back to the LockPickingLawyer framing because it does something useful. It separates the question of whether a tool is broken from the question of whether your expectations of it are accurate. GA4 is not broken. It does exactly what it is supposed to do. The problem, when there is one, is almost always on the expectations side.
The most commercially damaging analytics failures I have seen were not caused by bad tools. They were caused by teams that had stopped asking what the tool was actually measuring. A campaign that appeared to be delivering strong return on ad spend because the attribution model was crediting it with conversions that would have happened anyway. A channel that looked like it was underperforming because its contribution was mostly happening in the dark, before the tracked touchpoints. A business that thought its customer acquisition cost was stable when it was actually rising, because the denominator in the calculation was inflated by modelled conversions.
None of those failures required a better tool. They required someone in the room willing to ask the uncomfortable question: does this number mean what we think it means?
There is more on building that kind of analytical rigour into your marketing practice across the Marketing Analytics section of The Marketing Juice, covering everything from attribution methodology to the practical mechanics of GA4 setup and reporting.
The padlock on your shed is fine. Just do not bet the house on it.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
