GA4 vs Universal Analytics: What Changed and Why It Matters
GA4 is not a new coat of paint on Universal Analytics. It is a fundamentally different measurement system, built on a different data model, with different defaults, different logic, and different implications for how you interpret your marketing performance. If you migrated to GA4 and simply recreated your old UA reports, you did the migration without doing the work.
The short version: Universal Analytics was session-based and cookie-dependent. GA4 is event-based and designed for a world where cookies are unreliable, users move across devices, and privacy regulation has reshaped what data you can collect. The reports look different because the underlying model is different, not because Google changed the interface for the sake of it.
Key Takeaways
- GA4 uses an event-based data model. Universal Analytics used sessions. This is not a cosmetic difference: it changes how almost every metric is calculated.
- Bounce rate in UA and engagement rate in GA4 are not the same metric. Treating them as equivalent will give you misleading trend data.
- GA4’s cross-device tracking and Google Signals integration make user-level analysis more reliable, but only if you configure them correctly.
- Custom event tracking in GA4 is more flexible than UA goals, but that flexibility requires discipline. Poorly defined events produce data that looks complete but tells you nothing.
- The migration window is closed. Universal Analytics data is gone. The only productive question now is whether you are using GA4 properly.
In This Article
- What Was Universal Analytics Actually Measuring?
- How GA4’s Event-Based Model Changes the Calculation
- Cross-Device Tracking and Why It Changes User Analysis
- Goals vs Conversions: A More Than Cosmetic Change
- The Reporting Interface and Why It Disorients People
- What GA4 Does Better Than UA (and Where It Still Falls Short)
- The Migration Is Done. The Work Is Not.
I want to be direct about something before we get into the mechanics. When Google announced the UA sunset, a lot of marketing teams treated the migration as a technical checkbox. Install the tag, recreate the goals, move on. I saw this pattern repeatedly across the agency work I was doing at the time, and it worried me then. It worries me more now, because those teams are sitting on GA4 data they do not fully trust and dashboards that are measuring the wrong things with the wrong logic. If any of that sounds familiar, this article is for you.
What Was Universal Analytics Actually Measuring?
Universal Analytics was built around the session. A session was a container, a defined window of activity on your site, typically ending after 30 minutes of inactivity. Everything was counted relative to sessions: pageviews per session, goal completions per session, bounce rate as a percentage of single-page sessions. The user was secondary. The session was the unit of measurement.
This model worked reasonably well in a world where most users browsed on a single device, cookies were reliable, and the web was relatively simple. It had real limitations even then. A user who visited your site, read a 4,000-word article, and left without clicking anything was counted as a bounce. A user who visited twice in the same day from the same browser generated two sessions but was not reliably identified as the same person across those sessions.
UA also relied heavily on third-party cookies for attribution and cross-session tracking. As browsers began restricting those cookies, the data quality degraded. Safari’s Intelligent Tracking Prevention, Firefox’s enhanced privacy defaults, and eventually Chrome’s own changes all chipped away at the reliability of UA data long before Google pulled the plug on the platform itself.
The metric that most marketers anchored to, bounce rate, was one of the most misunderstood numbers in digital analytics. A high bounce rate was treated as a problem. Sometimes it was. Often it was not. If someone found your contact page through a branded search, got your phone number, and called you, that was a successful visit that UA recorded as a bounce. The metric was a proxy for engagement that frequently measured the wrong thing.
How GA4’s Event-Based Model Changes the Calculation
GA4 scrapped the session-first model and replaced it with events. Every interaction is an event: a pageview, a scroll, a click, a video play, a form submission. Sessions still exist in GA4, but they are derived from events rather than being the primary container. The user is the central unit, not the session.
This matters because it changes what you can measure and how you measure it. In UA, you had to configure goals to track specific actions. In GA4, a much wider range of interactions are tracked automatically through enhanced measurement, and you can define custom events for anything else. The Moz team has written a useful breakdown of GA4 custom event tracking for SaaS products that illustrates how granular this can get when you configure it properly.
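If you want a sense of what "everything is an event" looks like in practice, here is a minimal sketch of building a custom event payload for GA4's Measurement Protocol, the server-side API for sending events directly. The measurement ID, API secret, and the `trial_signup` event name are all hypothetical placeholders, not values from this article; the payload shape (a `client_id` plus an `events` array) follows the Measurement Protocol format.

```python
import json

# Hypothetical credentials: substitute your own measurement ID and API secret.
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your-api-secret"
ENDPOINT = (
    "https://www.google-analytics.com/mp/collect"
    f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}"
)

def build_event_payload(client_id, event_name, params):
    """Assemble a GA4 Measurement Protocol payload for one custom event."""
    return {
        "client_id": client_id,
        "events": [{"name": event_name, "params": params}],
    }

payload = build_event_payload(
    client_id="555.1234567890",
    event_name="trial_signup",  # hypothetical custom event name
    params={"plan": "pro", "source": "pricing_page"},
)

# Sending it is one HTTP POST of json.dumps(payload) to ENDPOINT.
print(json.dumps(payload, indent=2))
```

The point of the sketch is the shape: in GA4 a custom event is just a name and a bag of parameters, which is exactly why a disciplined naming convention matters so much.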
The replacement for bounce rate is engagement rate. An engaged session in GA4 is one that lasts longer than 10 seconds, includes a conversion event, or includes two or more pageviews. Engagement rate is the percentage of sessions that meet at least one of those criteria. This is a more nuanced measure of quality than bounce rate, but it is not directly comparable to UA’s bounce rate. If you are looking at historical comparisons and treating these metrics as equivalent, your trend analysis is wrong.
The average time on page metric has also changed significantly between UA and GA4. In UA, time on page was calculated from the gap between the timestamps of successive pageviews. If a user visited one page and left, UA recorded zero time on that page because there was no subsequent pageview to timestamp against. GA4 instead measures engagement time through the event model, so even single-page visits can register time. The numbers will not match, and they should not be expected to.
If you want to go deeper on the full analytics landscape, including how GA4 fits into a broader measurement stack, the Marketing Analytics hub covers attribution, data quality, and measurement strategy across the topics that matter most for commercial marketing teams.
Cross-Device Tracking and Why It Changes User Analysis
One of the genuine improvements in GA4 is its approach to cross-device and cross-platform tracking. Universal Analytics was largely device-specific. If a user visited your site on their phone on Monday and converted on their laptop on Thursday, UA typically counted those as two separate users. The attribution was incomplete, and the user experience was fragmented.
GA4 uses a hierarchy of identity signals to stitch these journeys together. If a user is logged into a Google account, GA4 can use that User ID to connect activity across devices. Google Signals extends this further by using data from users who have opted into ads personalisation. Device ID and modelled data fill in the gaps where neither of those signals is available.
This is a meaningful improvement for any business where the customer experience spans multiple touchpoints and devices. In agency work, I spent years trying to explain to clients why their analytics showed a conversion path that made no sense, because UA was showing them fragments of an experience rather than the whole thing. The cross-device capabilities in GA4 do not solve this problem entirely, but they get closer to reality than UA ever did.
The caveat is that Google Signals requires configuration and has data thresholds. If your traffic volumes are relatively low, GA4 may apply thresholds that limit reporting granularity to protect user privacy. You will see this as “(other)” rows in your reports. It is not a bug. It is the privacy model working as intended. You need to account for it in your analysis.
Goals vs Conversions: A More Than Cosmetic Change
In Universal Analytics, you configured Goals. A goal was a specific action you wanted to track: a destination URL, a session duration, a pageview count, or an event. UA allowed a maximum of 20 goals per reporting view. Once you hit that limit, you had to decide which goals to retire or create a new view.
GA4 replaced goals with Conversions, which are simply events that you have marked as important. Because GA4 is event-based, any event can be designated as a conversion, promoted or demoted at will, and the cap is far higher than UA's 20-goal bottleneck. This gives you significantly more flexibility, but flexibility without structure creates noise.
I have seen GA4 setups where teams have marked 40 or 50 events as conversions because they could. The result is a conversion report that is impossible to interpret because every minor interaction is weighted equally alongside actual business outcomes. The tool gave them more rope and they used it to tie themselves in knots.
The discipline required here is the same discipline that good measurement always requires: decide what a conversion means for your business before you configure your analytics. A conversion should represent a meaningful step toward revenue or a clearly defined business outcome. If you are marking every button click as a conversion, you are producing data volume, not data quality.
The Reporting Interface and Why It Disorients People
A significant part of the frustration with GA4 is not the data model. It is the interface. Universal Analytics had a left-hand navigation that most digital marketers had spent years working with. Audience reports, Acquisition reports, Behaviour reports, Conversions. The structure was familiar even if the underlying metrics were imperfect.
GA4 ships with a set of standard reports that do not map directly to UA’s structure. The Explore section, which is where you build custom analyses, is more powerful than anything in UA, but it has a steeper learning curve. Many teams got to GA4, could not find the report they were used to, and concluded that the platform was broken rather than different.
The Semrush guide on setting up Google Analytics correctly is worth reading if you are still in the configuration phase. Getting the property structure, data streams, and reporting identity right at setup prevents a lot of downstream confusion.
For teams that want to understand what users are actually doing on their site beyond what GA4 captures, pairing GA4 with a behavioural analytics tool adds a layer of qualitative context. Hotjar’s overview of how it complements Google Analytics explains the logic well. GA4 tells you what happened. Behavioural tools help you understand why.
Crazy Egg has also published a useful breakdown of Google Analytics features and limitations that is worth bookmarking if you are building out your analytics stack and want a clear-eyed view of where GA4 ends and other tools begin.
What GA4 Does Better Than UA (and Where It Still Falls Short)
GA4 is a better measurement platform than Universal Analytics for most modern marketing use cases. That is not a diplomatic position. It is a practical one. The event-based model is more flexible. The cross-device tracking is more accurate. The integration with BigQuery for raw data export is available on free properties, which was not the case with UA. The predictive metrics, purchase probability and churn probability, give you signals that UA simply could not generate.
The content strategy applications are also stronger. Moz has a good piece on using GA4 data to inform content decisions that shows how the event model surfaces engagement signals that UA’s pageview-centric model obscured.
Where GA4 still frustrates experienced analysts is in the standard reporting layer. The default reports are designed for a broad audience and do not always surface the dimensions and metrics a specialist needs quickly. You end up spending more time in Explore, which is powerful but slower for routine reporting. Teams that have built Looker Studio dashboards connected to GA4 have largely solved this, but it requires an investment that smaller teams may not have made.
Data sampling is also a live issue. GA4 applies sampling in Explore reports when data volumes are high, which can produce estimates rather than exact counts. This is not new to Google Analytics (UA had the same issue), but it catches people off guard when they are building custom analyses and the numbers do not reconcile with other data sources.
The comparison between GA4 and alternatives like Heap is worth understanding if you are evaluating your analytics stack. Crazy Egg’s comparison of Heap and Google Analytics gives a fair account of where the free tool ends and where paid alternatives start to earn their cost.
The Migration Is Done. The Work Is Not.
Universal Analytics stopped processing data in July 2023 for standard properties. The historical data was available for a period after that, but it is now gone. There is no going back, and there is no meaningful value in continuing to compare GA4 data against UA data as if they are measuring the same things. They are not.
The productive question is not whether GA4 is better or worse than UA in the abstract. It is whether your GA4 implementation is set up to give you reliable data, whether your team understands what the metrics mean, and whether the reports you are looking at are actually connected to your business objectives.
Early in my career, when I was building my first website because the MD would not give me budget to hire someone, I learned something that has stayed with me for 25 years: the tool is only as useful as your understanding of what it is doing. I taught myself to code not because I wanted to be a developer, but because I wanted to understand what I was building. The same logic applies to analytics. If you do not understand what GA4 is measuring and how it is measuring it, you are not doing analytics. You are looking at numbers and hoping they mean something.
The teams getting the most from GA4 right now are the ones who audited their event taxonomy, defined their conversions with commercial logic rather than technical convenience, connected GA4 to BigQuery for unsampled data, and built reporting layers in Looker Studio that surface the metrics their business actually cares about. That is not a complicated list. It is just disciplined work.
The Marketing Analytics hub at The Marketing Juice covers the measurement topics that sit around and beyond GA4, from attribution models to data quality to the metrics that belong on a senior marketer’s dashboard. If you are rethinking your measurement approach, it is a useful place to start.
The Hotjar resource on combining Hotjar with Google Analytics is also worth revisiting if you are trying to build a more complete picture of user behaviour than GA4 alone provides. Quantitative and qualitative data answer different questions. Both matter.
About the Author
Keith Lacy is a marketing strategist and former agency CEO with 20+ years of experience across agency leadership, performance marketing, and commercial strategy. He writes The Marketing Juice to cut through the noise and share what works.
