Clickport

Everything Wrong with GA4's Engagement Overview (With Data)

Article contents
  1. What GA4's engagement overview actually shows you
  2. The 10-second threshold that breaks everything
  3. Bounce rate is not bounce rate anymore
  4. Average engagement time underreports by up to 80%
  5. Scroll tracking that tells you nothing
  6. What engagement data should actually look like
  7. The hidden problem: bots inflate your engagement metrics
  8. What to measure instead of GA4's engagement rate
  9. FAQ

GA4's engagement rate is the most misleading metric in modern analytics. It takes a binary 10-second threshold, inverts it, and calls the result "engagement." A visitor who stares at your loading spinner for 11 seconds and leaves is "engaged." A visitor who reads your entire article in 8 seconds flat is not. The metric measures clock time, not reading.

I spent weeks digging into how GA4's engagement metrics actually work: what they measure, what they miss, and what the data says about their accuracy. The findings are worse than I expected.

Key Takeaways
  • GA4 classifies a session as "engaged" if it lasts 10 seconds, has 2+ pageviews, or triggers a conversion. A visitor who stares at a loading screen for 11 seconds counts as engaged.
  • A 28-day comparison study found GA4 underreports average engagement time by 54.7% on average, with some pages off by 80.2%. The cause: non-engaged sessions contribute zero time but inflate the denominator.
  • One site showed 86% bounce rate in Universal Analytics and 24% in GA4 for the same traffic in the same period. Google redefined the formula, not the name.
  • GA4's built-in scroll tracking fires a single event at 90% depth. No 25%, 50%, or 75% data. Fixing it requires Google Tag Manager, a multi-tool setup most site owners never complete.
  • Engagement rate is calculated from a mix of human and bot traffic. With 51% of web traffic automated, GA4 users without bot filtering are measuring engagement across sessions that never had a human behind them.

What GA4's engagement overview actually shows you

The Engagement overview report in GA4 is the first thing most people check when they want to know if their content is working. It shows four summary cards: average engagement time, views, event count, and user activity over time.

The problem starts with the headline metric. "Average engagement time" sounds like it answers "how long do visitors spend on my site?" It does not. GA4 divides total engagement time by active users, not sessions. A user who visits 5 times in a week with 2 minutes of engagement each time shows 10 minutes of average engagement time, not 2 minutes. The per-session metric exists ("average engagement time per session") but it is buried in Explorations, not shown in the default report.

THE DENOMINATOR TRAP
  • What you think it means: total engagement time ÷ total sessions = time per visit.
  • What GA4 actually calculates: total engagement time ÷ active users = time per person, all visits combined.
As your date range widens, the number inflates because returning users accumulate more total time while still counting as 1 active user. A 7-day view and a 30-day view will show dramatically different "average engagement times" for the same audience.
Source: Google Analytics Help: Engagement metrics

Most people comparing GA4 to Universal Analytics expect a per-session metric, since UA's "Avg. Session Duration" was total duration divided by sessions. GA4's default reports prominently show the per-user version instead, a distinction that catches many practitioners off guard.
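The denominator difference is easiest to see in code. Here is a minimal sketch, using the five-visit example from above; the session log shape is invented for illustration, not a GA4 export format:

```javascript
// Hypothetical session log: one user, five visits, 120s of engagement each.
const sessions = [
  { userId: "u1", engagementSec: 120 },
  { userId: "u1", engagementSec: 120 },
  { userId: "u1", engagementSec: 120 },
  { userId: "u1", engagementSec: 120 },
  { userId: "u1", engagementSec: 120 },
];

const totalSec = sessions.reduce((sum, s) => sum + s.engagementSec, 0);
const activeUsers = new Set(sessions.map((s) => s.userId)).size;

// What the default report shows: total time / active users.
const avgPerUser = totalSec / activeUsers;        // 600s = 10 minutes
// What most people expect: total time / sessions.
const avgPerSession = totalSec / sessions.length; // 120s = 2 minutes
```

Same data, a 5x gap, purely from the choice of denominator.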

The rest of the overview report is not much better. "Views" counts repeated views of the same page separately, so a single user refreshing a page inflates the number. "Event count" lumps auto-collected events (page_view, session_start, user_engagement) with real interactions, making the total meaningless without filtering. And "user activity over time" is just a line chart of active users, which tells you nothing about engagement quality.

The 10-second threshold that breaks everything

GA4 classifies a session as "engaged" if it meets any one of three criteria: lasted longer than 10 seconds, had 2 or more pageviews, or triggered a key event (conversion). That is the bar.
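The whole rule fits in one boolean expression. This is a sketch of the classification as described above; the function name and input shape are illustrative, not a GA4 API:

```javascript
// GA4's engaged-session rule: any one of three criteria is enough.
function isEngagedSession({ durationSec, pageviews, keyEvents }) {
  return durationSec > 10 || pageviews >= 2 || keyEvents > 0;
}

// 11 seconds staring at a loading spinner: engaged.
isEngagedSession({ durationSec: 11, pageviews: 1, keyEvents: 0 }); // true

// 8 seconds, found the phone number, called, became a customer: not engaged.
isEngagedSession({ durationSec: 8, pageviews: 1, keyEvents: 0 });  // false
```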

Think about what this means in practice. A visitor lands on your page. Your site takes 6 seconds to render because of heavy JavaScript. They look at the page for 5 seconds, decide it is not what they wanted, and leave. Total time: 11 seconds. GA4 verdict: engaged.

Meanwhile, someone searches for your phone number, finds it in 8 seconds, calls you, and becomes a customer. GA4 verdict: not engaged. Single pageview, under 10 seconds, no conversion event tracked.

TWO VISITORS. TWO VERDICTS.
  • Visitor A, "Engaged": landed on the homepage, page took 6s to load, looked around for 5s, left disappointed. Total: 11 seconds. ✓
  • Visitor B, "Not Engaged": landed on the contact page, found the phone number, called and became a customer, left satisfied. Total: 8 seconds. ✗
GA4's "engaged session" threshold is 10 seconds. It does not measure reading, scrolling, clicking, or any form of actual engagement. It measures whether the session clock exceeded a fixed number.

Julius Fedorovicius, founder of Analytics Mania, put it simply: "What if I land on your website, stay for 11 seconds on a page doing nothing, and then leave? Would you consider me an engaged visitor?" He recommends increasing the threshold to at least 30 seconds because "staying 10 seconds on a page is not enough to consider this as an engagement."

You can change the threshold. GA4 lets you adjust it from 10 to 60 seconds in the tag settings. But most users never do because most users do not know it exists. And even at 60 seconds, you are still measuring clock time, not engagement.

The deeper problem is that 10 seconds is context-dependent. A 10-second visit to a recipe page where you snap a photo of the ingredients is a success. A 10-second visit to a pricing page is a failure. GA4 treats both identically.

There is no published explanation of why Google chose 10 seconds as the default; the rationale appears nowhere in their documentation. The threshold was simply declared, and the entire metric chain builds on top of it: engagement rate, bounce rate, average engagement time. All of them inherit this arbitrary foundation.

Bounce rate is not bounce rate anymore

In Universal Analytics, bounce rate was simple: the percentage of sessions where a user viewed only one page and triggered no interaction events. Read an article for 12 minutes and left? Bounce. Found the phone number in 3 seconds and called? Also a bounce. It was a blunt metric, but at least everyone agreed on what it meant.

In GA4, bounce rate is the exact inverse of engagement rate. Bounce rate = 100% - Engagement rate. A session is a "bounce" only if it lasted under 10 seconds AND had fewer than 2 pageviews AND triggered no key event. Same name. Completely different formula.

Google removed bounce rate from GA4 entirely when it launched in October 2020. After nearly two years of backlash, they brought it back in July 2022, but with the new definition. By the time Universal Analytics was sunset in July 2023, the old bounce rate was gone for good.

The result is chaos. Root & Branch Group documented a direct comparison on their own site for the first half of 2023: Universal Analytics reported an 86.39% bounce rate. GA4 reported 23.79% for the exact same traffic in the exact same period. That is not a rounding error. It is a 62-point gap caused by a redefined formula wearing the same name.
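You can reproduce a gap like this with two toy sessions. This sketch encodes both definitions as predicates; the session shape is invented for illustration:

```javascript
// UA: single pageview with no interaction event = bounce, regardless of time.
const uaBounce = (s) => s.pageviews === 1 && s.interactionEvents === 0;

// GA4: bounce only if under 10s AND fewer than 2 pageviews AND no key event.
const ga4Bounce = (s) =>
  s.durationSec <= 10 && s.pageviews < 2 && s.keyEvents === 0;

const sessions = [
  // 12-minute read of a single article, no tracked clicks.
  { durationSec: 720, pageviews: 1, interactionEvents: 0, keyEvents: 0 },
  // Found the phone number in 3 seconds and left.
  { durationSec: 3, pageviews: 1, interactionEvents: 0, keyEvents: 0 },
];

const bounceRate = (list, isBounce) =>
  (100 * list.filter(isBounce).length) / list.length;

bounceRate(sessions, uaBounce);  // 100: both count as UA bounces
bounceRate(sessions, ga4Bounce); // 50: only the 3-second visit counts
```

Same traffic, a 50-point gap, purely from the definition.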

SAME SITE. SAME TRAFFIC. SAME PERIOD.
  • Universal Analytics: 86% bounce rate
  • GA4: 24% bounce rate
The content did not improve. The traffic did not change. Google redefined the formula but kept the name. For content-heavy sites like blogs, the gap can exceed 60 percentage points.
Source: Root & Branch Group, H1 2023 data

Even the UK government noticed. GOV.UK's data documentation states bluntly: "Neither UA nor GA4 data is 'correct'. The two datasets are simply very different."

The practical damage is real. Agencies that compared year-over-year bounce rates during the UA-to-GA4 migration saw clients celebrate "improvements" that were actually just definitional changes. Marketing teams adjusted strategies based on numbers that had no real-world basis. And anyone who tries to compare old reports to new ones is looking at two entirely different metrics pretending to be the same thing.

Average engagement time underreports by up to 80%

Plausible Analytics ran a 28-day controlled comparison in March-April 2025, measuring the same traffic on the same site with both GA4 and their own time-on-page tracking. The results were stark.

GA4 underreported average engagement time by 54.7% on average. On individual pages, the gap reached 80.2%. One page showed 16 seconds in GA4 versus 1 minute 21 seconds in Plausible's measurement. Another showed 34 seconds versus 2 minutes 33 seconds.

GA4 VS ACTUAL ENGAGEMENT TIME (28-DAY STUDY)
  • /zucchini-fritters: GA4 0:34 vs actual 2:33 (-78%)
  • /vegan-gnocchi: GA4 0:16 vs actual 1:21 (-80%)
  • /corn-tortillas: GA4 0:36 vs actual 1:11 (-49%)
Source: Plausible Analytics, 28-day comparison study, April 2025

The root cause is the denominator problem again. GA4 calculates average engagement time per session by dividing total engagement time by all sessions, including non-engaged ones that contributed zero time. In the Plausible study, approximately 50% of sessions were classified as non-engaged. Those sessions received zero engagement time in the calculation, but they were still counted in the denominator. That mechanically halves the average.
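The mechanical halving is plain arithmetic. A sketch with made-up round numbers matching the study's ~50% non-engaged share:

```javascript
// 500 engaged sessions averaging 120s each; 500 non-engaged sessions
// that contribute zero time but still sit in the denominator.
const engaged = Array(500).fill(120);
const nonEngaged = Array(500).fill(0);

const totalSec = engaged.reduce((a, b) => a + b, 0);

const avgEngagedOnly = totalSec / engaged.length;                       // 120s
const avgAllSessions = totalSec / (engaged.length + nonEngaged.length); //  60s
```

Real reading time per engaged visit is two minutes; the reported average is one.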

The data pipeline compounds the problem. GA4 accumulates engagement time locally in the browser and sends it as engagement_time_msec attached to the next event that fires. If no subsequent event fires (a single-page visit where the user simply closes the tab), that engagement time may be lost entirely. GA4 relies on navigator.sendBeacon() during page unload to capture the final slice of time. As David Vallejo documented, research has found approximately 40% discrepancies in data delivery when using sendBeacon with unload events. Mobile Safari is particularly unreliable because iOS can kill pages without firing any unload events at all.

The Google Analytics Community forums are full of people reporting engagement time stuck at 0 seconds even when users are clearly navigating multiple pages. One thread captures the contradiction perfectly: "engagement time shows 0s since converted to GA4 but it shows users go to various website pages."

If your content strategy depends on knowing how long people actually spend reading your pages, GA4's default metrics are not giving you the answer.

Scroll tracking that tells you nothing

GA4's built-in scroll tracking fires a single event when a visitor reaches 90% of the page depth. One event. One threshold. One time per page. That is the entire built-in scroll tracking feature.

This is binary data. Either they reached 90% or they did not. You cannot tell the difference between a visitor who read 85% of your article (deep engagement) and one who bounced after the headline (zero engagement). Both produce the same result: no scroll event.

For a 300-word landing page, 90% is nearly meaningless. Almost everyone scrolls to the bottom of a short page. For a 5,000-word guide, reaching 90% represents genuine deep engagement. GA4 treats both identically because the threshold does not adjust for page length.

GA4 SCROLL DATA VS GRANULAR SCROLL DATA
GA4's single threshold ("90% reached?"):
  • /blog/seo-guide: yes
  • /pricing: yes
  • /features: no
That is all the data you get.
Granular scroll depth for /blog/seo-guide:
  • 25% depth reached by 84% of visitors
  • 50% depth reached by 68%
  • 75% depth reached by 52%
  • 90% depth reached by 31%
Now you know where readers stop.

The drop-off curve between thresholds is where the real insight lives. If 84% of visitors reach 25% but only 52% reach 75%, you know readers are losing interest in the middle of the article. That is actionable. GA4's binary "yes/no at 90%" tells you nothing about where the problem is.
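Computing that drop-off curve from per-visitor max scroll depths takes a few lines. A minimal sketch, with made-up sample depths:

```javascript
// Given each visitor's maximum scroll depth (0-100), compute the share
// of visitors who reached each threshold.
const THRESHOLDS = [25, 50, 75, 90];

function scrollDistribution(maxDepths) {
  const dist = {};
  for (const t of THRESHOLDS) {
    const reached = maxDepths.filter((d) => d >= t).length;
    dist[t] = Math.round((100 * reached) / maxDepths.length);
  }
  return dist;
}

scrollDistribution([95, 80, 55, 30, 10]);
// → { 25: 80, 50: 60, 75: 40, 90: 20 }
```

The shape of the resulting curve, not any single threshold, is what tells you where readers leave.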

You can set up granular scroll tracking in GA4, but it requires Google Tag Manager. The process involves creating a scroll depth trigger, building a GA4 event tag, registering a custom dimension in GA4's admin, and waiting 24-48 hours for data. That is a multi-tool, multi-step process that most site owners never complete. For anyone without GTM expertise, the binary 90% event is all they have.

Chartbeat's research across 2 billion page visits found that scroll behavior varies dramatically by content type: blog posts average 40-60% completion at 75% depth, while landing pages see 60-70% reaching the bottom because they are short. A fixed 90% threshold applied uniformly across all content types is not scroll tracking. It is a checkbox.

What engagement data should actually look like

The core problem with GA4's engagement metrics is architectural. They were built as a binary classification system (engaged/not engaged) rather than a measurement system. A good engagement metric should tell you how engaged someone was, not just whether they crossed a threshold.

Here is what actually matters for understanding content performance:

A composite score, not a binary flag. Engagement is not a yes/no question. It is a spectrum. Combining scroll depth and time on page into a single 0-100 score gives you a number you can sort by, compare across sources, and track over time. "72% engagement" is more useful than "engaged: true."
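One possible shape for such a score, weighting scroll depth and active time equally and capping time at a "full read" budget. The weights, the cap, and the 180-second budget are assumptions for illustration, not a published formula:

```javascript
// Composite 0-100 engagement score from scroll depth and active time.
// fullReadSec is an assumed per-page reading budget.
function engagementScore({ scrollPct, activeSec, fullReadSec = 180 }) {
  const timePart = Math.min(activeSec / fullReadSec, 1) * 100;
  return Math.round(0.5 * scrollPct + 0.5 * timePart);
}

engagementScore({ scrollPct: 80, activeSec: 90 }); // 65: a solid read
engagementScore({ scrollPct: 10, activeSec: 9 });  //  8: a quick bounce
```

Unlike a binary flag, the score sorts: a 65 and an 8 are visibly different, where GA4 might call both "engaged."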

Per-source engagement, visible by default. The most valuable question is not "what is my overall engagement rate?" but "which sources bring readers, and which bring clickers?" Seeing that newsletter subscribers engage at 87% while social traffic engages at 32% changes your content distribution strategy immediately. That data should be visible at a glance in your sources panel, not buried in Explorations.

Real reading time that pauses when the tab is hidden. GA4 claims to do this, but the implementation leaks time through multi-monitor scenarios (the page stays "visible" on a second screen while the user works elsewhere) and loses time through sendBeacon failures on page exit. A clean implementation listens to both visibilitychange and blur/focus events, accumulates time only when the tab is both visible and focused, and caps any single session's accumulated time at 30 minutes so forgotten tabs cannot warp the averages.
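The accumulator at the heart of that approach is plain logic; only the event wiring (document.addEventListener for visibilitychange, window blur/focus) is browser-specific. A sketch of the accumulator, with the class name and shape invented for illustration:

```javascript
// Reading timer that only accumulates while the tab is both visible and
// focused, capped at 30 minutes. Timestamps are in milliseconds.
const CAP_MS = 30 * 60 * 1000;

class ReadingTimer {
  constructor(now) {
    this.totalMs = 0;
    this.visible = true;
    this.focused = true;
    this.since = now;
  }
  _flush(now) {
    // Credit elapsed time only if the tab was visible AND focused.
    if (this.visible && this.focused) {
      this.totalMs = Math.min(this.totalMs + (now - this.since), CAP_MS);
    }
    this.since = now;
  }
  setVisible(v, now) { this._flush(now); this.visible = v; }
  setFocused(f, now) { this._flush(now); this.focused = f; }
  readingMs(now)     { this._flush(now); return this.totalMs; }
}

// 10s reading, 5s hidden (user switched windows), then 5s reading again.
const timer = new ReadingTimer(0);
timer.setVisible(false, 10000);
timer.setVisible(true, 15000);
timer.readingMs(20000); // 15000: the hidden 5 seconds did not count
```

In a browser you would call setVisible from a visibilitychange listener and setFocused from blur/focus listeners, then flush the total on pagehide.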

ENGAGEMENT BY SOURCE: WHAT YOUR DASHBOARD SHOULD SHOW
Overall engagement: 72%
  • Email / Newsletter: 50 visitors, 87% engaged
  • Referral: 97 visitors, 81% engaged
  • Organic Search: 542 visitors, 72% engaged
  • Social: 400 visitors, 32% engaged
  • Direct / None: 164 visitors, 24% engaged
Social sends 8x more visitors than email, but email readers engage at nearly 3x the rate. This distinction is invisible in GA4's default reports.

A bounce rate that distinguishes reading from bouncing. GA4's bounce rate is just the inverse of engagement rate, so it inherits the same 10-second problem. A better approach uses multiple criteria: a visit is only a bounce when it has a single pageview AND no clicks AND less than 25% scroll AND under 15 seconds on page. Read the whole article in one sitting? Not a bounce. Scroll halfway and leave? Not a bounce. Land and immediately leave? Bounce. That is what the word actually means.
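As a predicate, that multi-criteria bounce looks like this. The cutoffs (25% scroll, 15 seconds) come from the text above; the function shape is illustrative:

```javascript
// A visit is a bounce only when ALL four conditions hold.
function isBounce({ pageviews, clicks, scrollPct, seconds }) {
  return pageviews === 1 && clicks === 0 && scrollPct < 25 && seconds < 15;
}

// Read the whole article in one sitting: not a bounce.
isBounce({ pageviews: 1, clicks: 0, scrollPct: 95, seconds: 240 }); // false
// Scroll halfway and leave: not a bounce.
isBounce({ pageviews: 1, clicks: 0, scrollPct: 50, seconds: 40 });  // false
// Land and immediately leave: bounce.
isBounce({ pageviews: 1, clicks: 0, scrollPct: 5, seconds: 4 });    // true
```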

Copy detection. When someone selects and copies text from your page, that is the clearest signal they found something valuable. No click, no scroll threshold, no timer. They physically selected your words because they wanted to keep them. Clickport tracks this automatically. No other analytics tool does.

The hidden problem: bots inflate your engagement metrics

This is the problem nobody talks about when discussing GA4's engagement metrics. Your engagement rate is calculated from a mix of human sessions and bot sessions. Without bot filtering, the denominator is polluted.

The Imperva 2025 Bad Bot Report found that 51% of all web traffic in 2024 was automated. Bad bots accounted for 37%, up from 32% the year before. That is the sixth consecutive year of growth, driven by AI lowering the barrier to building scrapers.

GA4 uses a single mechanism for bot detection: the IAB/ABC International Spiders and Bots List. If a bot identifies itself, GA4 excludes it. If it spoofs a normal browser user-agent (which most modern bots do), GA4 counts it as a real visitor.
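The weakness of list-based detection is easy to demonstrate. This sketch is in the spirit of a user-agent list check; the patterns are a tiny illustrative sample, not the real IAB list:

```javascript
// Naive list-based bot check: it only catches bots that announce
// themselves in the user-agent string.
const KNOWN_BOT_PATTERNS = [/bot/i, /crawler/i, /spider/i, /headless/i];

const looksLikeKnownBot = (userAgent) =>
  KNOWN_BOT_PATTERNS.some((re) => re.test(userAgent));

// A polite bot that identifies itself gets excluded:
looksLikeKnownBot("Googlebot/2.1 (+http://www.google.com/bot.html)"); // true

// A scraper spoofing Chrome sails straight through:
looksLikeKnownBot("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0");      // false
```

Real filtering needs behavioral signals (session duration, interaction, request patterns), not just a self-reported name.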

Here is what that does to your engagement metrics:

ENGAGEMENT RATE: WITH BOTS VS WITHOUT BOTS
  • With bot traffic (GA4 default): 48% engagement rate across 1,000 total sessions, with 200 bot sessions at 0% engagement diluting the average.
  • With bot traffic filtered: 60% engagement rate across the 800 human sessions, with bot sessions removed before calculation.

Bot sessions are the worst possible input for engagement calculations. They last 0-1 seconds. They scroll 0%. They click nothing. They are universally "not engaged" by any definition. Every bot session in your data drags down your engagement rate and inflates your bounce rate, making your content look worse than it actually performs.

The KISSmetrics 2024 analysis found that most GA4 implementations have 10-30% data gaps from ad blockers, consent mode, and configuration errors. Layer bot contamination on top of that, and the engagement numbers GA4 reports are an approximation of an approximation.

A real engagement measurement system filters bots before calculating metrics, not after. When bot sessions are excluded at ingestion time, your engagement rate reflects only human behavior. The difference is typically 10-20 percentage points, enough to change strategic decisions about which content and channels are working.
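The 48%-versus-60% example above reduces to a filter before a division. A sketch reproducing those illustrative figures:

```javascript
// 1,000 sessions: 480 engaged humans, 320 non-engaged humans, 200 bots.
const sessions = [
  ...Array(480).fill({ engaged: true,  bot: false }),
  ...Array(320).fill({ engaged: false, bot: false }),
  ...Array(200).fill({ engaged: false, bot: true }),
];

const engagementRate = (list) =>
  (100 * list.filter((s) => s.engaged).length) / list.length;

const withBots = engagementRate(sessions);                         // 48
const humansOnly = engagementRate(sessions.filter((s) => !s.bot)); // 60
```

Filtering at ingestion, before the division, is what moves the number by 12 points here.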

What to measure instead of GA4's engagement rate

If GA4's engagement metrics are unreliable, what should you actually track? The answer is not "a better engagement rate." It is a different framework entirely.

Scroll depth distribution, not a binary threshold. Track what percentage of visitors reach 25%, 50%, 75%, and 100% of each page. The shape of this curve tells you where your content loses readers. A steep drop between 25% and 50% means your intro promised something the body did not deliver. A gradual decline to 60-70% means the content works but could be tighter. Pages where users scroll past 50% convert at roughly double the rate of pages where they do not, but only when combined with active engagement signals like clicks and interactions.

Real time on page, per source. Not "average engagement time per active user." Not "average engagement time per session diluted by zero-time non-engaged sessions." Actual time the tab was visible and focused, per page, broken down by traffic source. When you see that organic search visitors spend 3:42 reading your guide but social visitors spend 0:48, you know where to invest.

Interaction signals beyond the clock. Clicks, form submissions, copy events, outbound link clicks. These are concrete actions that indicate value. A visitor who copies your recipe ingredients is more engaged than one who scrolled 90% at high speed. A visitor who clicked your CTA is more engaged than one who sat on the page for 2 minutes. Time and scroll are inputs. Actions are outcomes.

When high engagement is a warning sign. High engagement on a 404 page means your error page is confusing. High engagement on a checkout page means the process is complicated. High time on a FAQ page means people cannot find answers. Context matters more than the number. Any engagement metric that does not let you filter by page type is giving you a false sense of performance.

Engagement ≠ Engagement Rate
A meaningful engagement metric combines scroll depth, reading time, and interaction signals into a single score that varies from 0 to 100. It shows up on every panel. It breaks down by source. And it only counts human sessions.

GA4's engagement rate is a binary threshold dressed up as a percentage.

The difference between "62% engagement rate" and "engagement score: 72 from organic, 34 from social, 87 from email" is the difference between a vanity metric and a decision-making tool. The first tells you almost nothing. The second tells you exactly where to focus.

If you want to see what your actual engagement looks like, stripped of bot traffic, measured continuously instead of at a threshold, and broken down by every source and page: start a free trial. Two minutes to set up. No credit card. No Tag Manager. No waiting 48 hours for data.

FAQ

What is a good engagement rate in GA4?

Databox benchmarks show a median GA4 engagement rate of 56% across industries, ranging from 52% (SaaS, consulting) to 64% (e-commerce). But these numbers are inflated by the low 10-second threshold. A 56% engagement rate means 56% of sessions lasted longer than 10 seconds, which tells you very little about whether people actually engaged with your content.

What is the difference between engagement rate and bounce rate in GA4?

They are the exact same metric, inverted. Bounce rate = 100% minus engagement rate. A 60% engagement rate means a 40% bounce rate. Both are calculated from the same 10-second threshold, so neither adds independent information. In Universal Analytics, bounce rate measured single-page sessions regardless of time. In GA4, it measures sessions under 10 seconds with no second pageview and no conversion.

Why is my GA4 average engagement time 0?

GA4 reports zero engagement time when a session is classified as "not engaged" (under 10 seconds, single pageview, no conversion). Those sessions receive zero engagement time in calculations even if the visitor spent 7-8 seconds on the page. Common causes: misconfigured tracking, consent management platforms fragmenting sessions, Cloudflare Zaraz integration bugs, or simply having a lot of short-but-real visits.

Can I change the 10-second threshold in GA4?

Yes. Go to Admin, then Data Streams, select your web stream, click Configure Tag Settings, then Show More, and look for "Adjust timer for engaged sessions." You can raise the engaged-session timer from 10 up to 60 seconds. But most users never find this setting, and historical data does not retroactively update when you change it.

Does GA4 track scroll depth?

Only at a single threshold. GA4's Enhanced Measurement fires one scroll event when a visitor reaches 90% of the page. No intermediate data (25%, 50%, 75%). No average scroll depth. To get granular scroll tracking, you need Google Tag Manager, a separate product that requires its own setup, triggers, tags, and custom dimension registration. Most GA4 users never set this up.

Why is my GA4 bounce rate so much lower than Universal Analytics?

Because the definition changed. UA counted any single-page session as a bounce, regardless of time spent. GA4 only counts sessions under 10 seconds with no second pageview and no conversion as bounces. For content-heavy sites like blogs, this can reduce the reported bounce rate by 40-60+ percentage points. The content did not improve. The formula changed.

What does "engaged session" mean in GA4?

An engaged session is one that meets any of three criteria: lasted longer than 10 seconds, included 2 or more pageviews, or triggered at least one key event (conversion). The events first_visit, first_open, and session_start are excluded even if marked as key events. A session only needs to meet one criterion to be considered engaged.

David Karpik

Founder of Clickport Analytics
Building privacy-focused analytics for website owners who respect their visitors.
