
Beyond Pageviews: The Engagement Metrics That Actually Matter

You're looking at your analytics dashboard. Visitors: up 23%. Pageviews: up 31%. Revenue: flat. Leads: flat. Nothing is moving.

This is the most common problem in web analytics, and it has a name: the post-click blind spot. The entire industry is optimized for getting people to click. Almost nobody measures what happens after.

I'm David, founder of Clickport Analytics. I've spent two years building a tool that tracks what visitors actually do on your site: how far they scroll, how long they stay, and what they click. And the more I dig into the data, the more convinced I am that most site owners are measuring the wrong things entirely.

This article is everything I've found. Not surface-level advice about adding internal links, but the real picture: how Google secretly grades your content on engagement, why your analytics tool is lying to you, what AI search changes about all of this, and the specific metrics that actually tell you whether your content is working.

The post-click blind spot

Here is the most uncomfortable finding in web analytics: 53% of all visitors bounce after viewing a single page. Contentsquare analyzed 90 billion sessions across 6,000 websites and found that more than half of your "visitors" never make it past the first page. Across their publisher network in Q1 2025, average engaged time was just 28 seconds.

It's getting worse, not better. Contentsquare's 2026 benchmark shows overall engagement fell 10% year-over-year, with fewer pages per session and less scroll. And 40% of all online visits are plagued by user frustration that leads to abandoned sessions.

This means two things your pageview counter will never tell you:

  1. Most of your "traffic" never engages with your content
  2. The problem is accelerating as attention fragments across more channels and AI answers

Meanwhile, companies are spending more than ever to drive traffic that converts less. Contentsquare's 2025 benchmark report (90 billion sessions across 6,000 websites) found that brands increased digital ad spend by 13.2% while conversion rates dropped 6.1%. Cost per visit surged 9% in a single year. Paid social traffic specifically drove 9.2% higher bounce rates and 10.6% lower conversions compared to organic.

The traffic number went up. The business results went down. And most dashboards showed only the first half of that story.

The traffic-outcome gap
What your dashboard shows: +23% visitors · +31% pageviews · +18% ad spend
What actually happened: 0% leads · 0% revenue · -6.1% conversion rate
Source: Contentsquare 2025 Digital Experience Benchmarks (90B sessions, 6,000 websites)

As Avinash Kaushik, Google's former analytics evangelist, put it: web analytics tools "incentivize one-night stands rather than engagements matching customer intent," leading site owners to "insanely expect all visitors to convert right away."

The fix is not more traffic. It's understanding what your traffic actually does. (If you want the full breakdown on how different tools define bounce rate, I wrote a deep dive on bounce rate measurement that covers the three different calculation methods.)

Google is grading you on a test you can't see

Here is something Google does not want you to know: they measure how satisfied users are with your page, and they use it to rank you. They have been doing this since 2005. And they spent years publicly denying it.

The system is called NavBoost. It was revealed during the U.S. Department of Justice antitrust trial in October 2023, when Google VP of Search Pandu Nayak testified under oath that "NavBoost is one of the important signals that we have."

NavBoost tracks three types of clicks from Chrome browser data: good clicks (visits where the searcher appears satisfied and stays), bad clicks (visits where the searcher quickly returns to the results page), and last longest clicks (the final result a searcher settles on for a query).

The system stores 13 months of aggregated click data and feeds it directly into ranking.

What makes this remarkable is the gap between what Google said publicly and what came out in court. Gary Illyes, a Google Search Relations lead, called dwell time and CTR "generally made up crap." John Mueller said pogo-sticking is not a ranking signal.

But under oath, former Google Distinguished Engineer Eric Lehman testified: "Pretty much everyone knows we're using clicks in rankings." He also admitted: "We try to avoid confirming that we use user data in the ranking of search results."

Then in May 2024, a massive leak of Google's internal API documentation confirmed the specific mechanisms: NavBoost fields including goodClicks, badClicks, lastLongestClicks, Chrome traffic data (chromeInTotal), and a siteAuthority score.

What Google said vs. what Google does
Public statement: "Dwell time, CTR... generally made up crap." - Gary Illyes, Google
Under oath (DOJ trial, 2023): "Pretty much everyone knows we're using clicks in rankings." - Eric Lehman, former Google Distinguished Engineer
Under oath (DOJ trial, 2023): "NavBoost is one of the important signals that we have." - Pandu Nayak, VP of Google Search
Sources: Search Engine Land (Lehman), Search Engine Land (Nayak), API Leak Coverage

Google also revealed their primary quality metric during the trial: the Information Satisfaction (IS) score. It is computed from human raters who evaluate whether search results satisfy user intent, and it trains machine learning systems like RankBrain and BERT.

The bottom line: Google is ranking you based on whether visitors are satisfied with your page. They can see this through Chrome data and search behavior signals. You cannot see any of this in your own analytics unless you track engagement.

You are being graded on user satisfaction, but most analytics dashboards only show you the attendance sheet.

Your content ranks but doesn't work

Here is a stat that should change how you think about content: 96.55% of all pages indexed by Google get zero organic traffic. Ahrefs found this across a study of nearly 14 billion web pages. Only 3.45% of published content gets any search traffic at all.

But the opposite problem is just as common and harder to spot: content that ranks fine but doesn't do anything for the business. Traffic is up, rankings are stable, conversions are flat.

The SEO industry has a name for this now: content-market fit. Borrowed from the startup world's "product-market fit," it asks a deceptively simple question: does your content actually match what the searcher needs? Not what they typed, but what they need.

A page can rank for a keyword and still fail to match intent. KP Playbook documented a case where a client's blog post had massive organic traffic but only a 27% engagement rate. It was ranking for competitor pricing keywords. The visitors were looking for a competitor's prices and leaving instantly. The traffic looked great. It was useless.

Google's March 2024 core update explicitly targeted this problem, absorbing the Helpful Content System into core ranking. The update's stated goal: reduce "low-quality, unoriginal content" by 40%. In practice, 837 sites were completely deindexed, wiping out 20 million monthly organic visits.

The pattern in the data is clear. Sites that were pruning thin content and investing in depth survived. Sites running on volume did not.

Backlinko documented a case where two articles competing for the same keyword were consolidated via 301 redirect, resulting in a 466% increase in clicks. A vehicle valuation platform deleted 4.86 million pages (down to 1,500) and saw a 160% increase in organic visits and 105% increase in conversions.

Less content, better matched to intent, measured by engagement. That is what works.

Content-market fit: why traffic and conversions diverge
High traffic, wrong intent (blog post ranks for "competitor pricing"): 12,400 organic visitors/mo · 27% engagement rate · 18% avg. scroll depth · 3 conversions
Lower traffic, matched intent (comparison guide for your actual audience): 1,800 organic visitors/mo · 71% engagement rate · 64% avg. scroll depth · 47 conversions
Illustrative example based on real patterns from KP Playbook and Grow and Convert case studies

Grow and Convert's "Pain Point SEO" approach provides hard numbers on this. By targeting bottom-of-funnel, high-intent keywords instead of high-volume informational ones, they measured a 4.78% conversion rate compared to 0.19% for top-of-funnel content. That is a 2,400% difference. Their clients see 150+ demo requests per month from this approach, with 80% of pages ranking on page one.

The lesson: a page with 1,800 visitors that converts 47 of them is worth more than a page with 12,000 visitors that converts 3. You only see this if you measure engagement.

What AI search changes

Everything above was about Google. Now add a second search engine to the picture: AI.

ChatGPT, Perplexity, Google's AI Overviews, and Claude are all answering questions that used to drive clicks to your site. And they choose sources very differently from traditional search.

Kevin Indig's research found that brand popularity was the single greatest predictor for mentions in ChatGPT. Sites with 32,000+ referring domains are 3.5x more likely to be cited than those with under 200. But here's the nuance: 85% of brand mentions in LLMs originate from third-party pages, not your own website. What others say about you matters more than what you say about yourself.

Content structure matters enormously. Semrush's AI Overviews study found that AI-cited articles cover 62% more facts than non-cited ones. Pages with 15 or more recognized entities show 4.8x higher citation probability. Structured content with schema markup and sequential headings correlates with 2.8x more citations.

And there's a critical finding about where in your content the citation comes from: 44.2% of LLM citations pull from the first 30% of the article. 31.1% from the middle. 24.7% from the conclusion. If your key answer is buried at the bottom, AI will never cite it.

Meanwhile, the gap between traditional rankings and AI citations is widening. In July 2025, 76% of AI Overview citations came from the top-10 organic results. By February 2026, that overlap dropped to roughly 17-38%. Nearly half of AI Overview content now comes from pages ranking below position 5.

What makes content citable by AI
Factual density: +62% more facts than non-cited content
Entity coverage: 4.8x citation probability at 15+ entities
Schema + headings: 2.8x more citations
Front-loaded answers: 44% of citations come from the first 30%
Updated within 6 months: 53% of cited content
Sources: Semrush, Koanthic, Semrush AI SEO

This shift has massive implications for how you think about content. Traditional SEO rewarded pages that ranked for a keyword, regardless of depth. AI search rewards content that deeply satisfies a query. If human visitors don't stay and engage with your content, AI won't reference it either.

And the zero-click problem is accelerating. 60% of Google searches now end without a click to any website. On mobile: 77%. When AI Overviews appear, the zero-click rate jumps to 83%. Organic click-through rates fell to 40.3% in the US between March 2024 and March 2025.

As Rand Fishkin argued: "In a zero-click world, traffic is a terrible goal." The question is no longer "how many people can we get to our site?" It's "when someone does arrive, did we answer their question so well that both humans and AI remember us?"

The metrics that actually matter

If pageviews and sessions are not enough, what should you measure? Here are the specific engagement signals that tell you whether your content is matching intent.

Scroll depth reveals whether people see your content at all. Nielsen Norman Group's eyetracking research found that 57% of page viewing time is spent above the fold, and 74% within the first two screenfuls. In an earlier study, 75% of users didn't even realize they could scroll. For long-form content (2,000+ words), 60-80% average scroll depth is strong. Below 25% signals a serious problem.

Engaged time shows whether visitors actually read or just landed and left. The cross-industry average is 54 seconds per pageview, but this varies hugely by content type. Blog posts that hold readers for 3-5 minutes are performing well. For top-10 Google results, average dwell time is 3 minutes 10 seconds. The key insight from Chartbeat: if you hold a visitor for 3 minutes, they are 2x as likely to return compared to holding them for 1 minute.

Return visit rate is the strongest signal of content-market fit. Chartbeat found that visitors who view 2 pages are 2.75x more likely to return than those viewing 1 page. Return rate jumps from 8% (1 page) to 22% (2 pages). At 11+ pages, return rate reaches 74%. In e-commerce, returning visitors convert 3-5x higher than new visitors and spend 20% more per order.

Session depth (pages per session) indicates whether visitors are exploring or bouncing. Each additional page viewed correlates with an 18% increase in session duration. Organic search drives the deepest sessions. Paid social has the lowest browsing depth with 8.7% fewer pageviews per session.

Form and click interactions reveal micro-conversions that raw traffic metrics miss. The average form abandonment rate is 67%. Pages with 5 or fewer form fields convert 120% better than those with more. Tracking outbound link clicks, copy-text events, and download actions gives you a much richer picture of visitor intent than pageviews alone.
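
As an illustration, most of these micro-conversions can be captured with a few lines of event delegation in the browser. The sketch below uses a placeholder `sendEvent` function standing in for whatever reporting call your analytics tool actually exposes; it is not a specific vendor's API.

```typescript
// Sketch: capture a few micro-conversions with plain event delegation.
// `sendEvent` is a stand-in for your analytics reporting call.
declare function sendEvent(name: string, params: Record<string, unknown>): void;

// Outbound link clicks and file downloads
document.addEventListener("click", (e) => {
  const link = (e.target as HTMLElement).closest("a");
  if (!link || !link.href) return;
  const url = new URL(link.href, location.href);
  if (url.host !== location.host) {
    sendEvent("outbound_click", { to: url.href });
  } else if (/\.(pdf|zip|csv|xlsx)$/i.test(url.pathname)) {
    sendEvent("download", { file: url.pathname });
  }
});

// Copy-text events (often a strong "this was useful" signal)
document.addEventListener("copy", () => {
  const text = window.getSelection()?.toString() ?? "";
  if (text.length > 20) sendEvent("copy_text", { length: text.length });
});
```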

The engagement scorecard: benchmarks that matter

Metric                      Problem    Okay       Strong
Scroll depth (long-form)    <25%       25-55%     60-80%+
Engaged time (blog)         <30s       30s-2m     3-5m+
Return visit rate           <8%        8-20%      20%+
Pages per session           1.0        1.5-2.5    3+
Engagement rate             <40%       40-60%     60%+

Sources: NNGroup, Chartbeat, Arvo Digital, Focus Digital

This is what it looks like in practice. Here's a Clickport dashboard showing all six KPIs at a glance for a real site. No setup, no custom events, no Tag Manager. Every metric tracked automatically from the first pageview.

Clickport KPI bar (last 30 days): Visitors 2,847 (+12%) · Pageviews 8,103 (+8%) · Views/Visit 2.85 (+4%) · Bounce 32% (-6%) · Duration 2:34 (+18%) · Scroll 58% (+9%)

Bounce, Duration, and Scroll are tracked automatically. No Tag Manager, no custom events, no configuration. See how engagement tracking works

A word of caution. Avinash Kaushik made the sharpest criticism of engagement metrics years ago: "Engagement is not a metric, it's an excuse." His point was that using "engagement" as a vague umbrella term lets teams avoid the hard work of tying metrics to business outcomes.

He's right. Scroll depth by itself is meaningless. Scroll depth combined with conversion rate by content type tells you which articles drive business results. The individual metrics above only work when you combine them and connect them to outcomes. A dashboard full of green engagement numbers that doesn't lead to revenue is just a different flavor of vanity metric.

Your analytics are lying to you

Even if you're tracking the right metrics, there's a good chance your numbers are wrong.

Plausible Analytics ran a comparison study and found that GA4 underreports engagement time by an average of 54.7%, with some pages showing 80% underreporting. The reason: GA4 excludes bounced visits from time calculations and handles tab-switching differently from tools that measure active focus time.

But the bigger problem is not how GA4 calculates. It's who GA4 sees.

In the EU, GDPR requires cookie consent before Google Analytics can track a visitor. (Eight countries have ruled GA4 illegal outright.) When accept and reject buttons are equally prominent (as the law requires), 60-70% of users in Germany and France reject. A 2024 Advance Metrics study found that even neutrally designed banners see a 34% rejection rate.

The UK's Information Commissioner's Office (ICO), the country's own privacy regulator, lost 90.8% of its tracked traffic after implementing its own best-practice consent banner. From 119,417 daily tracked users down to 10,967.

On top of consent, ad blockers remove another 15-30% of GA4 data. Safari's Intelligent Tracking Prevention caps cookie lifespan to 7 days. Stack these together and the numbers are stark.

GA4 data loss on an EU website: where your visitors disappear

Real visitors       10,000
- Cookie rejected    5,500 remain (~45% reject consent in the EU, averaged across compliant banners)
- Ad blocked         4,200 remain (~24% of the rest use ad blockers that block GA)
- Safari ITP         3,700 remain (~12% of the rest have degraded tracking from the 7-day cookie cap)
GA4 sees            ~3,700 of 10,000 (37%)

Sources: CookieYes, Usermaven, Stape, Plausible

And there's a subtler problem: consent bias. The users who accept cookies are not a random sample. They tend to be more engaged, more trusting, or from different demographics than those who decline. This means GA4 data doesn't just have missing volume. It has systematic distortion. Engagement rates appear inflated. Bounce rates look artificially low. Your content performance data is skewed toward an unrepresentative subset of your audience.

GA4 Consent Mode v2 attempts to fix this with behavioral modeling, but it requires 1,000+ daily events from both consenting and non-consenting users for 7 consecutive days before modeling kicks in. Most small and medium sites never qualify. Even when it works, it recovers roughly 70% of lost attribution paths. The rest is misattributed to Direct traffic or lost entirely.

The alternative is straightforward: analytics tools that don't use cookies and don't collect personal data don't need consent. France's CNIL has certified 18 analytics solutions as exempt from cookie consent requirements. These tools see 100% of visitors regardless of privacy preferences, browser settings, or consent banner interaction. No sampling, no modeling, no gaps. The engagement data you get from 100% of your traffic is fundamentally different from the data you get from 37%. (For a broader look at why GA4 is becoming indefensible in 2026, I wrote a separate piece covering the legal, accuracy, and cost issues.)

Here's what individual session data looks like in Clickport. Every visitor is tracked, no cookie consent required, and you can drill into exactly what each person did on your site.

Clickport session drilldown (no cookies needed):

13.3. 14:22-14:29 · Google · DE · Desktop · Chrome: Duration 6:47 · Scroll 91% · 4 pages
  /blog/privacy-guide   3:12 · 91% scroll
  /pricing              1:51 · 88%
  /how-it-works         1:04 · 72%
  /register             0:40 · 45%

13.3. 09:14-09:14 · Facebook · US · Mobile · Safari: Duration 0:06 · Scroll 8% · 1 page
  /blog/privacy-guide   0:06 · 8% scroll

Same page, two visitors, completely different intent. The Google visitor read the full article and signed up. The Facebook visitor bounced in 6 seconds. Traffic metrics count both as "1 visit." Learn about session tracking

What good looks like

Theory is useful. Real numbers are better. Here are companies and creators that shifted from traffic metrics to engagement metrics and what happened.

Wil Reynolds, founder of Seer Interactive, reported at SEO Week 2025 that his company saw a 41% decrease in organic traffic in 2024. That sounds like a disaster. But simultaneously they saw a 65% increase in newsletter signups and no negative impact on revenue or leads. Social media generated 89% less traffic than search but produced a 20% increase in leads and 125 more conversions.

His takeaway: "Take the stairs. Do the hard, human work of building trust, telling better stories, and creating content you'd actually stand behind in front of your peers." Traffic dropped. The business grew. Because the traffic they lost was not the traffic that mattered.

Netflix moved their core content metric from counting plays to cumulative hours watched. They discovered that 80% of streaming comes from homepage recommendations, not search. This shift from "did someone click" to "did someone stay" became the foundation of their personalization engine and their primary growth driver.

Grow and Convert built their entire agency model around the insight that bottom-of-funnel content converts at 4.78% versus 0.19% for top-of-funnel. They deliberately target lower-volume, higher-intent keywords and measure leads, not traffic. Their clients see 150+ demo requests per month with 80% of pages ranking on page one.

The common pattern: all three stopped optimizing for reach and started optimizing for resonance. They measured whether content drove the outcome, not whether it drove a click.

Traffic-first vs. engagement-first: how the dashboard changes

Traffic-first dashboard: Visitors 48,291 · Pageviews 124,830 · Bounce rate 52% · Session duration 1:47
Looks healthy. But which content works? Which visitors matter? No way to tell.

Engagement-first dashboard: Visitors (100% tracked) 48,291 · Avg. scroll depth 62% · Avg. engaged time 2:41 · Return visitors 22% · Conversions from engaged visitors 312
Visitors are reading deep, staying long, and coming back. Content is working.

The numbers above are the same site, same month. The difference is what you choose to measure.

Orbit Media's Content Performance Matrix provides a useful framework for acting on this data. Plot every piece of content on two axes: traffic and engagement. You end up with four quadrants:

  1. Pursue: high traffic, high engagement. Double down on what already works.
  2. Promote: low traffic, high engagement. The content resonates; it needs distribution.
  3. Polish: high traffic, low engagement. Visitors arrive but the content misses their intent.
  4. Pass: low traffic, low engagement. Prune, consolidate, or redirect.

Most content strategies focus exclusively on the top half (traffic). The engagement axis is where the real signal lives.

How to start measuring answers today

You don't need to overhaul your entire analytics setup overnight. Here are the specific changes that have the highest impact.

Track scroll milestones. GA4 only auto-tracks 90% scroll depth, which is almost useless (it tells you who read to the very bottom, not what happened in between). Set up events at 25%, 50%, and 75% thresholds. This requires custom configuration through Google Tag Manager or switching to a tool that tracks scroll depth automatically.
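
If you want a sense of what that custom configuration involves, here is a minimal browser-side sketch that fires milestone events at 25%, 50%, 75%, and 90% scroll. The `sendEvent` function is again a placeholder for your own reporting mechanism (a gtag call, a dataLayer push, or a custom endpoint); the rest is standard DOM APIs.

```typescript
// Minimal scroll-depth milestone tracker.
// `sendEvent` is a placeholder for your analytics call, not a real API.
declare function sendEvent(name: string, params: Record<string, unknown>): void;

const milestones = [25, 50, 75, 90];
const fired = new Set<number>();

function currentScrollDepth(): number {
  const scrollable = document.documentElement.scrollHeight - window.innerHeight;
  if (scrollable <= 0) return 100; // page fits on one screen
  return Math.round((window.scrollY / scrollable) * 100);
}

window.addEventListener(
  "scroll",
  () => {
    const depth = currentScrollDepth();
    for (const m of milestones) {
      if (depth >= m && !fired.has(m)) {
        fired.add(m); // each milestone fires once per pageview
        sendEvent("scroll_depth", { percent: m, page: location.pathname });
      }
    }
  },
  { passive: true }
);
```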

Set a meaningful time threshold. GA4's default "engaged session" threshold is 10 seconds. That is absurdly low for content sites. If your average blog post is 2,000 words, 10 seconds means someone read about 50 words. Consider raising the threshold or switching to a tool that measures active focus time (pausing when the tab is in the background) rather than raw elapsed time.
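
A rough sketch of the "active focus time" idea using the Page Visibility API: the clock only runs while the tab is visible and the total is reported when the visitor leaves. As above, `sendEvent` is a stand-in for your own reporting call.

```typescript
// Sketch: accumulate engaged time only while the tab is actually visible.
declare function sendEvent(name: string, params: Record<string, unknown>): void;

let engagedMs = 0;
let visibleSince: number | null =
  document.visibilityState === "visible" ? performance.now() : null;

document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden" && visibleSince !== null) {
    engagedMs += performance.now() - visibleSince; // pause the clock
    visibleSince = null;
  } else if (document.visibilityState === "visible") {
    visibleSince = performance.now(); // resume
  }
});

// Report once when the visitor leaves the page.
window.addEventListener("pagehide", () => {
  if (visibleSince !== null) engagedMs += performance.now() - visibleSince;
  sendEvent("engaged_time", {
    seconds: Math.round(engagedMs / 1000),
    page: location.pathname,
  });
});
```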

Watch for return visits. This is one of the hardest metrics to get right. Cookie-based tools lose accuracy because cookies expire, browsers block them, and people switch devices. Privacy-first tools that rotate identifiers daily can't track cross-day returns at all. The tradeoff is real: without cookies you get complete same-day accuracy but give up cross-day tracking. If your tool does track return visitors, pay attention to the trend: a climbing return rate means your content is building an audience.
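
For context on why cookieless tools can't see cross-day returns: several of them describe deriving the visitor ID from a hash that includes a salt rotated every day, so the same person produces a different ID tomorrow. The sketch below illustrates that general scheme; the exact inputs and salt handling vary by vendor, and none of this is any specific tool's implementation.

```typescript
// Illustration of a daily-rotating, cookieless visitor identifier.
import { createHash, randomBytes } from "node:crypto";

// date -> salt; a real implementation deletes yesterday's salt so old IDs
// can never be recomputed. Kept as a simple in-memory map here.
const dailySalts = new Map<string, string>();

function saltFor(date: string): string {
  let salt = dailySalts.get(date);
  if (!salt) {
    salt = randomBytes(32).toString("hex");
    dailySalts.set(date, salt);
  }
  return salt;
}

// The ID changes whenever the salt does: same-day pageviews group together,
// but yesterday's visitor is indistinguishable from a brand-new one.
export function visitorId(ip: string, userAgent: string, site: string): string {
  const today = new Date().toISOString().slice(0, 10); // e.g. "2026-03-13"
  return createHash("sha256")
    .update(`${saltFor(today)}${ip}${userAgent}${site}`)
    .digest("hex");
}
```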

Classify your content. Use the pursue/promote/polish/pass framework. Pull your top 20 pages by traffic. For each one, check the engagement data: scroll depth, time, and conversion rate. You will almost certainly find pages with thousands of visits and near-zero engagement. Those are your biggest opportunities.
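
Here is a toy version of that sorting step, assuming you can export per-page traffic and engagement numbers from whatever tool you use. The thresholds are made up for illustration; calibrate them against your own site's medians.

```typescript
// Toy classifier for the pursue/promote/polish/pass framework.
interface PageStats {
  path: string;
  visitors: number;       // monthly visitors
  scrollDepth: number;    // average scroll depth, 0-100
  engagedSeconds: number; // average engaged time in seconds
}

type Quadrant = "pursue" | "promote" | "polish" | "pass";

function classify(page: PageStats, trafficMedian: number): Quadrant {
  const highTraffic = page.visitors >= trafficMedian;
  const engaged = page.scrollDepth >= 50 && page.engagedSeconds >= 60;
  if (highTraffic && engaged) return "pursue";
  if (!highTraffic && engaged) return "promote";
  if (highTraffic && !engaged) return "polish";
  return "pass";
}

// Using the /blog/seo-tools-list numbers from the Pages panel shown further down:
// plenty of traffic, almost no engagement.
classify({ path: "/blog/seo-tools-list", visitors: 3417, scrollDepth: 14, engagedSeconds: 22 }, 900);
// -> "polish"
```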

Compare traffic sources by engagement, not volume. The paid social channel driving 5,000 visitors with 12% engagement and 0.3% conversion is worth less than the organic channel driving 800 visitors with 68% engagement and 4.2% conversion. This comparison is invisible if you only look at traffic.
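
Run the arithmetic and the point is stark: 5,000 visits at 0.3% is 15 conversions, while 800 visits at 4.2% is about 34.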

The tools exist to measure all of this today. GA4 can do most of it with custom configuration. Microsoft Clarity offers free session recordings and a basic intent classification. Privacy-friendly tools like Clickport track scroll depth, engaged time, outbound clicks, form interactions, copy detection, and downloads out of the box, without cookies and without consent banners. You can also set up custom goals to track specific conversion actions tied to engagement thresholds.

Here's what this looks like in practice. Clickport's Pages panel shows engagement data alongside traffic for every page, so you can instantly spot the pursue/promote/polish/pass pattern.

Clickport Pages panel (engagement per page):

Page                     Visitors   Bounce   Duration   Scroll
/blog/privacy-guide         1,204      24%       4:12      71%
/features                     892      41%       1:38      64%
/blog/seo-tools-list        3,417      73%       0:22      14%
/blog/analytics-setup         186      19%       5:47      82%

Row 1: Pursue (high traffic, high engagement). Row 3: Polish (3,417 visitors but 73% bounce, 14% scroll, wrong intent). Row 4: Promote (low traffic but deep engagement, needs distribution). Explore all panels

The shift is not about buying new software. It's about asking a different question. Instead of "how many people came to my site?" ask "how many people got what they came for?"

The first question gives you a number. The second gives you a business.

Summary

The analytics industry spent 20 years optimizing for clicks. 53% of visitors bounce after a single page. 96.55% of published content gets zero traffic. 60% of searches end without a click at all. The traffic game is not just noisy. It is structurally broken.

Google is already measuring what matters. NavBoost tracks whether visitors are satisfied with your page. The IS score evaluates whether your content answers the question. AI search engines cite content based on depth, not rank. The platforms have moved on. Most dashboards haven't.

The fix is measurable and specific. Track scroll depth to see if people read. Track engaged time to see if they stayed. Track return visits to see if they came back. Track conversions per content piece to see what drives business results. Compare traffic sources by engagement quality, not volume. And make sure your analytics tool actually sees 100% of your visitors, not the 37% who happened to click "accept" on a cookie banner.

Stop counting visitors. Start measuring answers.

If you want to see what your engagement data looks like from 100% of your traffic, try Clickport free for 30 days. No cookies, no consent banners, no credit card required. See how it works or check out the full comparison with Google Analytics.

David Karpik

Founder of Clickport Analytics
Building privacy-focused analytics for website owners who respect their visitors.
