How accurate is your GA4? Free data loss estimator.

See how much traffic ad blockers, cookie consent, and GA4's own internal limits are hiding from your reports, plus how much of what's left is likely bots. Every number is based on published 2026 benchmarks and cited.

Published sources per driver · 100% client-side · Takes 20 seconds
Your site
Three inputs estimate your loss. Advanced mode lets you override regional and device defaults.
1k · 10k · 100k · 1M · 10M
What GA4 currently reports. The calculator estimates how many more you would see without the blind spots below.
Controls the consent-rejection baseline. EU markets reject more since the EDPB-driven banner redesigns.
Controls the ad-blocker baseline. Technical audiences block ~2x more than general consumer sites.
Cookie banner present. Off means no consent banner (most US sites); On means a GDPR-style banner.
Override defaults
Mobile users block analytics ~half as often as desktop users. Higher mobile share drops your ad-blocker loss.
Start free trial
You're likely missing 55% of your real traffic.
Ad blockers remove 20% of visitors directly (ecom baseline 20% + EU-mid modifier 0pt). uBlock Origin, Brave Shields, and DuckDuckGo block google-analytics.com by default via EasyPrivacy. Source: Backlinko/GWI 2025 (29.5% global, 31-33% in DE/US/PL/AT; independent field experiments have reported up to 58% on technical audiences).
GA4 internal limits drop 5 to 10 percent of what's left: thresholding blanks demographics, the "(other)" row swallows high-cardinality values, "(not set)" attribution loses source/medium, UTM refreshes split sessions. Calculator uses the 7% midpoint. Sources: GA4 data thresholds, GA4 cardinality, GA4 bot filtering.
About 62,000 sessions missing per month. Range 47 - 64%.
Bot pollution in what GA4 does show
12% of your reported GA4 traffic is likely bots that slipped past GA4's IAB filter.
Likely bot-type breakdown
AI training crawlers 15%
Headless browser scrapers 35%
Residential-proxy bots 25%
Referral spam 25%
E-commerce sites see price scrapers, inventory scrapers, and checkout fraud bots more than AI crawlers.

See your real numbers without the gaps

Clickport is visit-first analytics that is not blocked by EasyPrivacy, does not need a cookie banner for most sites, and does not over-count returning Safari users as new. 30-day free trial, no credit card.

Start free trial

What this calculator shows

Google Analytics 4 reports a number. That number is the ceiling, not the floor. Between the user's browser and GA4's servers, three separate filters delete real traffic (ad blockers, cookie consent rejection, GA4's internal limits), and a fourth layer pollutes what's left (bots GA4's built-in filter misses). This tool estimates how big each gap is for your specific site, using published 2026 benchmarks for each driver.

The hero bar splits your real traffic into four segments: the portion GA4 actually sees plus the three loss layers. Below the hero, the bot-pollution block shows a per-industry breakdown of what GA4 does show that is likely not human. Flip the toggle to With Clickport to see what the same site looks like through a first-party tracker that isn't blocked, doesn't need a consent banner for most deployments, and filters bots with 30+ signals instead of the IAB list.

The number in the hero is a range, not a measurement. Individual site results vary with audience, geography, and implementation. If you want the real number, install Clickport alongside GA4 for a week and compare.

How much data does Google Analytics actually miss?

On a US tech audience with no consent banner, GA4 typically misses 35 to 50 percent of real visitors. On an EU e-commerce site with a compliant banner, 45 to 60 percent. Across all audiences and regions the published range runs from about 15 percent (US lifestyle, no consent banner, general audience) up to 65 percent (German developer audience, strict banner, desktop-heavy mix).

Independent field experiments have measured GA missing up to 58 percent of visitors on technical audiences, where ad blocker installation rates run two to three times higher than the baseline. The calculator uses the same logic but scales it by your industry and region, so a lifestyle site with a general audience lands much lower.

Ad blockers: 18 to 40 percent of traffic, depending on audience

The simplest filter. A visitor installs uBlock Origin, enables Brave Shields, or uses DuckDuckGo, and the GA4 script never loads. The EasyPrivacy filter list, shipped by default in every major blocker, blocks google-analytics.com, googletagmanager.com, and stats.g.doubleclick.net. No script load, no measurement, no data.

Per Backlinko's 2025 ad-blocker report synthesizing GWI data, 29.5 percent of global internet users now use ad blockers. Country rates: Indonesia 40.1 percent, Austria 32.7 percent, US 32.5 percent, Poland 32.5 percent, Germany 31.5 percent, UK 28.5 percent. Desktop users block more than mobile (37 vs 15 percent in US). Technical audiences block 2-3x the baseline.

Server-side Google Tag Manager was supposed to fix this. It doesn't. Per DataUnlocker's 2024 analysis, "nearly 80 percent of widely used blocker software can detect and block server-side GTM," with gains in the single-digit percentages. uBlock Origin added first-party CNAME detection in Firefox in 2020. The filter-list maintainers continue to add common sGTM URL patterns as they appear.

Cookie consent: up to 55 percent loss in strict-EU markets

Second filter. When a visitor in the EU lands on your site and clicks "Reject all" on your cookie banner, Google Consent Mode v2 drops the GA4 hit or sends a cookieless ping with drastically reduced fidelity. The dimension-level user behavior is gone either way.

Didomi's 2026 benchmark shows vendor-optimized banners getting 75 to 87 percent consent in Europe. But that's with banners designed to push the accept button. Academic studies of genuinely neutral EDPB-compliant banners show a very different number: Bielova et al. at USENIX Security measured 34 to 47 percent rejection on transparent designs, and 50 to 60 percent on banners where Accept and Reject are equally prominent. Germany's share of banners with equally-visible reject buttons rose from 27 percent (2023) to 52 percent (2025) per etracker, forced by the EDPB Cookie Banner Taskforce and a wave of DPA enforcement.

Google's counter-argument is Consent Mode v2 behavioral modeling. The reality: per Google's own docs, modeling requires at least 1,000 daily events with analytics_storage='denied' for 7 consecutive days AND at least 1,000 daily users with granted storage for 7 of the past 28 days. Most small and mid-sized sites never qualify. And even when they do, modeled data reconstructs totals, not which page or source or user. At v1 this calculator assumes zero modeled recovery. Conditional recovery logic is a v2 consideration.
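
The two thresholds reduce to a simple eligibility check. Here is a minimal sketch of the rule as Google's docs describe it; this is an illustrative model, not Google's implementation, and all function and parameter names are hypothetical:

```python
# Illustrative eligibility check for Consent Mode v2 behavioral modeling,
# based on the thresholds in Google's documentation. Names are hypothetical.

def modeling_eligible(daily_denied_events, daily_granted_users):
    """daily_denied_events: per-day counts of events with
    analytics_storage='denied' over the last 7 days.
    daily_granted_users: per-day user counts with granted storage
    over the last 28 days."""
    # At least 1,000 denied events on each of 7 consecutive days...
    denied_ok = len(daily_denied_events) >= 7 and all(
        n >= 1_000 for n in daily_denied_events[-7:]
    )
    # ...AND at least 1,000 granted users on 7 of the past 28 days.
    granted_ok = sum(1 for n in daily_granted_users[-28:] if n >= 1_000) >= 7
    return denied_ok and granted_ok

# A small site with ~300 denied events/day never qualifies:
print(modeling_eligible([300] * 7, [1_200] * 28))  # False
```

Run your own daily counts through a check like this and the "most small and mid-sized sites never qualify" claim becomes easy to verify for your site.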

Safari ITP: why iOS users look like new users every week

Worth understanding even though the calculator doesn't fold it into the loss number: Safari ITP is distortion, not deletion. Apple's Intelligent Tracking Prevention (ITP) caps certain first-party cookies at 7 days when it detects CNAME cloaking or third-party IPs. Any site using server-side GTM with a custom subdomain like metrics.example.com is affected, unless the IP prefix matches the main domain exactly. So a returning Safari user shows up as a brand-new client_id every 7 days, inflating "new users" and degrading attribution.

Safari now accounts for roughly 18 percent of global browser share per StatCounter, 32 percent on US desktop, and over 50 percent of US mobile. On US-focused sites, roughly 12 percent of GA4 sessions end up over-counted as new users because of this. iCloud Private Relay on top of that coarsens IP geolocation for 5 to 15 percent of iOS Safari sessions. Stape's detailed writeup of Safari 16.4's CNAME detection explains the mechanism. The calculator keeps this out of the headline loss number because it's an over-count, not a missing-sessions problem.

GA4 internal limits: thresholding, sampling, and (not set)

Third filter. Even when a visitor makes it through ad blockers and consent rejection, GA4 itself drops or distorts more data internally.

Thresholding blanks demographic and search-term rows when the cohort is too small (no published threshold; practitioners see it below ~50 users per row). Analytics Mania has a good walkthrough.

Cardinality. Google's guidance is 500 unique values per dimension per day. Sites with many product pages regularly see over half of pageviews collapse into a single "(other)" row.

"(not set)" attribution. Consent Mode v2 cookieless pings, UTM-stripping by in-app browsers (Meta, TikTok, LinkedIn), and ITP cookie caps all route to "(not set)". Typical EU SMB sees 15 to 30 percent of sessions with "(not set)" source/medium.

Data retention default is 2 months (max 14 months in standard GA4). Year-over-year comparisons in Explorations are impossible without BigQuery export.

Bot pollution: the other direction

Everything above makes GA4 count too few real visitors. Bot pollution pushes the other way: GA4 counts bots as real visitors because its filter misses most of what arrives at a modern site.

GA4's only built-in bot defense is the IAB/ABC International Spiders and Bots List, a curated roster of commercial crawlers and known scrapers. The filter is always on, with no toggle, and no count is exposed. It does a reasonable job on the old commercial bot fleet. It does nothing about the four bot categories that dominate modern traffic: AI training crawlers (GPTBot, ClaudeBot, PerplexityBot, and the many less-polite ones), headless browser scrapers built with Playwright or Puppeteer, residential-proxy bots that mimic real users, and referral spam campaigns that fake direct visits.

The calculator estimates the size and composition of your bot pollution in the Bot block below the hero. The headline percentage is industry-adjusted with a size modifier (small sites get hit proportionally harder because bot volume is roughly fixed while the denominator is not). The per-type breakdown underneath shifts with your industry: publishers and tech blogs see AI training crawlers dominate (50% and 40% of their bot pollution respectively); finance sites see residential-proxy bots lead (35%) because fraud bots mimic real users; e-commerce sites get hit hardest by headless scrapers (price/inventory/checkout fraud); consumer lifestyle sites attract more referral-spam campaigns than technical scraping.

Bot pollution is separate from the loss drivers and not added into the missing-sessions total. It answers a different question: of what GA4 does show you, how much is real?

Why is my GA4 lower than Search Console?

Common symptom of the three loss drivers above. Search Console counts clicks at Google's edge, before any browser-side filtering. GA4 counts sessions after the script has loaded, cookies have been set, and consent has been granted. A typical healthy site sees Search Console clicks at 1.3 to 2.5x GA4 organic sessions, with tech sites and EU sites at the top of that range.

Other real causes: Search Console counts clicks, GA4 counts sessions (same user clicks twice and bounces = 2 Search Console clicks, 1 GA4 session); some clicks land on redirected URLs that GA4 doesn't track; attribution-window and timezone mismatches. But if your delta is above 2x, the loss drivers above are doing most of the work.
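
The delta check above can be run as back-of-envelope arithmetic. A hedged sketch (not an official metric; thresholds mirror the ranges stated in this section, and the names are illustrative):

```python
# Compare Search Console clicks to GA4 organic sessions and classify
# the delta against the 1.3x / 2x landmarks described above.

def sc_to_ga4_ratio(sc_clicks, ga4_organic_sessions):
    ratio = sc_clicks / ga4_organic_sessions
    if ratio <= 1.3:
        verdict = "within the normal click-to-session gap"
    elif ratio <= 2.0:
        verdict = "typical range; some measurable loss likely"
    else:
        verdict = "above 2x: ad blockers, consent rejection, and GA4 limits likely dominate"
    return ratio, verdict

# Example: 12,000 SC clicks vs 4,800 GA4 organic sessions.
ratio, verdict = sc_to_ga4_ratio(12_000, 4_800)
print(ratio)  # 2.5
```

Anything the check flags above 2x is worth investigating with the loss drivers in mind before blaming tracking bugs.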

How to recover this data

Flip the toggle at the top of the calculator from With Google Analytics to With Clickport to see the zero-loss version of the same site. The three options below are the trade-offs of getting there — the first two patch GA4, the third replaces it.

Server-side GTM recovers single-digit to low-double-digit percentages of ad-blocker loss per independent analyses, but does not beat Safari ITP and does not qualify small sites for Consent Mode v2 modeling. Hosting runs 90 to 500 dollars per month on Google Cloud Run. Setup is 50 to 120 engineering hours per published consultancy estimates, plus 10 to 20 dev hours of ongoing maintenance per month.

Consent Mode v2 behavioral modeling requires at least 1,000 daily denied events for 7 consecutive days plus 1,000 daily granted users for 7 of the past 28 days. Most small and mid-sized sites never qualify. Accuracy degrades below a 20 percent consent rate per Google's own docs.

A first-party analytics tracker that isn't blocked in the first place. Clickport serves the tracker from the same origin as your site, so EasyPrivacy has nothing to block. It counts sessions without third-party cookies, does not need a consent banner for most deployments, and classifies bots with a 30+ signal engine rather than the IAB list. The loss drivers above apply to GA4 because of how GA4 is architected, not because analytics itself has to work that way.

Methodology and sources

Every number the calculator uses comes from a published benchmark. The formula composes the three loss drivers as surviving fractions (not additive percentages), because a session blocked by an ad blocker cannot also be lost to consent rejection. Bot pollution is computed separately as a distortion metric and is not added into the missing-sessions total — it answers a different question (of what GA4 does show you, how much is real?).

Formula

surviving_fraction = (1 - adblock_rate) x (1 - consent_denial_rate) x (1 - internal_gap)
observed_sessions  = actual_sessions x surviving_fraction
lost_sessions      = actual_sessions - observed_sessions
lost_percentage    = lost_sessions / actual_sessions

Worked example

EU e-commerce site, banner on, 60,000 GA4-reported sessions per month:

adblock_rate         = 0.20   (ecom baseline, EU-mid modifier 0pt)
consent_denial_rate  = 0.40   (EU-mid region, banner on)
internal_gap         = 0.07   (mid of 5-10% noise floor)

surviving_fraction   = (1 - 0.20) x (1 - 0.40) x (1 - 0.07)
                     = 0.80 x 0.60 x 0.93
                     = 0.446

observed_sessions    = 60,000   (what GA4 shows you)
actual_sessions      = observed_sessions / surviving_fraction = 134,500
lost_sessions        = 134,500 - 60,000 = 74,500
lost_percentage      = 74,500 / 134,500 = 55.4%
range (+/- 15%)      ~ 47% - 64%
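
The formula and worked example above translate directly into a few lines of code. A minimal sketch; the function and key names are illustrative, not the calculator's actual source:

```python
# Minimal sketch of the loss model above. Names are illustrative.

def estimate_loss(reported_sessions, adblock_rate, consent_denial_rate,
                  internal_gap=0.07):
    # Drivers compose as surviving fractions, not additive percentages:
    # a session blocked by an ad blocker cannot also be lost to consent.
    surviving = (1 - adblock_rate) * (1 - consent_denial_rate) * (1 - internal_gap)
    actual = reported_sessions / surviving   # invert GA4's reported count
    lost = actual - reported_sessions
    return {
        "surviving_fraction": surviving,
        "actual_sessions": actual,
        "lost_sessions": lost,
        "lost_percentage": lost / actual,
    }

# Worked example: EU e-commerce site, banner on, 60,000 reported sessions.
result = estimate_loss(60_000, adblock_rate=0.20, consent_denial_rate=0.40)
print(round(result["lost_percentage"] * 100, 1))  # 55.4
```

Note the inversion: the function takes GA4's observed count and divides by the surviving fraction to recover the estimated actual sessions, exactly as in the worked example.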

Bot pollution is computed separately from the loss formula. The headline bot percentage is an industry default (publisher 25% / tech 20% / SaaS 15% / finance 18% / e-commerce 12% / lifestyle 10%) with a size modifier (+5pt for sites under 10k monthly sessions, −5pt for sites over 500k). The per-type breakdown (AI crawlers / headless scrapers / residential-proxy bots / referral spam) is industry-specific and sums to 100% of the bot-pollution slice.
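
The headline bot percentage reduces to a table lookup plus a size adjustment. A sketch using the defaults stated above; the names are illustrative, not the calculator's actual source:

```python
# Industry defaults and size modifier for the bot-pollution headline,
# mirroring the values stated above. Names are illustrative.

INDUSTRY_BOT_BASELINE = {
    "publisher": 0.25, "tech": 0.20, "finance": 0.18,
    "saas": 0.15, "ecommerce": 0.12, "lifestyle": 0.10,
}

def bot_pollution(industry, monthly_sessions):
    rate = INDUSTRY_BOT_BASELINE[industry]
    # Small sites get hit proportionally harder: bot volume is roughly
    # fixed while the human denominator shrinks.
    if monthly_sessions < 10_000:
        rate += 0.05
    elif monthly_sessions > 500_000:
        rate -= 0.05
    return rate

print(bot_pollution("ecommerce", 60_000))  # 0.12
```

The per-type breakdown (AI crawlers / headless scrapers / residential-proxy bots / referral spam) would be a second industry-keyed table whose values sum to 100% of this slice.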

Sources by driver: ad blocker rates, cookie consent rejection, Safari ITP, GA4 internal limits, bot pollution.

Methodology updated April 2026.

The gap is real. Here's what the unblocked number looks like.

Run Clickport alongside GA4 for 30 days. Same site, same visitors, different ceiling. No cookie banner needed for most sites.

Start free trial

FAQ

Is this calculator free? Do I need an account?

Yes, completely free. No signup, no email, no account. Everything runs in your browser. We do not store the URLs or traffic values you enter.

How accurate is this estimate?

This is an estimate, not a measurement. The calculator produces a range (plus or minus 15% of the midpoint) based on 2026 published benchmarks for ad blocker adoption, cookie consent rejection, GA4 internal limits, and bot-traffic reports from Imperva, Cloudflare, and Akamai. Individual sites will vary with their specific audience, banner design, and GA4 configuration. If you want the real number, install a second analytics tool alongside GA4 and compare for a week.

Does Consent Mode v2 fix this?

Partially, and only for sites over a threshold. Per Google's own docs, behavioral modeling requires at least 1,000 daily denied events plus 1,000 daily granted events for 7 consecutive days. Sites under that threshold get no modeled recovery. Even when modeling kicks in, it reconstructs totals and conversions, not per-page or per-user behavior. This calculator assumes 0% recovery at v1.

Why is my GA4 lower than Search Console?

Search Console counts clicks at Google's edge, before any browser-side filtering. GA4 counts sessions after the tracking script has loaded. A typical delta of 1.3x to 2.5x is normal (click-to-session gap plus redirects plus timezone differences). Anything above 2x is the three loss drivers this calculator models: ad blockers, cookie consent rejection, and GA4's internal limits.

What percentage of visitors block GA4?

Depends entirely on audience. Global baseline: 29.5% per Backlinko/GWI 2025. US: 32.5%. Germany, Austria, Poland: 31-33%. UK: 28.5%. Technical audiences (developers, early adopters) run 40-60%. General-consumer lifestyle sites run closer to 15-20%. This calculator uses your selected industry and region to pick a defensible midpoint.

Does server-side Google Tag Manager fix ad blocker loss?

Not really. Per DataUnlocker's 2024 analysis, "nearly 80% of widely used blocker software can detect and block server-side GTM, even in custom setups," with single-digit percentage gains. uBlock Origin's first-party CNAME detection in Firefox and filter-list maintainers adding sGTM URL patterns have largely closed the loophole. It also doesn't beat Safari ITP or unlock Consent Mode v2 modeling for small sites.

How much of my GA4 traffic is actually bots?

Depends on your industry and site size. Publishers and tech blogs take the heaviest AI-crawler hits, and small sites get hit proportionally harder because the bot volume is roughly fixed. Typical ranges: tech / dev blogs 20-30%, publishers 25-35%, SaaS / B2B 15-20%, e-commerce 8-15%, consumer lifestyle 8-12%. GA4's IAB-list filter catches the old commercial bot fleet but misses headless Chrome, AI crawlers (GPTBot, ClaudeBot, PerplexityBot), residential-proxy scrapers, and referral spam. Clickport's 30+ signal bot detection engine catches these; GA4's does not.

Does Safari block Google Analytics?

Safari doesn't block GA4 outright, but Intelligent Tracking Prevention distorts the data. Since Safari 16.4, first-party cookies from CNAME-cloaked setups are capped at 7 days. A returning visitor becomes a new client_id every week, inflating "new users" and breaking multi-session attribution. Roughly 8-12% of sessions on a typical site are affected by ITP distortion.

Can I trust the result for a blog post, podcast, or pitch deck?

The percentages are defensible estimates, not guarantees. Every source is cited in the methodology section. If you quote the number, link back to the calculator so readers can run their own inputs. Do not claim the number as a precise measurement of your site. If you need precision, install a second analytics tool for direct comparison.
Disclaimer. This estimator produces a range, not a measurement. Figures are derived from published industry benchmarks on ad-blocker adoption, cookie-consent decline rates, bot traffic share, and Google Analytics 4's documented behavior. Sources are listed in the methodology section. Individual site results will vary based on audience, geography, consent banner implementation, and traffic composition. This tool is provided for informational purposes only, without warranty of any kind. Google Analytics and GA4 are trademarks of Google LLC; Clickport is not affiliated with or endorsed by Google. We do not store the URLs or traffic values you enter.