Clickport
Start free trial

Bot protection that actually works

51% of web traffic is bots. Most analytics tools use a single check to filter them. Clickport uses 10 layers and shows you exactly what it caught.

10-Layer Detection

Every request runs through 10 checks before it reaches your dashboard

Four checks run in the browser before any data is sent. Six more run server-side when the event arrives. If any single layer flags a request, it is blocked or tagged.

This is not a static blocklist. The system combines client-side signals like GPU rendering, browser languages, and execution timing with server-side checks like datacenter IP matching, fingerprint velocity, and interaction analysis.

  • Headless browsers caught by WebDriver detection before they send a single event
  • Distributed scrapers caught by fingerprint velocity: 100+ events from the same browser profile in 10 minutes
  • AI crawlers identified by user agent pattern matching across 79+ known bot signatures
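The fingerprint-velocity rule above can be sketched as a sliding-window counter. This is an illustrative sketch using the thresholds quoted on this page (100+ events in 10 minutes); the data structure and function names are assumptions, not Clickport's actual implementation.

```typescript
// Sliding-window velocity check: block a browser profile that sends
// 100+ events within a 10-minute window. Thresholds mirror the page copy;
// the in-memory Map is an illustrative assumption.

const WINDOW_MS = 10 * 60 * 1000; // 10-minute sliding window
const MAX_EVENTS = 100;           // 100+ events in the window => blocked

const seen = new Map<string, number[]>(); // fingerprint -> event timestamps

function recordEvent(fingerprint: string, now: number): boolean {
  const times = seen.get(fingerprint) ?? [];
  // Drop timestamps that have aged out of the window.
  const fresh = times.filter((t) => now - t < WINDOW_MS);
  fresh.push(now);
  seen.set(fingerprint, fresh);
  return fresh.length >= MAX_EVENTS; // true => velocity limit hit
}
```

A real deployment would keep these counters in shared storage (e.g. Redis) rather than process memory, but the windowing logic is the same.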
Detection Pipeline

Client-side (browser):
1. WebDriver detection: navigator.webdriver
2. Software GPU check: SwiftShader / llvmpipe
3. Browser language count: languages = 0
4. Execution timing: 0 ms = automation

Server-side (API):
5. User agent patterns: 79+ signatures
6. Datacenter IP ranges: 3,430 CIDR ranges
7. Spam referrer filter: 2,322 domains
8. Fingerprint velocity: 100+ events / 10 min = blocked
9. Interaction analysis: scroll + no interaction
10. Behavioral scoring: mouse, scroll, input
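The four client-side layers can be sketched as pure functions over a signals object. This sketch reads from a plain interface instead of the live `navigator`/WebGL APIs so it runs anywhere; the field names and flag labels are illustrative assumptions, not Clickport's API.

```typescript
// Sketch of the four client-side checks against a captured signals object.
// Field names are illustrative assumptions.

interface ClientSignals {
  webdriver: boolean;      // navigator.webdriver
  gpuRenderer: string;     // WebGL renderer string
  languageCount: number;   // navigator.languages.length
  checkDurationMs: number; // measured duration of a small timing probe
}

function clientSideFlags(s: ClientSignals): string[] {
  const flags: string[] = [];
  if (s.webdriver) flags.push("webdriver");                 // layer 1
  if (/swiftshader|llvmpipe/i.test(s.gpuRenderer)) {
    flags.push("software-gpu");                             // layer 2
  }
  if (s.languageCount === 0) flags.push("no-languages");    // layer 3
  if (s.checkDurationMs === 0) flags.push("zero-timing");   // layer 4
  return flags;
}
```

Because any single flagged layer blocks or tags the request, the caller only needs `clientSideFlags(signals).length > 0`.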
AI Bot Tracking

Know exactly which AI bots crawl your site

Clickport identifies and catalogs 57 AI bots across three categories: live retrieval (ChatGPT, Claude, Perplexity searching your site in real time), search indexing (AI-powered search engines building their index), and model training (bots scraping content for training data).

The AI Crawlers section in your Bot Center shows which bots visited, how many requests they made, and when they last crawled. AI crawler activity is tracked separately from regular bot filtering, so you always have visibility into it.

  • See if GPTBot is training on your content, even if you blocked it in robots.txt
  • Track which AI search engines are indexing your pages for live answers
  • Monitor crawl frequency and volume per bot over time
AI Crawlers: 57 bots tracked

Live Retrieval (15 bots): ChatGPT-User, ClaudeBot, PerplexityBot, Meta-ExternalAgent, AmazonBot, Devin
Search Indexing (15 bots): OAI-SearchBot, Claude-SearchBot, PerplexityBot, Applebot-Extended, GoogleOther
Model Training (25 bots): GPTBot, Google-Extended, FacebookBot, Bytespider, CCBot, cohere-ai
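Categorizing a crawler comes down to matching its user agent against a catalog keyed by category. The sketch below uses a tiny illustrative excerpt of the bots listed above, not the full 57-bot catalog, and the function name is an assumption.

```typescript
// Sketch: classify an AI crawler into one of the three categories by
// user-agent substring. Pattern lists are small illustrative excerpts.

type Category = "live-retrieval" | "search-indexing" | "model-training" | null;

const AI_BOTS: Record<Exclude<Category, null>, string[]> = {
  "live-retrieval": ["ChatGPT-User", "ClaudeBot", "PerplexityBot"],
  "search-indexing": ["OAI-SearchBot", "Applebot-Extended", "GoogleOther"],
  "model-training": ["GPTBot", "Google-Extended", "Bytespider", "CCBot"],
};

function classifyAiBot(userAgent: string): Category {
  for (const [category, patterns] of Object.entries(AI_BOTS)) {
    if (patterns.some((p) => userAgent.includes(p))) {
      return category as Category; // first category with a matching pattern
    }
  }
  return null; // not a known AI crawler
}
```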
Traffic Quality

A single score that tells you if your traffic is real

The Traffic Quality metric measures the percentage of sessions with zero engagement: no scroll, no clicks, minimal time on page. A low percentage means clean traffic. A high percentage means something is off.

Drill into the breakdown by device type and screen size. Bots that spoof mobile devices tend to cluster on specific resolutions. When you see 80% zero-engagement on one screen size but 12% on others, you have found the problem.

  • Green (<25%): healthy traffic. Most visitors are real.
  • Yellow (25-40%): worth investigating. Check sources and screen sizes.
  • Red (>40%): significant bot activity. Time to dig into sessions.
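The metric described above can be sketched as a filter plus a percentage, bucketed into the three bands. The session shape and the "minimal time on page" cutoff (under 2 seconds here) are illustrative assumptions.

```typescript
// Sketch of the Traffic Quality score: the share of sessions with zero
// engagement, bucketed into the green/yellow/red bands above.

interface Session {
  scrollDepth: number;  // 0..100 (%)
  clicks: number;
  timeOnPageSec: number;
}

function isZeroEngagement(s: Session): boolean {
  // No scroll, no clicks, minimal time on page (cutoff is an assumption).
  return s.scrollDepth === 0 && s.clicks === 0 && s.timeOnPageSec < 2;
}

function trafficQuality(sessions: Session[]): { pct: number; band: string } {
  const zero = sessions.filter(isZeroEngagement).length;
  const pct = sessions.length ? (zero / sessions.length) * 100 : 0;
  const band = pct < 25 ? "green" : pct <= 40 ? "yellow" : "red";
  return { pct, band };
}
```

Grouping the same computation by device type or screen size yields the breakdown shown in the card below.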
Traffic Quality: 18% zero-engagement
82% of sessions show real engagement (scroll, clicks, or time on page)

By device: Desktop 12% · Mobile 22% · Tablet 45%
Tablet breakdown by screen size: 768x1024 82% · 810x1080 9% · 834x1194 11%
Manual Controls

Flag, delete, or block. You stay in control.

Automated detection catches the majority, but sometimes you spot something the system missed. Open any session, review the journey, and flag it as a bot with one click. Flagged sessions are permanently excluded from all dashboards and metrics, retroactively.

For known bad actors, add their IP to the exclusion list. Future requests from that IP are silently dropped before they ever reach your data. No events, no sessions, no noise.

  • Flag a suspicious session: excluded everywhere, immediately, permanently
  • Delete a session entirely: complete removal from your data
  • IP exclusion list: preventive blocking at the request level
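Dropping excluded IPs at the request level typically means testing each address against CIDR ranges before any event is stored. This is a minimal IPv4-only sketch; the function names and the exclusion entry (a reserved documentation range) are illustrative, not Clickport's implementation.

```typescript
// Sketch of request-level IP exclusion: convert IPv4 addresses to integers
// and test them against CIDR ranges, dropping matches before storage.

function ipToInt(ip: string): number {
  return ip.split(".").reduce((acc, oct) => acc * 256 + Number(oct), 0);
}

function inCidr(ip: string, cidr: string): boolean {
  const [base, bitsStr] = cidr.split("/");
  const bits = Number(bitsStr);
  // Build the network mask for the prefix length, e.g. /24 -> 0xFFFFFF00.
  const mask = bits === 0 ? 0 : (-1 << (32 - bits)) >>> 0;
  return ((ipToInt(ip) & mask) >>> 0) === ((ipToInt(base) & mask) >>> 0);
}

// Illustrative exclusion list entry (reserved documentation range).
const exclusionList = ["203.0.113.0/24"];

function shouldDrop(ip: string): boolean {
  return exclusionList.some((cidr) => inCidr(ip, cidr));
}
```

The same `inCidr` test is how a request would be checked against the 3,430 datacenter ranges in the blocklist.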
Sessions
🇺🇸 /pricing → /features → /register · Chrome 124 · Desktop · 3:42 · Scroll 86%
🇷🇺 / · Chrome 119 · Desktop · 0:01 · Scroll 0%
🇩🇪 /blog/seo-guide → /blog/tracking · Firefox 126 · Desktop · 5:18 · Scroll 92%
🇨🇳 /wp-login.php · Python-urllib · Desktop · 0:00 · Scroll 0%

Always up to date, always fast.

Blocklist Infrastructure
Datacenter IP ranges: 3,430
Spam referrer domains: 2,322
Bot UA patterns: 79+
VPN ranges whitelisted: 5,430
Auto-refreshed weekly. 7-day cache TTL.
Blocklists

Curated and auto-refreshed

Datacenter IPs, spam referrers, and bot signatures are sourced from maintained community lists and updated automatically every week. VPN ranges are whitelisted separately so real users behind VPNs are never blocked.

Performance Impact
Tracker size (gzipped): < 2.2 KB
Signal overhead per event: < 300 bytes
Client-side checks: < 1 ms
Server-side detection: < 0.5 ms
Lighthouse impact: none
All detection runs inline with normal event processing. No extra requests, no external dependencies, no performance penalty.
Zero overhead

10 checks, zero performance cost

All bot detection runs inside the existing tracker and API pipeline. No additional scripts, no third-party services, no network requests. Your visitors never notice it.

Explore more features

Engagement
Separate real readers from noise
Scroll depth, time on page, and copy detection. Know who actually read your content.
Sessions
Inspect any visitor's journey
Full session timeline with pages, events, and engagement. Flag or delete suspicious sessions.
Privacy
Privacy is the architecture
No cookies, no consent banners, no personal data. GDPR-compliant from the first request.

See what's really hitting your site

Start free trial Read the docs
No credit card required. Set up in under 2 minutes.