
Bot Management

Clickport automatically detects and blocks bot traffic before it reaches your analytics data. Every incoming event passes through a multi-layer detection system that identifies crawlers, scrapers, monitoring tools, and other non-human visitors. Blocked events are counted separately and surfaced in the Bot Center dashboard so you can see exactly what is being filtered.

A note on detail: this page describes what the bot detection system does, not how it works internally. Publishing the exact detection logic would make it easier for bad actors to craft bots that bypass filtering. If you have questions about a specific detection method, reach out through the contact page.

How bot detection works

When a tracking event arrives at the API, it passes through multiple detection layers. If any layer identifies the visitor as a bot, the event is blocked immediately and never written to your analytics tables.

Detection layers

  1. Client-side checks: The tracker runs several checks in the browser before sending any data. If it detects signs of browser automation or a non-production environment, it silently exits without sending events.
  2. Browser signal analysis: The tracker collects lightweight fingerprint signals and sends them with each event. The server checks for software GPU renderers (SwiftShader, llvmpipe) used by headless Chrome, and for missing browser language preferences. These catch automated browsers that override navigator.webdriver but forget to configure other properties.
  3. User agent pattern matching: The server matches the request's User-Agent header against a curated list of known bot patterns. This covers search engine crawlers, AI bots, SEO tools, social media preview bots, monitoring services, HTTP libraries, and vulnerability scanners.
  4. Datacenter IP detection: The visitor's IP address is checked against a blocklist of datacenter and cloud provider IP ranges. Real visitors browse from residential or mobile networks, not from AWS or Google Cloud. To avoid false positives, known VPN provider ranges are whitelisted so legitimate users on VPNs are not blocked.
  5. Spam referrer filtering: The referrer URL is checked against a maintained list of thousands of known spam domains.
  6. Viewport and header analysis: Additional server-side heuristics examine request metadata for patterns that indicate non-browser clients.
  7. Fingerprint velocity limiting: The server tracks the combination of browser version, operating system, country, and device type per site in a sliding time window. When a single fingerprint generates an unusually high number of sessions, all subsequent events with that fingerprint are blocked. This catches distributed botnets that rotate IP addresses but share a uniform browser configuration.
  8. JS execution timing: The tracker measures how long its own JavaScript takes to initialize. Real browsers need at least a few milliseconds to parse and execute the script. Headless automation that injects pre-compiled scripts can report zero execution time. Events with impossibly fast timing are blocked.
  9. Interaction-based detection: When a session shows deep scroll depth (90%+), short engagement time (under 5 seconds), and zero human interaction (no clicks, taps, or keypresses), it is blocked as automated browsing. This catches content scrapers that simulate scrolling without interacting with the page.
  10. Behavioral scoring: The tracker computes a real-time score from mouse movement variance, scroll velocity patterns, and input timing distribution. Natural human behavior produces varied, unpredictable patterns, while robotic automation tends to be uniform. This score is sent with engagement data for server-side evaluation.
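Several of the layers above boil down to independent predicates evaluated in order, where the first match blocks the event. The following sketch illustrates that shape with three simplified checks; the patterns, IP prefixes, and reason names are illustrative examples, not Clickport's actual (unpublished) rules:

```typescript
// Hypothetical layered bot check: each layer returns a block reason or
// null, and the first matching layer wins. Patterns and prefixes below
// are illustrative, not Clickport's real lists.
interface TrackEvent {
  userAgent: string;
  ip: string;
  initMs: number; // JS execution timing reported by the tracker
}

type Check = (e: TrackEvent) => string | null;

const BOT_UA = /\b(bot|crawler|spider|curl|python-requests)\b/i;
const DATACENTER_PREFIXES = ["3.", "13.", "34.", "35."]; // example cloud ranges

const checks: Check[] = [
  (e) => (BOT_UA.test(e.userAgent) ? "bot_user_agent" : null),
  (e) => (DATACENTER_PREFIXES.some((p) => e.ip.startsWith(p)) ? "datacenter_ip" : null),
  (e) => (e.initMs <= 0 ? "impossible_js_timing" : null), // layer 8: zero init time
];

function detectBot(e: TrackEvent): string | null {
  for (const check of checks) {
    const reason = check(e);
    if (reason !== null) return reason; // block immediately, skip later layers
  }
  return null; // passed every layer
}
```

Once any layer returns a reason, the event is dropped and only the reason is counted, which is why blocked traffic never reaches the analytics tables.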
Example Bot Center summary:

  Bot Protection: Active
  120 bots blocked in last 30 days (6.5% of traffic), plus 22 manually flagged

  Detection breakdown:
    Bot user agents: 82%
    Datacenter IPs: 18%
    No viewport: 1%

Blocklists

The bot detection system relies on multiple blocklists sourced from established open-source projects. These lists are automatically downloaded, cached on the server, and refreshed periodically. If a remote fetch fails, the previous cached version is used as a fallback.
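The refresh-with-fallback behavior can be sketched as follows. The function shape and cache structure are assumptions for illustration, not Clickport internals:

```typescript
// Hypothetical blocklist loader: try a remote refresh, and fall back to
// the last cached copy if the fetch fails. Shapes are illustrative only.
type Blocklist = { entries: string[]; fetchedAt: number };

let cached: Blocklist | null = null;

function loadBlocklist(fetchRemote: () => string[]): Blocklist {
  try {
    cached = { entries: fetchRemote(), fetchedAt: Date.now() }; // refresh cache
    return cached;
  } catch {
    if (cached) return cached; // remote failed: serve the stale copy
    throw new Error("no blocklist available"); // first fetch failed, nothing cached
  }
}
```

The key property is that a failed refresh degrades to stale data rather than to an empty blocklist, so filtering never silently switches off.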

Blocklist Statistics (auto-updated from open-source sources):

  Datacenter IPs: 3,430 ranges (cloud provider IP ranges)
  VPN Ranges: 5,400 ranges (whitelisted; excluded from datacenter blocking)
  Spam Domains: 2,322 domains (known referrer spam sources)
  Bot UA Patterns: 79 patterns (built-in pattern list)

Bot Center dashboard

The Bot Center is a tab inside the Settings panel on the dashboard. It shows a summary of bot blocking activity for the last 30 days and provides visibility into what is being filtered.

What the Bot Center shows

The Bot Center summarizes the bots blocked over the last 30 days, a breakdown by detection method, the current sizes of the blocklists, the Traffic Quality section described below, and any sessions you have flagged manually.

Note: Bot statistics are stored with a 90-day retention period. Older data is automatically cleaned up, so the Bot Center always reflects recent activity.

Traffic Quality

Not all bot traffic can be blocked at ingestion. Sophisticated bots that execute JavaScript, use real browser user agents, and route through residential proxies can bypass detection entirely. This is an industry-wide problem that affects every analytics tool.

The Traffic Quality section in the Bot Center measures the percentage of sessions with zero engagement: no scroll, no clicks, no time on page, and only a single pageview. A color-coded card shows the health of your traffic at a glance.

Below the card, a per-device breakdown shows the zero-engagement rate for Desktop, Mobile, and Tablet separately. Click any device to expand it and see the breakdown by screen size (XS through XXL). This is often the most revealing signal: bots that spoof a device type tend to cluster on a single screen size.
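The zero-engagement metric can be computed from per-session engagement fields roughly like this. The field names are assumptions for illustration, not Clickport's schema:

```typescript
// Hypothetical zero-engagement computation: a session counts as
// zero-engagement when it has no scroll, no clicks, no time on page,
// and only a single pageview. Field names are illustrative.
type Session = {
  device: string;
  scrollDepth: number;
  clicks: number;
  seconds: number;
  pageviews: number;
};

const isZeroEngagement = (s: Session) =>
  s.scrollDepth === 0 && s.clicks === 0 && s.seconds === 0 && s.pageviews === 1;

function zeroEngagementRateByDevice(sessions: Session[]): Map<string, number> {
  const totals = new Map<string, { zero: number; all: number }>();
  for (const s of sessions) {
    const t = totals.get(s.device) ?? { zero: 0, all: 0 };
    t.all += 1;
    if (isZeroEngagement(s)) t.zero += 1;
    totals.set(s.device, t);
  }
  // Convert raw counts to a percentage per device type
  const rates = new Map<string, number>();
  totals.forEach((t, device) => rates.set(device, (100 * t.zero) / t.all));
  return rates;
}
```

Grouping by device (and then by screen size) is what makes the clustering patterns described below visible.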

Traffic Quality: Healthy site

  18.4% zero-engagement sessions (2,694 of 14,641)
  Desktop: 14.2% (189 of 1,331)
  Mobile: 19.1% (2,412 of 12,638)
    XS (<576): 17%
    SM (576-767): 21%
  Tablet: 13.8% (93 of 672)

When traffic quality is healthy, zero-engagement rates are similar across all device types (typically 15-25%) and consistent across screen sizes within each device. This is normal bounce behavior: real visitors who land on a page and leave before engaging.

Traffic Quality: Bot-contaminated site

  64.7% zero-engagement sessions (23,089 of 35,680)
  Desktop: 42.9% (10,800 of 25,202)
    LG (992-1199): 31%
    XL (1200-1399): 68%
    XXL (1400+): 14%
  Mobile: 97.6% (10,050 of 10,302)
    XS (<576): 98%
    SM (576-767): 96%
  Tablet: 21.9% (48 of 219)

The contrast between device types is the key signal. Here, Desktop has 43% zero-engagement while Mobile has 98%. Tablet remains at a healthy 22%. Expanding Desktop reveals the bots cluster on XL (1200-1399) screens at 68%, while LG and XXL are much lower. This pattern reveals that the mobile traffic is almost entirely non-human, and the desktop bots all spoof a single resolution. Sophisticated bots often spoof mobile device profiles because they are easier to fake than desktop environments.

Traffic Quality is powered by Clickport's engagement tracking (scroll depth, time on page, click detection). These are client-side signals that only a JavaScript-based analytics tool can capture, making this a uniquely useful metric for assessing traffic authenticity.

Manual session flagging

Sometimes a bot slips through automated detection. When you spot a suspicious session in the Sessions panel, you can manually flag it as a bot.

How flagging works

  1. Open the Sessions panel from the Content tab on the dashboard.
  2. Find the suspicious session. Look for signs like: a single pageview with zero scroll depth, very short duration, or a datacenter-like browsing pattern.
  3. Expand the session to see its detail view with page-by-page data.
  4. Click Flag as Bot in the session actions.
Session detail with Flag as Bot action:

  Entry page: /pricing
  Duration: 0s
  Scroll depth: 0%
  Pageviews: 1
  Browser: Chrome 120
  Country: US
  Actions: Flag as Bot | Delete Session

When you flag a session as a bot, Clickport marks it internally so that all dashboard queries exclude it. The session's events are also updated to prevent them from appearing in page-level panels. This means the flagged session is effectively removed from all dashboard metrics retroactively.

You can also delete a session entirely using the Delete Session action. This permanently removes both the session record and all its associated events from ClickHouse.

Important: Flagging a session as a bot is permanent; there is no "unflag" action. If you flag a session by mistake, the only recourse is to remove its data entirely with the Delete Session action. In most cases, flagging is the preferred approach because it preserves the data for auditing while excluding it from your analytics.

IP exclusion

In addition to bot detection, you can exclude specific IP addresses from tracking entirely. This is useful for filtering out your own visits and those of your team. Excluded IPs are checked at ingestion time before any event processing occurs.

IP exclusion is managed in the Dashboard tab of the Settings panel. The interface shows your current IP address and lets you add or remove it with a single click. All excluded IP addresses are stored in your site settings and take effect immediately.

Unlike bot flagging, IP exclusion is preventive. Events from excluded IPs are silently dropped before they enter the analytics pipeline. No data is recorded and no bot statistics are counted for these requests.
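Conceptually, the ingestion-time guard is a simple set lookup that runs before anything else. This is a sketch; the function name and result values are assumptions:

```typescript
// Hypothetical ingestion guard: events from excluded IPs are silently
// dropped before any processing. No event is recorded and no bot
// statistics are counted for these requests.
type IngestResult = "dropped" | "accepted";

function ingest(ip: string, excludedIps: Set<string>): IngestResult {
  if (excludedIps.has(ip)) return "dropped"; // silent: no event, no bot stats
  return "accepted"; // continues into bot detection and processing
}
```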

Client-side detection

The tracker includes its own first line of defense before any data is sent to the server. It checks for common signs of browser automation, headless environments, and non-production contexts. If any check triggers, the tracker silently exits without sending events.

The tracker also does not fire on localhost or local file URLs, so development and testing environments never generate tracking data.
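The early-exit logic amounts to a set of environment checks run before any network call. The specific checks below are examples drawn from this page (automation flag, localhost, file URLs); the tracker's full list is intentionally not published:

```typescript
// Hypothetical client-side early exit: if any check flags automation or
// a non-production context, skip tracking entirely. These checks are
// illustrative examples, not the tracker's actual (unpublished) list.
type Env = { webdriver: boolean; hostname: string; protocol: string };

function shouldTrack(env: Env): boolean {
  const checks = [
    () => env.webdriver, // navigator.webdriver set by automation tools
    () => env.hostname === "localhost" || env.hostname === "127.0.0.1",
    () => env.protocol === "file:", // local file URLs
  ];
  return !checks.some((flagged) => flagged()); // any hit: silently exit
}
```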

In addition to the early-exit checks, the tracker collects three signals that are sent to the server for analysis: the lightweight browser fingerprint (layer 2), script initialization timing (layer 8), and the behavioral score (layer 10).

How blocked events are recorded

When a bot is detected at the server level, the event is not inserted into your analytics tables. Instead, the detection is counted and categorized so the Bot Center can display aggregate statistics. Blocked events are stored with the detection method, the source that triggered the block (e.g., the bot name or IP provider), and a daily counter. This data is retained for 90 days and cleaned up automatically.
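A minimal sketch of that aggregation, assuming a counter keyed by (day, method, source); the key format and function name are assumptions for illustration:

```typescript
// Hypothetical daily counter for blocked events, keyed by detection
// method, triggering source, and UTC day. Nothing from the event body
// is stored, only the aggregate count per key.
const counters = new Map<string, number>();

function recordBlocked(method: string, source: string, when: Date): void {
  const day = when.toISOString().slice(0, 10); // YYYY-MM-DD (UTC)
  const key = `${day}|${method}|${source}`;
  counters.set(key, (counters.get(key) ?? 0) + 1);
}
```

Because only counters are kept, the 90-day cleanup just deletes rows older than the retention window without touching analytics data.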