Overview
Bots account for a significant portion of web traffic. While some bots are beneficial (such as search engine crawlers), others are malicious: scrapers, credential stuffers, inventory hoarders, and DDoS agents. Detecting bots requires a layered approach combining server-side and client-side signals.
Client-Side Detection Techniques
WebDriver API
The simplest check: navigator.webdriver returns true when the browser is controlled by automation tools like Selenium, Puppeteer, or Playwright. Modern bots try to delete or override this property, but the tampering itself leaves residual artifacts, for example an own-property descriptor on the navigator instance where a pristine browser has none.
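A minimal sketch of this check might look like the following. The `nav` parameter stands in for the real `window.navigator` so the logic can be exercised anywhere; the return labels are illustrative, not part of any standard API.

```javascript
// Sketch: flag both the webdriver flag itself and traces of tampering.
// `nav` stands in for window.navigator so the logic is testable anywhere.
function checkWebdriver(nav) {
  // Direct signal: automation frameworks set navigator.webdriver to true.
  if (nav.webdriver === true) return "webdriver-flag";

  // Residual artifact: redefining the property creates an own-property
  // descriptor on the instance, whereas the genuine getter lives on
  // Navigator.prototype and leaves the instance clean.
  const desc = Object.getOwnPropertyDescriptor(nav, "webdriver");
  if (desc !== undefined) return "webdriver-tampered";

  return "clean";
}
```

Checking for the descriptor catches the common evasion of overriding the getter with `Object.defineProperty`, since that override is itself observable.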
Chrome Object Inspection
Real Chrome browsers have a window.chrome object with specific properties (runtime, loadTimes). Headless Chrome often has a missing or incomplete chrome object. Testing for chrome.csi and chrome.loadTimes can reveal automation.
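A sketch of this probe, with `win` standing in for the global `window` object so it runs outside a browser:

```javascript
// Sketch: probe window.chrome for members a real Chrome build exposes.
// `win` stands in for the global window object.
function checkChromeObject(win) {
  const c = win.chrome;
  if (!c) return "missing"; // older headless builds had no chrome object at all
  const expected = ["runtime", "csi", "loadTimes"];
  const absent = expected.filter((k) => !(k in c));
  return absent.length > 0 ? "incomplete" : "present";
}
```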
Plugin and Permission Anomalies
Real browsers report plugins via navigator.plugins. Headless browsers typically report zero plugins. Similarly, Permissions API responses can be inconsistent: headless Chrome has been observed returning "denied" from navigator.permissions.query({name: "notifications"}) while Notification.permission still reads "default", a combination real browsers do not produce.
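The two anomalies can be sketched as a single function over plain values, so the logic runs outside a browser. The parameter names are ours: `pluginCount` stands for `navigator.plugins.length`, `notificationPerm` for `Notification.permission`, and `queriedPerm` for the state returned by `navigator.permissions.query({name: "notifications"})`.

```javascript
// Sketch of the plugin-count and permission-consistency checks.
function permissionAnomalies(pluginCount, notificationPerm, queriedPerm) {
  const anomalies = [];
  if (pluginCount === 0) anomalies.push("no-plugins");
  // Classic headless tell: the Permissions API says "denied" while
  // Notification.permission still reports the untouched "default" state.
  if (notificationPerm === "default" && queriedPerm === "denied") {
    anomalies.push("permission-mismatch");
  }
  return anomalies;
}
```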
Browser Feature Consistency
Bots often fail to fully implement all browser APIs. Checks include: media device enumeration, notification API support, screen dimensions matching viewport, and proper event handling. An inconsistency between claimed capabilities and actual behavior indicates automation.
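A few of these consistency checks, sketched over a plain data object (the field names are ours, mirroring the corresponding browser values); this is illustrative, not an exhaustive list:

```javascript
// Sketch: cross-check claimed capabilities against observed values.
function consistencyFlags(s) {
  const flags = [];
  // Headless sessions frequently enumerate zero media devices.
  if (s.mediaDeviceCount === 0) flags.push("no-media-devices");
  // A viewport larger than the reported screen is physically impossible.
  if (s.innerWidth > s.screenWidth || s.innerHeight > s.screenHeight) {
    flags.push("viewport-exceeds-screen");
  }
  // Claiming touch support while reporting zero touch points is contradictory.
  if (s.claimsTouch && s.maxTouchPoints === 0) flags.push("touch-mismatch");
  return flags;
}
```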
Server-Side Detection Techniques
TLS Fingerprint Mismatch
When a bot claims to be Chrome via User-Agent but uses Python's requests library (which uses OpenSSL instead of BoringSSL), the JA4 TLS fingerprint won't match. This is one of the strongest server-side signals because it cannot be spoofed without replacing the client's entire TLS stack.
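Server-side, the comparison reduces to a lookup. In this sketch the fingerprint strings are placeholders, not real JA4 values; a production table would be built from observed traffic for each browser release.

```javascript
// Sketch: flag a User-Agent claim whose TLS fingerprint is not in the
// known set for that browser. Fingerprint strings are placeholders.
const KNOWN_JA4 = {
  chrome: new Set(["ja4_chrome_example_a", "ja4_chrome_example_b"]),
  firefox: new Set(["ja4_firefox_example_a"]),
};

function tlsMismatch(claimedBrowser, observedJa4) {
  const known = KNOWN_JA4[claimedBrowser];
  if (!known) return false; // no baseline for this browser: cannot judge
  return !known.has(observedJa4); // e.g. claims Chrome, sends a non-Chrome hello
}
```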
Header Order Analysis
Real browsers send HTTP headers in a consistent order specific to each browser engine. Bots using HTTP libraries often have different header orderings or include headers that real browsers don't send.
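One way to score this is to measure how well the received order agrees with a per-browser reference order. The reference list below is illustrative, not a verified capture of any Chrome release:

```javascript
// Sketch: fraction of adjacent header pairs that appear in the expected
// relative order for the claimed browser. Reference order is illustrative.
const CHROME_ORDER = ["host", "connection", "user-agent", "accept",
                      "accept-encoding", "accept-language"];

function headerOrderScore(received, reference = CHROME_ORDER) {
  // Keep only headers the reference knows about, in the order received...
  const seq = received.map((h) => h.toLowerCase())
                      .filter((h) => reference.includes(h));
  // ...then count adjacent pairs that respect the reference ordering.
  let inOrder = 0;
  for (let i = 1; i < seq.length; i++) {
    if (reference.indexOf(seq[i - 1]) < reference.indexOf(seq[i])) inOrder++;
  }
  return seq.length > 1 ? inOrder / (seq.length - 1) : 1;
}
```

A score near 1 means the ordering is consistent with the claimed browser; markedly lower scores suggest an HTTP library assembling headers in its own order.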
TCP/IP Fingerprinting
If the UA claims Windows but the TCP stack reveals Linux (common when bots run on cloud servers), the OS mismatch is a strong bot indicator. See our TCP/IP Fingerprinting guide for details.
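A simplified version of the OS cross-check uses the observed IP TTL: since routers decrement TTL by one per hop, a value at or just below a common default (64 for Linux/macOS/BSD, 128 for Windows) points to that initial value. Real tools combine TTL with window size, options ordering, and more; this sketch uses TTL alone.

```javascript
// Sketch: map an observed TTL to a likely OS family and compare it to the
// OS the User-Agent claims. Thresholds are the common default initial TTLs.
function guessOsFromTtl(ttl) {
  if (ttl <= 64) return "linux"; // also the macOS/*BSD default
  if (ttl <= 128) return "windows";
  return "other";
}

function osMismatch(claimedOs, observedTtl) {
  return guessOsFromTtl(observedTtl) !== claimedOs;
}
```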
HTTP/2 Settings Fingerprinting
Browser HTTP/2 implementations send specific SETTINGS frames with values for HEADER_TABLE_SIZE, MAX_CONCURRENT_STREAMS, INITIAL_WINDOW_SIZE, and MAX_HEADER_LIST_SIZE. These values differ between Chrome, Firefox, Safari, and common HTTP libraries.
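Matching an observed SETTINGS frame against per-browser profiles can be sketched as below. The profile numbers are illustrative placeholders rather than verified captures from current browser builds:

```javascript
// Sketch: find which browser profile (if any) matches the SETTINGS values
// a client sent on connection setup. Profile values are placeholders.
const H2_PROFILES = {
  chrome:  { HEADER_TABLE_SIZE: 65536, MAX_CONCURRENT_STREAMS: 1000,
             INITIAL_WINDOW_SIZE: 6291456 },
  firefox: { HEADER_TABLE_SIZE: 65536, MAX_CONCURRENT_STREAMS: 100,
             INITIAL_WINDOW_SIZE: 131072 },
};

function matchH2Profile(settings) {
  for (const [browser, profile] of Object.entries(H2_PROFILES)) {
    const matches = Object.entries(profile)
      .every(([key, value]) => settings[key] === value);
    if (matches) return browser;
  }
  return null; // no known browser profile: likely an HTTP library
}
```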
Combined Bot Scoring
No single signal is definitive. PrivKit combines 15+ indicators into a weighted bot score:
- Client-side signals are collected via JavaScript
- Server-side signals are extracted from the connection itself
- Weights account for signal reliability and false positive rates
- The final score (0-100) maps to a classification: human, low automation, suspicious, or bot
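The scoring scheme above can be sketched as a weighted sum. The signal names, weights, and classification thresholds here are hypothetical stand-ins, not PrivKit's actual model; a real system tunes weights against labeled traffic.

```javascript
// Sketch of a weighted bot score. Names and weights are hypothetical.
const WEIGHTS = {
  webdriverFlag: 30,      // client-side: navigator.webdriver fired
  tlsMismatch: 25,        // server-side: JA4 vs User-Agent disagreement
  headerOrderAnomaly: 15, // server-side: unexpected header ordering
  osMismatch: 15,         // server-side: TCP stack vs claimed OS
  permissionMismatch: 10, // client-side: Permissions API inconsistency
  noPlugins: 5,           // client-side: empty navigator.plugins
};

function botScore(signals) {
  // Sum the weights of every signal that fired, capped at 100.
  const raw = Object.entries(WEIGHTS)
    .reduce((sum, [name, weight]) => sum + (signals[name] ? weight : 0), 0);
  return Math.min(100, raw);
}

function classify(score) {
  if (score < 25) return "human";
  if (score < 50) return "low automation";
  if (score < 75) return "suspicious";
  return "bot";
}
```

Weighting lets a single strong server-side signal (like a TLS mismatch) dominate several weak client-side ones, which keeps false positives down when a legitimate browser trips one noisy check.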
Test Bot Detection
Visit our Bot Detection page to see how your browser performs against our detection suite. Try it in both a normal browser and a headless/automated one to see the difference.