Why Headless Browsers Get Detected: A Technical Breakdown

March 12, 2026 · By Emirhan YILDIRIM

howtocenterdiv.com — "Software engineering is more than just centering a div."

Puppeteer gives away its identity in at least 11 different ways before it even loads its first page. Most scraping tutorials never mention this, which is why the same script that ran flawlessly on localhost suddenly stops working in production.

People get this wrong: bot detection isn't one if-statement checking one flag. It's a scoring system. Every signal you leak adds weight to a cumulative score, and once that score crosses a threshold, you're done: blocked, hit with a CAPTCHA, or (the cunning one) quietly served bogus data so you never even realize you've been detected.
| Layer | What It Checks | When |
|---|---|---|
| TLS fingerprint (JA3) | Cipher suite order, extensions | TCP handshake, before HTTP |
| HTTP/2 fingerprint | Frame settings, header order | First request |
| navigator properties | webdriver, plugins, languages | JS runtime |
| Canvas / WebGL | Rendering entropy, GPU string | JS runtime |
| Mouse & keyboard | Movement patterns, timing | Behavioral |
| IP reputation | ASN, datacenter range | DNS / IP layer |
Most developers only think about the navigator layer: fixing webdriver and maybe faking the user agent. They don't realize that TLS fingerprinting identifies them before any JavaScript runs.
Detection is cumulative and concurrent. You don't get blocked for failing one check; you get blocked because the combined weight of several small failures pushes the score over the threshold. Getting around navigator.webdriver doesn't save you: if your JA3, canvas fingerprint, and plugin list don't all tell the same story, it doesn't matter that you got one item right.

Signal #1 — navigator.webdriver

```js
console.log(navigator.webdriver);
// Headless → true (instant detection)
// Real browser → undefined
```
The value itself isn't the only thing that matters. Detectors also look at how configurable the property descriptor is; it's like a separate fingerprint.
```js
// Looks like a fix, still detectable
Object.defineProperty(navigator, 'webdriver', { get: () => false });

// What a detector actually sees
Object.getOwnPropertyDescriptor(navigator, 'webdriver');
// → { get: f, set: undefined, enumerable: false, configurable: false }
// In a real browser there's no own descriptor on the instance at all → undefined
```
A typical mistake is patching properties after the page has loaded instead of before it loads. Detection scripts run during initial evaluation, so a late patch arrives after the check has already fired; in Puppeteer that means page.evaluateOnNewDocument(), not page.evaluate().
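To see why the naive patch is itself detectable, here's the descriptor check a detector runs, simulated in plain Node with a stand-in object (Node has no real navigator; the variable names are illustrative):

```js
// Stand-in for `navigator`. In a real browser the webdriver property lives
// on the prototype, so the instance has no own descriptor at all.
const fakeNavigator = {};

// The naive "fix" many tutorials suggest:
Object.defineProperty(fakeNavigator, 'webdriver', { get: () => false });

// What a detection script actually inspects:
const desc = Object.getOwnPropertyDescriptor(fakeNavigator, 'webdriver');

// The mere existence of an own descriptor means the property was patched in.
const looksPatched = desc !== undefined;
console.log(looksPatched); // true — the patch itself is the fingerprint
```

The value you return is irrelevant; the shape of the descriptor gives you away.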

Signal #2 — The plugins Array

```js
navigator.plugins.length;
// Real Chrome → 3–7
// Headless → 0  ← one-line detection

navigator.mimeTypes.length;
// Real Chrome → 2+
// Headless → 0
```
Any detection script can check navigator.plugins.length === 0 and stop there. But injecting a few fake plugins isn't enough either. The plugin names, descriptions, and MIME types must be internally consistent, and they must match the user-agent you declared. If your UA says Chrome 120 on macOS but your plugin list looks like Chrome on Windows, that's a signal too.
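A detector-side consistency check might look like the following sketch. The function name and the check logic are illustrative, not from any real vendor; modern Chrome does ship a small set of built-in PDF plugins, which is the detail this leans on:

```js
// Hypothetical check: do the declared UA and the plugin list tell the
// same story? Real detectors compare far more fields than this.
function pluginsConsistentWithUA(userAgent, pluginNames) {
  const claimsChrome = /Chrome\/\d+/.test(userAgent);
  // Real Chrome exposes well-known built-in PDF plugin names.
  const chromePdfPlugins = ['PDF Viewer', 'Chrome PDF Viewer'];
  if (claimsChrome && pluginNames.length === 0) return false; // headless giveaway
  if (claimsChrome && !pluginNames.some(n => chromePdfPlugins.includes(n))) return false;
  return true;
}

// Headless default: Chrome UA but zero plugins → flagged
console.log(pluginsConsistentWithUA('Mozilla/5.0 ... Chrome/120.0.0.0 Safari/537.36', []));
// → false
```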

Signal #3 — Canvas Fingerprinting

```js
const canvas = document.createElement('canvas');
const ctx = canvas.getContext('2d');
ctx.font = '11pt "Times New Roman"';
ctx.fillText('Cwm fjordbank', 2, 15);
const fingerprint = canvas.toDataURL();
// Real machine → unique hash, varies by hardware
// Headless Chrome → identical hash, every single time
```
Headless Chrome produces the same pixel output for the same code regardless of the hardware it runs on: there is no GPU variance. Detection systems maintain databases of known headless canvas hashes, and yours is in them.

Signal #4 — WebGL Renderer String

```js
const gl = document.createElement('canvas').getContext('webgl');
const info = gl.getExtension('WEBGL_debug_renderer_info');
gl.getParameter(info.UNMASKED_VENDOR_WEBGL);
// Real machine → "Intel Inc." / "NVIDIA Corporation"
gl.getParameter(info.UNMASKED_RENDERER_WEBGL);
// Headless → "Google SwiftShader"  ← banned everywhere
```
SwiftShader is Google's software renderer, built for environments without a GPU. Detection systems everywhere have blacklisted the string. If SwiftShader shows up, you're flagged, no matter what else you've done.
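The corresponding server-side check is essentially a one-liner. The helper name and the exact regex are illustrative; SwiftShader and llvmpipe are both software renderers that real GPUs never report:

```js
// Flag any renderer string that indicates software rendering rather
// than a real GPU. Pattern list is a sketch, not an exhaustive set.
function isSoftwareRenderer(renderer) {
  return /SwiftShader|llvmpipe|Software/i.test(renderer);
}

console.log(isSoftwareRenderer('Google SwiftShader'));      // true → flagged
console.log(isSoftwareRenderer('NVIDIA GeForce RTX 3080')); // false
```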

Signal #5 — TLS / JA3 Fingerprint

The first step of a TLS handshake is the client's ClientHello message, which lists the cipher suites, extensions, and elliptic curves it supports. The order of these items is fixed by the TLS library in use, not by any user-agent string you set.
JA3 = md5(SSLVersion, Ciphers, Extensions, EllipticCurves, ECPointFormats)
| Client | TLS Library | JA3 Hash |
|---|---|---|
| Chrome 120 / macOS | BoringSSL | cd08e31494f9531f560d64c695473da9 |
| Node.js 20 (axios/got) | OpenSSL | b32309a26951912be7dba376398abc3b |
| Python requests | Python ssl | 3b5074b1b5d032e5620f69f9f700ff0e |
You can spoof every navigator property and set User-Agent: Chrome/120; the TLS handshake still announces Node.js before a single line of JavaScript runs. There is no JS fix for this one.

Signal #6 — HTTP/2 Fingerprint

Real Chrome and Node's http2 module send different values in the HTTP/2 SETTINGS frame:
Chrome 120:  HEADER_TABLE_SIZE=65536, ENABLE_PUSH=0, INITIAL_WINDOW_SIZE=6291456
Node.js:     HEADER_TABLE_SIZE=4096,  ENABLE_PUSH=1, INITIAL_WINDOW_SIZE=65535
This is extracted at the load balancer level, before any application logic occurs at all.
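A fingerprinting proxy's comparison can be sketched in a few lines. The numbers come from the listing above; the diff logic is illustrative:

```js
// Known SETTINGS profiles (values from the listing above).
const CHROME_120 = { HEADER_TABLE_SIZE: 65536, ENABLE_PUSH: 0, INITIAL_WINDOW_SIZE: 6291456 };
const NODE_HTTP2 = { HEADER_TABLE_SIZE: 4096,  ENABLE_PUSH: 1, INITIAL_WINDOW_SIZE: 65535 };

// Return the names of every setting that differs from the expected profile.
function settingsMismatches(observed, expected) {
  return Object.keys(expected).filter(k => observed[k] !== expected[k]);
}

// A client claiming Chrome in its UA but speaking Node's defaults:
console.log(settingsMismatches(NODE_HTTP2, CHROME_120));
// → [ 'HEADER_TABLE_SIZE', 'ENABLE_PUSH', 'INITIAL_WINDOW_SIZE' ]
```

Every mismatch adds weight to the risk score before any page content is even served.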

Signal #7 — Strange Behavior and Timing

```js
// Bot movement
mousemove: (100,200) (400,200) (400,500)    // flawless L-shapes, instantly

// Human movement
mousemove: (100,200) (138,213) (201,228)... // curved, variable pace
```
Detection systems track more than mouse movement. They also watch inter-keystroke timing (typically 50–200 milliseconds for humans), scroll behavior, and how long you stay on a page before acting. A bot that clicks 80 milliseconds after page load is a bot.
Adding Math.random() delays doesn't help. A straight-line mouse path with randomized timing is still a straight-line mouse path.
Detection systems create entropy scores for whole sessions, not just for one occurrence. Low entropy means automation. High entropy means human. That's why some bots make it past the first test and then get tagged 30 seconds later. The score builds up over time, not all at once.
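A toy version of such an entropy metric: measure how much the path's direction varies between consecutive segments. A bot's straight line scores exactly zero; a jittered human path scores higher. The formula, the paths, and any threshold are illustrative, not a real vendor's model:

```js
// Variance of segment headings along a mouse path. Zero means every
// segment points the same way — a perfectly straight line.
function directionVariance(points) {
  const angles = [];
  for (let i = 1; i < points.length; i++) {
    angles.push(Math.atan2(points[i][1] - points[i - 1][1],
                           points[i][0] - points[i - 1][0]));
  }
  const mean = angles.reduce((a, b) => a + b, 0) / angles.length;
  return angles.reduce((s, a) => s + (a - mean) ** 2, 0) / angles.length;
}

const botPath   = [[100, 200], [200, 200], [300, 200], [400, 200]]; // perfect line
const humanPath = [[100, 200], [138, 213], [201, 228], [260, 261]]; // curved

console.log(directionVariance(botPath));   // 0 — no variation at all
console.log(directionVariance(humanPath)); // > 0
```

Randomized delays change when the points arrive, not the geometry, which is why Math.random() alone doesn't move this score.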

The Mistake Matrix

| Mistake | Why It Fails |
|---|---|
| Only patching navigator.webdriver | 10+ signals still leak |
| Using got/axios with spoofed headers | JA3 still says Node.js |
| No --disable-blink-features=AutomationControlled | window.chrome exposes automation flag |
| Datacenter proxies (AWS/GCP/Azure) | ASN is blacklisted before any fingerprint check |
| User-Agent without matching sec-ch-ua | Header contradiction, caught immediately |
| Math.random() delays only | Timing variance isn't behavioral entropy |

The Risk Score Model

All checks run at the same time, and the scores are added up:
```
IP reputation:        +0.1  (clean, residential)
JA3 mismatch:         +0.6  (Node.js TLS on Chrome UA)
navigator.webdriver:  +0.0  (fixed appropriately)
Canvas hash:          +0.4  (a known headless hash)
Plugin count:         +0.3  (no plugins)
Mouse entropy:        +0.5  (moving in a straight line)
─────────────────────────────────────────────
Total: 1.9  →  Block threshold: 1.5
```
You did great on navigator. It doesn't matter — TLS and canvas alone pushed the score over the edge.
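The additive model above can be sketched in a few lines. The weights and threshold mirror the example; real systems tune both per site and per signal:

```js
// Illustrative weights — mirroring the worked example above.
const signals = {
  ipReputation: 0.1,       // clean, residential
  ja3Mismatch: 0.6,        // Node.js TLS on a Chrome UA
  navigatorWebdriver: 0.0, // fixed appropriately
  canvasHash: 0.4,         // known headless hash
  pluginCount: 0.3,        // no plugins
  mouseEntropy: 0.5,       // straight-line movement
};
const BLOCK_THRESHOLD = 1.5;

const score = Object.values(signals).reduce((a, b) => a + b, 0);
console.log(score.toFixed(1), score > BLOCK_THRESHOLD ? 'BLOCK' : 'ALLOW');
// → "1.9 BLOCK"
```

Zeroing one signal only helps if the remaining ones sum below the threshold, which is the whole point of the layered design.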

The sec-ch-ua Issue

Chrome 90 and later sends client hints with every request. This is what a real Chrome session looks like:
```http
sec-ch-ua: "Chromium";v="120", "Google Chrome";v="120", "Not-A.Brand";v="99"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "macOS"
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)...
```
A Puppeteer session that sets a modern user-agent but doesn't provide client hints sends:
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)...
# sec-ch-ua: absent
A Chrome 90+ user-agent without sec-ch-ua doesn't exist in the wild, so it's flagged immediately. And merely sending the header isn't enough: the brand token order and version numbers have to match the full UA string.
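A sketch of that consistency check. The parsing here is simplified (real sec-ch-ua follows the HTTP Structured Fields syntax, and non-Chrome Chromium browsers complicate the branch logic) and the helper name is hypothetical:

```js
// Does the sec-ch-ua brand list agree with the UA string?
function uaMatchesClientHints(userAgent, secChUa) {
  const uaMajor = (userAgent.match(/Chrome\/(\d+)/) || [])[1];
  if (!uaMajor) return !secChUa;   // simplified: non-Chrome UA, no hints expected
  if (!secChUa) return false;      // Chrome 90+ always sends client hints
  // Pull the Chrome/Chromium brand's major version out of the hint.
  const hintMajor = (secChUa.match(/"(?:Google )?Chrom\w+";v="(\d+)"/) || [])[1];
  return hintMajor === uaMajor;
}

const ua = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) ... Chrome/120.0.0.0 Safari/537.36';
console.log(uaMatchesClientHints(ua, null));
// → false — Chrome UA with no hints at all
console.log(uaMatchesClientHints(ua, '"Chromium";v="120", "Google Chrome";v="120", "Not-A.Brand";v="99"'));
// → true
```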

Tools vs. Layers

| Tool | Fixes | Doesn't Fix |
|---|---|---|
| puppeteer-extra-plugin-stealth | JS-layer signals | TLS/HTTP2 fingerprint |
| rebrowser-puppeteer | CDP leaks, runtime injection | TLS, behavioral |
| Go + CycleTLS | JA3 fingerprint | Behavioral, canvas |
| Real Chrome via CDP | TLS, canvas, GPU | Proxy/IP reputation |
A real Chrome browser operating on consumer hardware with a residential IP is the only client that passes all layers by default.
Most guides don't even mention this. They treat detection as a checklist: tick each item and you're done. But detection systems are probability models. They don't need certainty, just enough confidence. A setup that fixes 9 out of 10 signals can still be blocked if the remaining signal is strong enough. Most scoring models weight a JA3 mismatch at 0.5 to 0.7. That one leak can be enough.