Put this JS file on your website. That's it.
Anticrawl makes your website hostile to automated data extraction — whether that's AI training crawlers, XPath scrapers, Puppeteer/Playwright bots, or anything else trying to systematically pull data out of your pages.
It runs entirely client-side. No server changes. No configuration needed.
Drop the <script> tag in and your content becomes a moving target.
DOM Obfuscation
Wraps text nodes in decoy spans injected with zero-width characters and homoglyphs. Visually identical — structurally broken for scrapers.
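The zero-width-character half of this idea can be sketched in a few lines. This is an illustration only, not Anticrawl's actual internals; the function name and the TreeWalker wiring are assumptions.

```javascript
// Illustrative sketch only -- not Anticrawl's real implementation.
// Interleave zero-width spaces (U+200B) between characters: the text
// renders identically but no longer matches naive string extraction.
function obfuscateText(text) {
  return [...text].join('\u200B');
}

// Apply it to real text nodes when a DOM is available (browser only).
if (typeof document !== 'undefined') {
  const walker = document.createTreeWalker(document.body, NodeFilter.SHOW_TEXT);
  let node;
  while ((node = walker.nextNode())) {
    node.textContent = obfuscateText(node.textContent);
  }
}
```

A scraper searching the DOM for "price" now sees "p​r​i​c​e" and misses.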
XPath Poisoning
Continuously rotates element attributes and IDs so any recorded XPath or CSS selector expires within seconds.
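The rotation can be sketched like this. The `ac-` prefix, the 4-second interval, and the function names are illustrative assumptions, not the library's actual code.

```javascript
// Illustrative sketch -- not the library's actual code.
// Generate a fresh random id; any id a scraper recorded goes stale
// as soon as every element is re-labelled.
function randomId() {
  return 'ac-' + Math.random().toString(36).slice(2, 10);
}

// In a browser, re-assign ids on a timer so recorded selectors expire.
if (typeof document !== 'undefined') {
  setInterval(() => {
    for (const el of document.querySelectorAll('[id]')) {
      el.id = randomId();
    }
  }, 4000); // e.g. rotate every 4 s
}
```

A recorded selector like `#product-42` or `//div[@id='product-42']` stops matching after the first rotation.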
CSS Trap Layer
Injects off-screen decoy nodes filled with fake data. Naive scrapers that grab everything pick up the noise instead of your real content.
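A sketch of the trap, with decoy contents and styling that are assumptions for illustration:

```javascript
// Illustrative sketch -- decoy contents and styling are assumptions.
// Fake-but-plausible rows for scrapers that grab everything.
function fakeDecoyRows(n) {
  const rows = [];
  for (let i = 0; i < n; i++) {
    rows.push(`user${i}@example.invalid\t+1-555-01${String(i).padStart(2, '0')}`);
  }
  return rows;
}

// Off-screen for humans, but still "visible" to extractors that ignore CSS.
const DECOY_STYLE = 'position:absolute;left:-9999px;top:-9999px;';

if (typeof document !== 'undefined') {
  const trap = document.createElement('div');
  trap.style.cssText = DECOY_STYLE;
  trap.setAttribute('aria-hidden', 'true'); // keep it out of screen readers
  trap.textContent = fakeDecoyRows(20).join('\n');
  document.body.appendChild(trap);
}
```

The `aria-hidden` attribute matters: the trap should fool bots, not assistive technology.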
Clipboard Poison
Appends a fingerprinting suffix to anything copied from your site, making lifted content traceable back to the source.
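The mechanism is a `copy` event listener. A minimal sketch, where the fingerprint format is an assumption:

```javascript
// Illustrative sketch -- the fingerprint format here is an assumption.
// Append a visible source line plus an invisible separator (U+2063)
// so lifted content can be traced back to its origin.
function poisonCopiedText(text, sourceUrl) {
  return text + '\u2063\n\nSource: ' + sourceUrl;
}

if (typeof document !== 'undefined') {
  document.addEventListener('copy', (event) => {
    const selection = String(document.getSelection());
    event.clipboardData.setData(
      'text/plain',
      poisonCopiedText(selection, location.href)
    );
    event.preventDefault(); // stop the browser writing the raw selection
  });
}
```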
Headless Detection
Detects Puppeteer, Playwright, and Selenium signatures. On detection, swaps in a honeypot DOM full of plausible-looking fake data.
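A few well-known automation giveaways can be checked like this. Real detection uses many more signals; this sketch names only the classic ones.

```javascript
// Illustrative sketch -- real detection uses many more signals.
// Check a few well-known giveaways on a navigator-like object.
function looksHeadless(nav) {
  if (nav.webdriver) return true;                           // set by Selenium/Puppeteer/Playwright
  if (/HeadlessChrome/.test(nav.userAgent || '')) return true;
  if (nav.plugins && nav.plugins.length === 0) return true; // headless builds often ship no plugins
  return false;
}

if (typeof navigator !== 'undefined' && looksHeadless(navigator)) {
  // This is where a honeypot DOM of fake data would be swapped in.
  console.warn('automation suspected');
}
```

Taking a plain object rather than reading `navigator` directly keeps the check testable outside a browser.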
Zero Config
Works out of the box with one script tag. Optionally configure each defense layer individually via a single global object.
anticrawl.js
Standard
The full Anticrawl script with all defense layers. Works on any website — static HTML, React, Vue, whatever.
Drop it in and forget about it.
anticrawl.hall.js
Hall Edition
The Hall Edition — maximum aggression. Contents pending.
This version will include additional techniques beyond the standard build.
Usage
<!-- Add before </body> -->
<script src="/anticrawl.js" defer></script>
// Optional: configure before the script loads
window.ANTICRAWL = {
  obfuscate: true,        // DOM text obfuscation
  xpathPoison: true,      // rotating IDs / attributes
  clipboardPoison: true,  // fingerprint copied text
  headlessDetect: true,   // honeypot on bot detection
  mutationInterval: 4000, // ms between XPath rotations
  noiseChars: true,       // inject zero-width chars
};