Traffic Filtering

Our service inspects every click arriving at your landing page or affiliate offer and classifies it as either innocent or malicious. Innocent traffic is let through to the actual content, while malicious visitors are shown a different page that contains no sensitive content that could be compromised on exposure.
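As a rough illustration, the classify-and-route flow described above can be sketched in a few lines of Python. The function names and page paths here are hypothetical placeholders, not our actual API, and the toy classifier stands in for the real inspection pipeline:

```python
SAFE_PAGE = "/safe.html"     # harmless page shown to unwanted visitors
MONEY_PAGE = "/offer.html"   # the actual landing page / affiliate offer

def classify_click(visitor: dict) -> str:
    """Toy classifier: flag well-known bot user agents as malicious."""
    bot_markers = ("bot", "crawler", "spider")
    ua = visitor.get("user_agent", "").lower()
    return "malicious" if any(m in ua for m in bot_markers) else "innocent"

def route(visitor: dict) -> str:
    """Send innocent traffic to the offer, everyone else to the safe page."""
    return MONEY_PAGE if classify_click(visitor) == "innocent" else SAFE_PAGE
```

A real deployment would make this decision server-side on every request, before any sensitive content is rendered.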

We provide solid protection against a wide range of unwanted traffic: click fraud, ad network moderators, web scrapers, antivirus bots, and more.


To detect hostile traffic, we need data about each visitor. Most solutions on the market rely primarily on simplistic blacklists of IP addresses, HTTP headers, and other superficial features.

Our approach is smarter than that: we collect thousands of in-depth facts about each visitor across the network, HTTP, and JavaScript contexts, compiling what are known as browser fingerprints. These fingerprints are evaluated by dozens of high-precision scanners, resulting in a confident verdict.
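Conceptually, a fingerprint can be modeled as a flat dictionary of collected facts, with each scanner voting independently on it. The sketch below is illustrative only; the scanner names, checks, and threshold are assumptions for the example, not our real detection logic:

```python
def scan_headers(fp: dict) -> bool:
    # Real browsers send an Accept-Language header; many bots do not.
    return fp.get("accept_language") is None

def scan_webdriver(fp: dict) -> bool:
    # navigator.webdriver is true in automation-driven browsers.
    return fp.get("navigator_webdriver") is True

def scan_timezone(fp: dict) -> bool:
    # IP geolocation and the JavaScript time zone should agree.
    return fp.get("ip_timezone") != fp.get("js_timezone")

SCANNERS = [scan_headers, scan_webdriver, scan_timezone]

def verdict(fp: dict, threshold: int = 2) -> str:
    """Declare the visitor malicious if enough scanners fire."""
    hits = sum(1 for scan in SCANNERS if scan(fp))
    return "malicious" if hits >= threshold else "innocent"
```

In practice the scanners number in the dozens and inspect far subtler signals than the three shown here.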

VLA™ Machine Learning

Even the most precise checks are limited in scope. Emerging threats cannot be reliably detected with heuristics tailored to previous generations. Even the smartest analyst may overlook a hidden pattern in fingerprints. But not the slightest deviation will escape the scrutiny of a machine programmed to seek out fingerprint anomalies.

VLA™ is our state-of-the-art machine learning technology that does what our competitors cannot: detect and automatically adapt to new, previously unknown threats as the arms race in affiliate marketing goes on. The system becomes smarter and more comprehensive with every click we inspect.
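The core idea of anomaly-driven detection can be illustrated with a deliberately simple model: learn the typical distribution of a numeric fingerprint feature from observed clicks, then flag values that deviate too far from it. This is a toy z-score detector for illustration, not the VLA system:

```python
from statistics import mean, stdev

class AnomalyDetector:
    def __init__(self, threshold: float = 3.0):
        self.threshold = threshold       # how many standard deviations is "abnormal"
        self.samples: list[float] = []

    def learn(self, value: float) -> None:
        """Accumulate history: the model improves with every observed click."""
        self.samples.append(value)

    def is_anomaly(self, value: float) -> bool:
        if len(self.samples) < 2:
            return False                 # not enough history to judge
        mu, sigma = mean(self.samples), stdev(self.samples)
        if sigma == 0:
            return value != mu           # history is constant; any deviation is odd
        return abs(value - mu) / sigma > self.threshold
```

Production systems replace the single feature and z-score with high-dimensional models over the full fingerprint, but the principle is the same: deviation from learned normality is suspicious.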


Besides traffic filtering, we also collect vast amounts of statistical data, some of which is exposed as part of our built-in ad tracker. This data alone contains a lot of insights about recurring patterns in traffic that can help us identify malicious visitors. But, as with any big data analysis, finding patterns in billions of clicks in real time is a challenging task.

Thankfully, computer science has solutions. HyperLogLog is an advanced algorithm for estimating the cardinality of large sets. It powers the eponymous state-of-the-art filter that we invented to perform pattern-based filtering in real time against our entire operation history.
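To show why HyperLogLog makes counting billions of clicks tractable, here is a compact sketch of the classic algorithm with the standard small-range correction. The register count and hash choice are illustrative assumptions, not details of our implementation:

```python
import hashlib
import math

class HyperLogLog:
    def __init__(self, p: int = 10):
        self.p = p
        self.m = 1 << p                   # number of registers (1024 here)
        self.registers = [0] * self.m
        # bias-correction constant, valid for m >= 128
        self.alpha = 0.7213 / (1 + 1.079 / self.m)

    def add(self, item: str) -> None:
        # 64-bit hash of the item
        h = int.from_bytes(hashlib.sha1(item.encode()).digest()[:8], "big")
        idx = h >> (64 - self.p)                  # first p bits pick a register
        rest = h & ((1 << (64 - self.p)) - 1)     # remaining bits
        # rank = position of the leftmost 1-bit in the remaining bits
        rank = (64 - self.p) - rest.bit_length() + 1
        self.registers[idx] = max(self.registers[idx], rank)

    def estimate(self) -> float:
        raw = self.alpha * self.m ** 2 / sum(2.0 ** -r for r in self.registers)
        zeros = self.registers.count(0)
        if raw <= 2.5 * self.m and zeros:
            return self.m * math.log(self.m / zeros)   # small-range correction
        return raw
```

The appeal is the space/accuracy trade-off: a few kilobytes of registers estimate the number of distinct items in an arbitrarily large stream with a typical error around 1.04/&#8730;m, and duplicates never inflate the count.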

More Exhaustive
Feature List

  • 1.6+ billion IPv4 addresses in main blocklist
  • 2.3+ billion IPv4 addresses in paranoid blocklist
  • Full IPv6 support: 19,000+ networks in IPv6 blocklist
  • Full CDN support, including Cloudflare as proxy
  • Cloaking of ad network bots and moderators
  • Cloaking of manual campaign reviews
  • Cloaking of antivirus bots, including Google Safe Browsing
  • Detection of click fraud (fake clicks)
  • JavaScript fingerprinting
  • Multilayer (TCP/IP and SSL/TLS) fingerprinting
  • VPN, residential and mobile proxy detection
  • Manual filters by country, OS, browser, and time zone
  • Manual filters by user agent and referrer regular expressions
  • Manual IP/ASN blacklists and whitelists
  • Automatic IP blacklist population in review mode
  • Automatic A/B testing of up to 254 money pages
  • Automatic timer-based rotation of up to 254 money pages
  • Flexible rules for URL parameter filtering and manipulation
  • Full-featured ad tracker with all major marketing metrics
  • Detailed per-click reporting
  • Detailed aggregate reporting with flexible funnel builder
  • REST API for traffic streams management

Ready to try Adspect
or want to learn more?