204 points pabs3 | 2 comments
1. rendx No.44084753
One option that I don't see discussed in the blog post: collecting user signals locally and using those access patterns (mouse movement, clicks, IP/site browsing history) to discriminate between "standard" site usage and bots; a sort of "reCAPTCHA lite", not trained across many sites but trained specifically on the target site.

For a ticket platform like pretix that can be run self-hosted alongside the main site, this should give you enough signals to distinguish normal users from bots, unless they are specifically targeting that site -- or am I mistaken? Even pure web server access logs may be sufficient on smaller sites, so this might work even without JS?
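A minimal sketch of what the no-JS variant might look like: scoring IPs from web server access logs alone, using per-IP request rate and path diversity. The log format, thresholds, and weights here are all assumptions for illustration, not anything pretix actually ships.

```python
# Hypothetical sketch: score bot-likelihood per IP from simplified
# access-log lines of the form "ip iso-timestamp path".
# All thresholds are illustrative assumptions, not tuned values.
from collections import defaultdict
from datetime import datetime

def parse_line(line):
    """Split a simplified 'ip timestamp path' log line."""
    ip, ts, path = line.split(" ", 2)
    return ip, datetime.fromisoformat(ts), path

def score_sessions(log_lines, max_rate=2.0, min_requests=5):
    """Return a bot-likelihood score (0.0-1.0) per IP.

    Heuristics: very high request rate, and low path diversity
    (bots tend to hammer the same checkout URL repeatedly).
    """
    hits = defaultdict(list)
    for line in log_lines:
        ip, ts, path = parse_line(line)
        hits[ip].append((ts, path))

    scores = {}
    for ip, events in hits.items():
        if len(events) < min_requests:
            scores[ip] = 0.0  # too little data to judge
            continue
        events.sort()
        span = (events[-1][0] - events[0][0]).total_seconds() or 1.0
        rate = len(events) / span               # requests per second
        diversity = len({p for _, p in events}) / len(events)
        score = 0.0
        if rate > max_rate:
            score += 0.5
        if diversity < 0.2:
            score += 0.5
        scores[ip] = score
    return scores
```

Because it only needs server-side logs, a heuristic like this degrades gracefully for users without JS, though it inherits the false-positive problem discussed below in the thread.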

replies(1): >>44084858 #
2. jsnell No.44084858
This seems pretty well covered by the post?

Doing any kind of access pattern analysis leaves you with the problem of handling false positives, and your proposal doesn't help with the accessibility problems.
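The false-positive problem is a base-rate effect, which a quick Bayes calculation makes concrete. The numbers below (1% false-positive rate, 99% detection rate, bots at 5% of traffic) are assumptions for illustration, not figures from the post:

```python
# Illustrative base-rate arithmetic: even an accurate classifier
# mostly flags real humans when bots are a small share of traffic.
# All rates below are assumed values, not measured ones.
def p_human_given_flagged(fpr, tpr, bot_share):
    """Bayes' rule: probability a flagged session is actually human."""
    human_share = 1.0 - bot_share
    flagged = fpr * human_share + tpr * bot_share
    return (fpr * human_share) / flagged

# ~16% of everything this hypothetical classifier blocks is a real customer.
share = p_human_given_flagged(fpr=0.01, tpr=0.99, bot_share=0.05)
```

That residue of blocked real customers is exactly the "handling false positives" cost: someone has to build an appeal path or accept lost sales.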

IP addresses aren't a panacea here -- this is a high-margin business where the attackers can switch to high-cost, high-quality proxies.

> unless they are specifically targeting that site

In this case the attackers would very much be targeting specific sites (ones selling tickets to events with more demand than supply).