
597 points classichasclass | 2 comments
yumraj ◴[] No.45011231[source]
Wouldn't it be better, if there's an easy way, to just feed such bots shit data instead of blocking them? I know it's easier to block and saves compute and bandwidth, but perhaps feeding them shit data at scale would be a much better longer-term solution.
replies(3): >>45011266 #>>45011290 #>>45011397 #
1. sotspecatcle ◴[] No.45011290[source]

    # Inside the server block; the path and file names below are examples.
    if ($http_user_agent ~* "BadBot") {
        rewrite ^ /junk.bin last;   # send matching bots to the trap location
    }

    # nginx's static handler only serves regular files, so /dev/zero won't work;
    # point it at a large pre-generated junk file instead (e.g. dd from /dev/urandom).
    location = /junk.bin {
        internal;                              # reachable only via the rewrite above
        limit_rate 1k;                         # trickle it out to tie up the crawler
        default_type application/octet-stream;
        root /var/www/trap;                    # serves /var/www/trap/junk.bin
    }
replies(1): >>45025352 #
2. Avamander ◴[] No.45025352[source]
I recommend you use gzip_static and serve a zip-bomb instead. Frees up the connection sooner and probably causes bad crawlers to exhaust their resources.
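A minimal sketch of that approach, assuming ngx_http_gzip_static_module is compiled in (--with-http_gzip_static_module); the /bomb path, /var/www/trap directory, sizes, and the "BadBot" pattern are illustrative, not from the comment above. The idea is to pre-compress a highly redundant file so it is a few megabytes on disk but inflates to gigabytes, and let nginx send the .gz as-is with Content-Encoding: gzip.

    # Build the payload once: 10 GiB of zeros compresses to roughly 10 MB (sizes illustrative):
    #   dd if=/dev/zero bs=1M count=10240 | gzip -9 > /var/www/trap/bomb.gz

    # In the main location, reroute matching bots, as above:
    #   if ($http_user_agent ~* "BadBot") { rewrite ^ /bomb last; }

    location = /bomb {
        internal;                 # only reachable via the rewrite
        root /var/www/trap;       # gzip_static looks for /var/www/trap/bomb.gz
        gzip_static always;       # send the precompressed file even without Accept-Encoding: gzip
        default_type text/html;   # looks like an ordinary page to the crawler
    }

nginx never inflates the file itself, so the transfer finishes quickly and the decompression cost lands on the crawler.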