
646 points by blendergeek | 1 comment
taikahessu ◴[] No.42726337[source]
We had our non-profit website's bandwidth drained and the site temporarily shut down (!!) by our hosting provider because of Amazon's bot aggressively crawling URLs like ?page=21454 ... etc.

Thankfully, SiteGround restored our site without any repercussions, as it was not our fault. We added Amazon's bot to robots.txt after that one.
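(For anyone in the same spot: Amazon documents its crawler's user-agent token as "Amazonbot", so a blanket block looks roughly like the fragment below. This assumes the bot actually honors robots.txt, which well-behaved crawlers do but aggressive scrapers may not.)

```
User-agent: Amazonbot
Disallow: /
```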

Don't like how things are right now. Is a tarpit the solution? Or better laws? Would they stop the Chinese bots? Should they even? I don't know.
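(The tarpit idea, for the curious: serve an endless maze of slow, auto-generated pages so a misbehaving crawler burns its own time instead of your bandwidth. A minimal sketch in Python, assuming a standalone toy server — the path scheme, link counts, and delay are all made up for illustration.)

```python
import random
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

def fake_page(seed: str, n_links: int = 5) -> str:
    """Deterministically generate a page of links to more fake pages."""
    rng = random.Random(seed)  # same path -> same page, so it looks "real"
    links = "".join(
        f'<a href="/maze/{rng.getrandbits(32):08x}">more</a>\n'
        for _ in range(n_links)
    )
    return f"<html><body>\n{links}</body></html>\n"

class TarpitHandler(BaseHTTPRequestHandler):
    DRIP_DELAY = 1.0  # seconds between chunks; tune to taste

    def do_GET(self):
        body = fake_page(self.path).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        # Drip the response out slowly to tie up the crawler's connection.
        for i in range(0, len(body), 16):
            self.wfile.write(body[i:i + 16])
            self.wfile.flush()
            time.sleep(self.DRIP_DELAY)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TarpitHandler).serve_forever()
```

Whether that's wise is another question — you still pay for the open connections, and it does nothing against bots that rate-limit themselves per connection.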

replies(4): >>42726365 #>>42735381 #>>42740706 #>>42743952 #
1. mrweasel ◴[] No.42735381[source]
> We had our non-profit website drained out of bandwidth

There are a number of sites having issues with scrapers (AI and others) generating so much traffic that transit providers are informing them their fees will go up at the next contract renewal if the traffic is not reduced. It's just very hard for individual sites to do much about it, as most of the traffic comes from AWS, GCP, or Azure IP ranges.
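(That at least makes the traffic identifiable: AWS publishes its prefixes at https://ip-ranges.amazonaws.com/ip-ranges.json, and GCP and Azure publish similar lists. A sketch of matching a client IP against such ranges — the CIDRs below are documentation placeholders, not real provider prefixes:)

```python
import ipaddress

# Hypothetical stand-ins; in practice you'd load the provider's
# published prefix list and refresh it periodically.
CLOUD_RANGES = [
    ipaddress.ip_network(c)
    for c in ("203.0.113.0/24", "198.51.100.0/24")
]

def is_cloud_ip(ip: str) -> bool:
    """True if ip falls inside any known cloud-provider range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in CLOUD_RANGES)
```

Blocking those ranges wholesale is a blunt instrument, though — plenty of legitimate services also run out of the same clouds.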

It is a problem and the AI companies do not care.