
646 points | blendergeek | 5 comments
1. kerkeslager (No.42727510)
Question: do these bots not respect robots.txt?

I haven't added these scrapers to my robots.txt on the sites I work on yet because I haven't seen any problems. I would run something like this on my own websites, but I can't see selling my clients on running this on their websites.

The websites I run generally have a honeypot page which is linked in the headers and disallowed to everyone in the robots.txt, and if an IP visits that page, they get added to a blocklist which simply drops their connections without response for 24 hours.

replies(3): >>42727689, >>42727693, >>42727959
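
For illustration, a minimal sketch of the honeypot-plus-blocklist scheme described above, assuming a small Flask app. The route name, the in-memory blocklist, and the 403 response are placeholders; a real deployment would drop blocked connections at the firewall or reverse proxy rather than in the application, as the comment describes.

    # Minimal sketch (assumptions: Flask app, in-memory blocklist).
    # robots.txt would disallow the trap for every crawler, e.g.:
    #   User-agent: *
    #   Disallow: /honeypot-trap
    import time

    from flask import Flask, abort, request

    app = Flask(__name__)

    BLOCK_SECONDS = 24 * 60 * 60       # 24-hour block, as described above
    blocklist: dict[str, float] = {}   # client IP -> time the block expires

    @app.before_request
    def reject_blocked_ips():
        # A true "drop without response" would normally be done at the
        # firewall or reverse proxy (e.g. by exporting this list to ipset);
        # returning 403 here is a stand-in for that behaviour.
        expires = blocklist.get(request.remote_addr, 0.0)
        if expires > time.time():
            abort(403)

    @app.route("/honeypot-trap")
    def honeypot():
        # Any client that ignores robots.txt and follows the hidden link
        # gets added to the blocklist for 24 hours.
        blocklist[request.remote_addr] = time.time() + BLOCK_SECONDS
        return "", 204

    @app.route("/")
    def index():
        # The trap is only advertised via a link in the page head that
        # compliant crawlers never follow, because robots.txt disallows it.
        return '<html><head><link rel="help" href="/honeypot-trap"></head><body>hello</body></html>'

The trap fires purely on behaviour (requesting a URL that robots.txt forbids), so it needs no user-agent matching.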
2. throw_m239339 (No.42727689)
> Question: do these bots not respect robots.txt?

No, they don't, because in most countries there is no real legal liability for ignoring that file.

3. jonatron (No.42727693)
You haven't seen any problems because you created a solution to the problem!
4. 0xf00ff00f (No.42727959)
> The websites I run generally have a honeypot page which is linked in the headers and disallowed to everyone in the robots.txt, and if an IP visits that page, they get added to a blocklist which simply drops their connections without response for 24 hours.

I love this idea!

replies(1): >>42732436
5. griomnib (No.42732436)
Yeah, this is elegant as fuck.