
646 points by blendergeek | 1 comment
Havoc ◴[] No.42736207[source]
What blows my mind is that this is functionally a solved problem.

The big search crawlers have been around for years & manage to mostly avoid nuking sites into oblivion. Then the AI gang shows up (supposedly the smartest guys around) and suddenly we're reinventing the wheel on crawling and causing carnage in the process.

replies(2): >>42736252 #>>42737200 #
jeroenhd ◴[] No.42736252[source]
Search crawlers have the goal of directing people towards the websites they crawl. They have a symbiotic relationship with the sites they index, so they put in (some) effort not to blow websites out of the water with their crawling, because a website that's offline is useless for your search index.

AI crawlers don't care about directing people towards websites. They intend to replace websites, and are only interested in copying whatever information is on them. They are greedy crawlers that would only benefit from knocking a website offline after they're done, because then the competition can't crawl the same website.

The goals are different, so the crawlers behave differently, and websites need to deal with them differently. In my opinion the best approach is to use robots.txt to ban any crawler that isn't directly attached to a search engine, and to use offensive techniques against crawlers that ignore those rules. Anything from randomly generated text to straight-up ZIP bombs is fair game when it comes to malicious crawlers.
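A minimal robots.txt along those lines (an allowlist sketch, not a complete one; Googlebot and Bingbot stand in for whichever search engines you actually want to keep):

    User-agent: Googlebot
    Disallow:

    User-agent: Bingbot
    Disallow:

    # everyone else, including AI crawlers, gets nothing
    User-agent: *
    Disallow: /

Of course that only works on crawlers that bother to read robots.txt. For the ones that don't, here is one rough take on the ZIP-bomb idea, served as gzip Content-Encoding using nothing but the Python standard library (the user-agent blocklist is hypothetical and the payload is deliberately modest; scale at your own risk):

    import gzip
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BAD_AGENTS = ("GPTBot", "CCBot", "Bytespider")   # hypothetical blocklist
    # 10 MB of zeros compresses to roughly 10 KB on the wire,
    # but the crawler has to expand the whole thing to parse it.
    PAYLOAD = gzip.compress(b"\0" * 10_000_000)

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            ua = self.headers.get("User-Agent", "")
            if any(bot in ua for bot in BAD_AGENTS):
                # misbehaving crawler: send the compressed payload
                self.send_response(200)
                self.send_header("Content-Type", "text/html")
                self.send_header("Content-Encoding", "gzip")
                self.send_header("Content-Length", str(len(PAYLOAD)))
                self.end_headers()
                self.wfile.write(PAYLOAD)
            else:
                # everyone else gets the real (here, placeholder) content
                body = b"normal page"
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8080), Handler).serve_forever()

The same handler could just as easily return a page of randomly generated text instead of the compressed payload; the point is that misbehaving crawlers get something worthless (or actively expensive) while everyone else gets the real content.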

replies(2): >>42737269 #>>42742991 #
dmix ◴[] No.42742991[source]
FWIW when I research stuff through ChatGPT I click on the source links all the time. It usually only summarizes stuff. For example, if you're shopping for a certain product it won't bring you to the store page where all the reviews are. It will just quickly make a top-ten-list type thing.