Search crawlers have the goal of directing people towards the websites they crawl. The relationship is symbiotic, so they put in (some) effort not to blow websites out of the water with their crawling, because a website that's offline is useless to a search index.
AI crawlers don't care about directing people towards websites. They intend to replace websites, and are only interested in copying whatever information is on them. They are greedy crawlers that would only benefit from knocking a website offline after they're done, because then the competition can't crawl the same website.
The goals are different, so the crawlers behave differently, and websites need to deal with them differently. In my opinion the best approach is to use robots.txt to ban any crawler that isn't directly attached to a search engine, and to use offensive techniques to take out crawlers that ignore your preferences. Anything from randomly generated text to straight-up ZIP bombs is fair game when it comes to malicious crawlers.
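As a minimal sketch of the robots.txt approach, the fragment below blocks a few commonly published AI crawler user agents while leaving search crawlers alone. The specific user-agent strings listed are examples of self-identified AI crawlers (GPTBot is OpenAI's, CCBot is Common Crawl's, ClaudeBot is Anthropic's); any real deployment should be checked against the current lists the operators publish, since names change over time:

```
# Block self-identified AI crawlers
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

# Everyone else (including search crawlers) may crawl
User-agent: *
Allow: /
```

Note that robots.txt is purely advisory: it only stops crawlers that choose to honor it, which is exactly why the misbehaving remainder calls for the offensive techniques mentioned above.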