I'm not sure why they don't just cache the websites and avoid re-fetching for at least 24 hours, especially for most sites. I swear it's like we're re-learning software engineering basics with LLMs / AI and it kills me.
It's because they don't give a shit whether the product works properly or not. And by blocking AI scraping, sites are pushing AI companies to scrape faster before they get blocked. Faster means sloppier.