If AI companies want to sue webmasters over that, then by all means, let them waste their money and get laughed out of court.
> You can choose to gatekeep your content, and by doing so, make it unscrapeable, and legally protected.
so... robots.txt, which the AI parasites ignore?
> Also, consider that relatively small, cheap llms are able to parse the difference between meaningful content and Markovian jabber such as this software produces.
okay, so by your own admission it's not damaging, and there you've refuted your entire argument
We know for a fact that AI companies don't respect that; if they want data that's behind a paywall, they'll jump through hoops to take it anyway.
https://www.theguardian.com/technology/2025/jan/10/mark-zuck...
If they don't have to abide by "norms", then we don't have to abide by them for their sake either. Fuck 'em.