I wonder whether the motivation is really legal. I get the sense that some projects are just sick of reviewing crap AI submissions.
replies(6):
If the problem is too many submissions, that would suggest there needs to be structures in place to manage that.
Perhaps projects receiving large quantities of updates need triage teams. I suspect most of the submissions are done in good faith.
I can see some people choosing to avoid AI due to the possibility of legal issues. I'm doubtful of the likelihood of such problems, but some people favour eliminating all possibility over minimizing likelihood. The philosopher in me feels like people who think they have eliminated the possibility of something just haven't thought about it enough.
With AI you're going to get job hunters automating PRs for big-name projects so they can stick the contributions on their resume.