489 points by todsacerdoti | 1 comment
wyldfire ◴[] No.44382903
I understand where this comes from, but I think it's a mistake. I agree it would be nice if there were "well settled law" regarding AI and copyright, but there are relatively few rulings and next to zero legislation on which to base a position.

In addition to a policy rejecting AI contributions, I think it may make sense to point out places where AI-generated content can be used. For example, how much of the QEMU project's (copious) CI setup is really content that is critical to protect? What about ever-more interesting test cases or environments that could be enabled? Something like: "contribute those things here instead, and make judicious use of AI there, with these kinds of guard rails..."

replies(5): >>44382957 #>>44382958 #>>44383166 #>>44383312 #>>44383370 #
1. pavon ◴[] No.44383166
This isn't like some other legal questions that go decades before being answered in court. There are dozens of cases working through the courts today that will shed light on aspects of the copyright questions within a few years. QEMU has made great progress over the last 22 years without the aid of AI; waiting a few more years isn't going to hurt it.