
490 points | todsacerdoti | 1 comment | source
Havoc ◴[] No.44382839[source]
I wonder whether the motivation is really legal? I get the sense that some projects are just sick of reviewing crap AI submissions
replies(6): >>44382854 #>>44382954 #>>44383005 #>>44383017 #>>44383164 #>>44383177 #
Lerc ◴[] No.44383005[source]
I'm not sure which way AI would move the dial when it comes to the median submission. Humans can, and do, write some crap code.

If the problem is too many submissions, that would suggest there needs to be structures in place to manage that.

Perhaps projects receiving large quantities of submissions need triage teams. I suspect most of the submissions are done in good faith.

I can see some people choosing to avoid AI due to the possibility of legal issues. I'm doubtful of the likelihood of such problems, but some people favour eliminating all possibility over minimizing likelihood. The philosopher in me feels like people who think they have eliminated the possibility of something just haven't thought about it enough.

replies(2): >>44383115 #>>44383122 #
1. ehnto ◴[] No.44383122[source]
Barrier to entry and automated submissions are the two aspects I see changing with AI. Previously, you at least had to be able to code before submitting bad code.

With AI, you're going to get job hunters automating PRs for big-name projects so they can list the contributions on their resume.