
283 points by summarity | 1 comment | source
ryandrake ◴[] No.44369008[source]
Receiving hundreds of AI-generated bug reports would be so demoralizing and would probably turn me off from maintaining an open source project forever. I think developers are eventually going to need tools to filter out slop. If you didn’t take the time to write it, why should I take the time to read it?
replies(7): >>44369097 #>>44369153 #>>44369155 #>>44369386 #>>44369772 #>>44369954 #>>44370907 #
teeray ◴[] No.44369155[source]
You see, the dream is another AI that reads the report and writes the issue in the bug tracker. Then another AI implements the fix. A third AI then reviews the code and approves and merges it. All without human interaction! Once CI releases the fix, the first AI can then find the same vulnerability plus a few new and exciting ones.
replies(1): >>44369220 #
dingnuts ◴[] No.44369220[source]
This is completely absurd. If generating code is reliable, you can have one generator make the change, and then merge and release it with traditional software.

If it's not reliable, how can you trust the written issue, or the review, to be correct? And how does that benefit you over just blindly merging whatever changes the model produces?

replies(2): >>44369277 #>>44369684 #
tempodox ◴[] No.44369277[source]
Making sense is not required as long as “AI” vendors sell subscriptions.