
625 points by lukebennett | 1 comment
devit No.42141624
It seems obvious to me that Common Crawl plus GitHub public repositories contain more than enough data to train an AI that is as good as any programmer (at tasks not requiring knowledge of non-public codebases or non-public domain knowledge).

So the problem is more in the algorithm.

darknoon No.42141675
I think just reading the code wouldn't make you a good programmer; you'd also need to "read" the anti-code, i.e. what doesn't work, by trial and error. Models' overconfidence that their code will work often leads them to fail in practice.
krisroadruck No.42141971
AlphaGo got better by playing against itself. I wonder if the pathway forward here is to essentially do the same with coding. Feed it some arbitrary SRS documents and have it attempt to implement them, including full code-coverage testing. Have it also take on the roles of QA, stakeholders, red-team security researchers, and users, all aggressively trying to find edge cases and point out everything wrong with the application. Have it keep iterating and learn from the findings. Keep feeding it new, novel SRSs until the number of attempts/iterations needed to get a quality product out the other side drops to some acceptable level.
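The loop described above can be sketched in a few lines. This is a toy illustration, not a real training setup: `propose_solution` is stubbed with a fixed sequence of attempts standing in for a code-generating model, and `run_tests` plays the adversarial QA/red-team role. The failure history is the "anti-code" signal the parent comment mentions.

```python
# Toy sketch of the iterate-until-tests-pass self-play loop (all names hypothetical).
from typing import Callable, List, Tuple, Optional

def run_tests(candidate: Callable[[int], int]) -> List[str]:
    """Adversarial test suite: return a list of failure descriptions."""
    failures = []
    if candidate(0) != 0:
        failures.append("f(0) should be 0")
    if candidate(5) != 25:
        failures.append("f(5) should be 25")
    if candidate(-3) != 9:
        failures.append("f(-3) should be 9")  # edge case a red team might add
    return failures

# Stub "model": each attempt is a revision informed by prior failures.
attempts = [
    lambda x: x * 2,                  # first guess: wrong
    lambda x: x * x if x > 0 else 0,  # fixed positives, still misses negatives
    lambda x: x * x,                  # handles the negative edge case too
]

def self_play_iterate(attempts, max_iters: int = 10) -> Tuple[Optional[int], list]:
    history = []  # (attempt index, failures) -- the recorded "anti-code"
    for i, candidate in enumerate(attempts[:max_iters]):
        failures = run_tests(candidate)
        history.append((i, failures))
        if not failures:
            return i + 1, history  # attempts needed before all tests pass
    return None, history  # never converged within the budget

iters, history = self_play_iterate(attempts)
print(iters)           # 3 -- attempts needed to pass the full suite
print(history[0][1])   # failures logged for the first attempt
```

The convergence metric here (`iters`) is exactly the "number of attempts/iterations" the comment proposes tracking across novel SRSs; in a real system the test suite itself would also be generated and hardened adversarially rather than fixed.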