
108 points by bertman | 1 comment
falcor84 ◴[] No.43821300[source]
> First, you cannot obtain the "theory" of a large program without actually working with that program...

> Second, you cannot effectively work on a large program without a working "theory" of that program...

I find the whole argument, and particularly the above, to be a senseless rejection of bootstrapping. Obviously there was a point in time (for any program, any individual programmer, and humanity as a whole) when we had neither a "theory" nor the work done, yet now we have both, so a program and its theory can appear "de novo".

So with that in mind, how can we reject the possibility that as an AI Agent (e.g. Aider) works on a program over time, it bootstraps a theory?

replies(4): >>43821340 #>>43821987 #>>43822329 #>>43822492 #
mrkeen ◴[] No.43821987[source]
> So with that in mind, how can we reject the possibility that as an AI Agent (e.g. Aider) works on a program over time, it bootstraps a theory?

That's the appropriate level of faith for today's LLMs. They're not good enough to replace programmers. They're good enough that we can't reject the possibility of them one day being good enough to replace programmers.

replies(2): >>43822172 #>>43822413 #
1. 2mlWQbCK ◴[] No.43822413[source]
And "good enough" does not mean "as good as". Companies happily outsource programming jobs to worse, but much cheaper, programmers all the time.