
579 points paulpauper | 1 comment
photochemsyn ◴[] No.43603973[source]
Will LLMs end up like compilers? Compilers are also fundamentally important to modern industrial civilization - but they're not profit centers; they're mostly free and open-source outside a few niche areas. Knowing how to use a compiler effectively to write secure and performant software is still a valuable skill, and LLMs are a valuable tool that can help with that process, especially if the programmer is on the steep end of the learning curve. But it doesn't look like anything short of real AGI can do novel software creation without a human constantly in the loop. The same argument applies to new fundamental research, even to reviewing and analyzing new discoveries that aren't in the training corpus.

Wasn't it the case back in the 1980s that you had to pay thousands of dollars for a good compiler? The entire LLM industry might just be following in the compiler's footsteps.

replies(1): >>43604300 #
lukev ◴[] No.43604300[source]
This seems like a probable end state, but we're going to have to stop calling LLMs "artificial intelligence" in order to get there.
replies(2): >>43604504 #>>43604539 #
1. bcoates ◴[] No.43604504[source]
Yep. I'm looking forward to LLMs/deepnets being considered a standard AI technique with uses and limitations, rather than "we asked the God we're building to draw us a picture of a gun, and then it did, and we got scared."