
70 points | alexmolas | 1 comment
lukev No.43644995
This is the way LLM-enhanced coding should (and I believe will) go.

Treating the LLM like a compiler is a much more scalable, extensible and composable mental model than treating it like a junior dev.

simonw No.43645013
smartfunc doesn't really treat the LLM as a compiler: it isn't generating Python code to fill out the function body. Instead, it converts the function into one that calls the LLM on every invocation, passing the docstring as the prompt.
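A minimal sketch of that pattern, assuming a hypothetical `llm_call` helper in place of a real LLM client (names and behavior here are illustrative, not smartfunc's actual API):

```python
import functools

def llm_call(prompt: str) -> str:
    # Stand-in for a real LLM API call, so the control flow is visible.
    return f"LLM response to: {prompt}"

def llm_func(fn):
    """Hypothetical decorator: each call sends the docstring (with the
    arguments interpolated) to the LLM and returns the raw response."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        prompt = fn.__doc__.format(*args, **kwargs)
        return llm_call(prompt)  # hits the LLM on EVERY call
    return wrapper

@llm_func
def summarize(text):
    """Summarize the following text: {0}"""

result = summarize("hello world")
```

The key point is that the function body is never executed or generated; the decorator replaces it wholesale with an LLM round-trip.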

A version that DID work like a compiler would be super interesting - it could replace the function body with generated Python code on your first call and then reuse that in the future, maybe even caching state on disk rather than in-memory.
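The compiler-style variant described above could be sketched like this. Everything here is hypothetical: `codegen_llm` stands in for an LLM asked to emit Python source, and the on-disk cache layout is invented for illustration:

```python
import functools
import hashlib
import pathlib

CACHE_DIR = pathlib.Path("./llm_codegen_cache")  # hypothetical cache location

def codegen_llm(prompt: str) -> str:
    # Stand-in for an LLM that returns Python SOURCE implementing the
    # behavior described in the prompt (here: a canned doubling function).
    return "def impl(x):\n    return x * 2\n"

def compiled(fn):
    """Hypothetical 'compiler' decorator: on first call, generate code from
    the docstring, cache the source on disk, and reuse it thereafter."""
    state = {}

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        if "impl" not in state:
            CACHE_DIR.mkdir(exist_ok=True)
            key = hashlib.sha256(fn.__doc__.encode()).hexdigest()[:16]
            path = CACHE_DIR / f"{fn.__name__}_{key}.py"
            if not path.exists():
                # One LLM call ever; subsequent runs read the cached source.
                path.write_text(codegen_llm(fn.__doc__))
            ns = {}
            exec(path.read_text(), ns)
            state["impl"] = ns["impl"]
        return state["impl"](*args, **kwargs)

    return wrapper

@compiled
def double(x):
    """Return twice the input number."""

result = double(21)  # 42; later calls run the cached generated code directly
```

After the first call, invocations are plain Python with no LLM in the loop, which is what makes the compiler framing interesting.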

toxik No.43645658
Isn’t that basically just Copilot but way more cumbersome to use?