
222 points futurisold | 2 comments
thallukrish ◴[] No.44402063[source]
Since code is generated by LLMs these days, how do specific syntactic constructs like a Symbol, which essentially carries the context and can be manipulated with Python operators, help compared to normal Python code generated by an LLM with all the checks and balances specified by a human? For example, I can write in this syntax to convert all fruits to vegetables, or I can simply prompt an LLM to construct a program that takes a list of fruits and calls an LLM in the background to return the vegetable equivalents. I am trying to understand the difference.
replies(1): >>44402096 #
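To make the contrast concrete, here is a minimal, hypothetical sketch of the Symbol idea: a thin wrapper whose overloaded Python operators all funnel into one LLM call site. The `Symbol` class, the `/` operator, and `fake_llm` are illustrative assumptions for this sketch, not the actual library's API; `fake_llm` is a deterministic stub standing in for a real model.

```python
from typing import Callable

def fake_llm(prompt: str) -> str:
    # Deterministic stub standing in for a real model call.
    # Expects prompts of the form "map <word> <instruction>".
    lookup = {"apple": "carrot", "banana": "potato"}
    word = prompt.split()[1]
    return lookup.get(word, "unknown")

class Symbol:
    """Hypothetical Symbol: carries a value plus the backend used to transform it."""

    def __init__(self, value: str, backend: Callable[[str], str] = fake_llm):
        self.value = value
        self.backend = backend

    def __truediv__(self, instruction: str) -> "Symbol":
        # Overloaded operator: `sym / "instruction"` routes through the backend,
        # so every LLM interaction goes through one auditable code path.
        prompt = f"map {self.value} {instruction}"
        return Symbol(self.backend(prompt), self.backend)

    def __repr__(self) -> str:
        return f"Symbol({self.value!r})"

fruits = [Symbol("apple"), Symbol("banana")]
vegetables = [f / "from fruit to vegetable" for f in fruits]
print([v.value for v in vegetables])  # -> ['carrot', 'potato']
```

The point of the contrast: in the plain-prompting approach the LLM writes the whole loop and the call logic each time, while here the operator semantics are fixed in ordinary Python and only the mapping is delegated.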
1. ijustlovemath ◴[] No.44402096[source]
Hallucination obstruction, I'd imagine. When you have an LLM produce output within a formal system, it can be verified far more easily than general-purpose code can.
replies(1): >>44402144 #
2. thallukrish ◴[] No.44402144[source]
Yes, that seems to be the case. It may not save any time compared to generating general Python code versus specific symbolic code, but the real value could be that the library has an engine to enforce a contract on LLM responses, and can even centralize the LLM calls in a common piece of code, making the interactions less error-prone and more consistent.
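The "engine enforcing the contract" idea above can be sketched as a small wrapper that validates every model response before it reaches application code, retrying on failure. All names here (`call_with_contract`, `flaky_llm`, the JSON schema) are illustrative assumptions, not the library's actual interface.

```python
import json
from typing import Callable

def flaky_llm(prompt: str) -> str:
    # Stub standing in for a real model; always returns well-formed JSON here.
    return '{"vegetable": "carrot"}'

def call_with_contract(prompt: str,
                       llm: Callable[[str], str],
                       validate: Callable[[dict], None],
                       retries: int = 2) -> dict:
    """Call the model and enforce a response contract, retrying on violations."""
    last_err = None
    for _ in range(retries + 1):
        raw = llm(prompt)
        try:
            parsed = json.loads(raw)   # contract part 1: must be valid JSON
            validate(parsed)           # contract part 2: must satisfy the schema
            return parsed
        except (json.JSONDecodeError, AssertionError) as err:
            last_err = err
    raise ValueError(f"LLM response failed contract: {last_err}")

def must_have_vegetable(obj: dict) -> None:
    assert isinstance(obj, dict) and "vegetable" in obj

result = call_with_contract("apple -> vegetable, as JSON",
                            flaky_llm, must_have_vegetable)
print(result["vegetable"])  # -> carrot
```

Because every call funnels through one checkpoint, malformed or hallucinated responses are caught at a single, testable boundary instead of being scattered across ad hoc generated code.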