Hallucinations in code are the least dangerous form of LLM mistakes (simonwillison.net)
371 points by ulrischa | 1 comment | 02 Mar 25 19:15 UTC
1. amelius | 03 Mar 25 12:44 UTC | No. 43241213
>>43233903 (OP)
I don't agree. What if the LLM takes a two-step approach, where it first determines a global architecture and then fills in the code, but hallucinates in the first step?
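A minimal sketch of the two-step pipeline the comment describes, in Python. call_llm is a hypothetical stand-in for any chat-completion API (it returns canned text here so the sketch runs without a model), and the prompts and component format are illustrative assumptions, not anything from the article or thread:

    def call_llm(prompt: str) -> str:
        # Hypothetical stand-in for a real chat-completion client;
        # returns canned text so the sketch runs end to end.
        if "architecture" in prompt:
            return "parser: read the input\ncore: transform the data"
        return "def main():\n    pass  # model-written code goes here"

    def generate_program(task: str) -> dict[str, str]:
        # Step 1: ask for a global architecture. A hallucination here
        # (a component that cannot work, a wrong data flow) poisons
        # every later step, which is the commenter's worry.
        plan = call_llm(
            f"Design a module-level architecture for: {task}\n"
            "List one component per line as 'name: responsibility'."
        )
        components = [l.split(":", 1) for l in plan.splitlines() if ":" in l]

        # Step 2: fill in code for each planned component. Mistakes here
        # are local and surface as soon as the code is run or tested.
        return {
            name.strip(): call_llm(
                f"Write a Python module '{name.strip()}' that "
                f"{responsibility.strip()}, fitting this plan:\n{plan}"
            )
            for name, responsibility in components
        }

    if __name__ == "__main__":
        for name, src in generate_program("a CSV de-duplicator").items():
            print(f"--- {name} ---\n{src}")

The split marks the crux of the disagreement: step-2 code can be checked by running it, while a step-1 hallucination produces no error message at all.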