Hallucinations in code are the least dangerous form of LLM mistakes (simonwillison.net)
371 points | ulrischa | 1 comment | 02 Mar 25 19:15 UTC
1. mediumsmart | 03 Mar 25 06:31 UTC | No. 43238916 | >>43233903 (OP)
As a non-programmer, I only get little programs or scripts that do something from the LLM. If they do the thing, that means the code is tested, flawless, and done. I would never let them have to deal with other humans' input, of course.