slacker news
Hallucinations in code are the least dangerous form of LLM mistakes (simonwillison.net)
371 points by ulrischa | 02 Mar 25 19:15 UTC | 2 comments
1. cryptoegorophy | 03 Mar 25 00:51 UTC | No. 43237104
>>43233903 (OP)
Just ask another LLM to proofread?
replies(1): >>43239693
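For what it's worth, the second-LLM-as-proofreader idea is straightforward to wire up. A minimal sketch, assuming the OpenAI Python SDK; the model names are placeholders, and any two models (or the same model with a review prompt) would do:

    # Minimal sketch of the "ask another LLM to proofread" idea.
    # Assumes the OpenAI Python SDK; model names below are placeholders.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def generate_code(task: str) -> str:
        # First model drafts the code.
        resp = client.chat.completions.create(
            model="gpt-4o",  # placeholder generator model
            messages=[{"role": "user",
                       "content": f"Write Python code to {task}. Return only code."}],
        )
        return resp.choices[0].message.content

    def review_code(code: str) -> str:
        # Second pass: a different model (or the same one, reprompted) looks for
        # hallucinated APIs, wrong signatures, and logic errors.
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder reviewer model
            messages=[{"role": "user",
                       "content": "Review this code for hallucinated functions, "
                                  "wrong signatures, and logic errors. "
                                  "List concrete problems, or reply OK.\n\n" + code}],
        )
        return resp.choices[0].message.content

    if __name__ == "__main__":
        draft = generate_code("parse a CSV file and sum its 'amount' column")
        print(review_code(draft))

The obvious caveat, raised in the next reply, is that the reviewing model can be just as wrong as the generating one, so a pass like this supplements rather than replaces running and testing the output.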
2. namaria | 03 Mar 25 08:43 UTC | No. 43239693
>>43237104 (TP)
Do you realize that giving LLMs 'instructions' is merely trying to blindly twist knobs by random amounts?