Some thoughts on LLMs and software development (martinfowler.com)
416 points | floverfelt | 1 comment | 28 Aug 25 18:52 UTC
sebnukem2 | 28 Aug 25 19:28 UTC | No. 45056066 | >>45055641 (OP)

> hallucinations aren’t a bug of LLMs, they are a feature. Indeed they are the feature. All an LLM does is produce hallucinations, it’s just that we find some of them useful.

Nice.

replies(7): >>45056284 >>45056352 >>45057115 >>45057234 >>45057503 >>45057942 >>45061686
tptacek | 28 Aug 25 19:48 UTC | No. 45056284 | >>45056066

In that framing, you can look at an agent as simply a filter on those hallucinations.

replies(4): >>45056346 >>45056552 >>45056728 >>45058056
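
Taken literally, the filter framing is easy to sketch: sample several candidate outputs from the model and keep only the ones that survive some external check, such as a test. A minimal Python sketch, where generate, verify, and the slugify test are hypothetical stand-ins rather than any particular agent framework's API:

    # "Agent as a filter on hallucinations": sample candidates from a
    # model, keep only those that pass verification. `generate` is a
    # stand-in for whatever LLM call is actually in use; the slugify
    # test below is a made-up example of an external check.
    from typing import Callable, Optional

    def filter_hallucinations(
        generate: Callable[[str], str],   # prompt -> candidate source code
        verify: Callable[[str], bool],    # candidate -> passes the check?
        prompt: str,
        n_samples: int = 5,
    ) -> Optional[str]:
        """Return the first sampled candidate that survives verification."""
        for _ in range(n_samples):
            candidate = generate(prompt)   # unfiltered "hallucination"
            if verify(candidate):          # the agent's filter
                return candidate
        return None                        # nothing useful survived

    def verify_slugify(src: str) -> bool:
        """Hypothetical check: exec the candidate and run one tiny test."""
        namespace: dict = {}
        try:
            exec(src, namespace)           # run the generated code
            return namespace["slugify"]("Hello, World!") == "hello-world"
        except Exception:
            return False

The filter is only as good as the check: anything the verifier doesn't cover passes straight through, which is what the next reply is pointing at.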
th0ma5 | 28 Aug 25 19:55 UTC | No. 45056346 | >>45056284

Yes yes, with yet to be discovered holes.