
1246 points adrianh | 9 comments
1. JimDabell ◴[] No.44491678[source]
I wrote this the other day:

> Hallucinations can sometimes serve the same role as TDD. If an LLM hallucinates a method that doesn’t exist, sometimes that’s because it makes sense to have a method like that and you should implement it.

https://www.threads.com/@jimdabell/post/DLek0rbSmEM

I guess it’s true for product features as well.
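
As a minimal sketch of the idea (all names here are hypothetical, not from any commenter's actual codebase): suppose an LLM, asked to work with an `Article` class, keeps calling an `article.slug()` method that was never defined. Treating the hallucination like a failing test, you implement the method it expected:

```python
import re

class Article:
    def __init__(self, title: str):
        self.title = title

    # Hypothetical method: the LLM repeatedly called article.slug()
    # even though the class never defined it. The hallucination acts
    # like a failing TDD test -- the method is natural enough that the
    # model assumed it existed, so we make it exist.
    def slug(self) -> str:
        # Lowercase, collapse runs of non-alphanumerics into hyphens.
        return re.sub(r"[^a-z0-9]+", "-", self.title.lower()).strip("-")
```

The point isn't that every hallucinated call deserves implementing, just that a plausible-sounding hallucination can flag a gap in the API's surface.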

replies(2): >>44491913 #>>44496172 #
2. jjcm ◴[] No.44491913[source]
Seems like lots of us have stumbled on this. It’s not the worst way to dev!

> Maybe hallucinations of vibe coders are just a suggestion those API calls should have existed in the first place.

> Hallucination-driven-development is in.

https://x.com/pwnies/status/1922759748014772488?s=46&t=bwJTI...

replies(2): >>44494898 #>>44496186 #
3. NooneAtAll3 ◴[] No.44494898[source]
inb4 "AI thinks there should be a StartThermonuclearWar() function, I should make that"
replies(1): >>44495136 #
4. blharr ◴[] No.44495136{3}[source]
In a combat simulator, absolutely
replies(1): >>44495262 #
5. burnt-resistor ◴[] No.44495262{4}[source]
The only winning move is ...
6. AdieuToLogic ◴[] No.44496172[source]
> I wrote this the other day:

>> Hallucinations can sometimes serve the same role as TDD. If an LLM hallucinates a method that doesn’t exist, sometimes that’s because it makes sense to have a method like that and you should implement it.

A detailed counterargument to this position can be found here[0]. In short, what are colloquially described as "LLM hallucinations" serve no plausible role in software design other than giving software engineers an opportunity to stop and think about the problem being solved.

See also Clarke's third law[1].

0 - https://addxorrol.blogspot.com/2025/07/a-non-anthropomorphiz...

1 - https://en.wikipedia.org/wiki/Clarke%27s_three_laws

replies(1): >>44496650 #
7. TZubiri ◴[] No.44496186[source]
Beware: the feature in OP isn't something people would have found useful. It's not as if ChatGPT assigned OP's business a request from a user in some latent consumer-provider space, as if ChatGPT were a kind of market maker connecting consumers with products, like Google with organic content or ads, or LinkedIn, or Product Hunt.

No, what actually happened is that OP built a type of ChatGPT integration, and a shitty one at that. ChatGPT could have just directed the user to the site and told them to upload that image to OP's site. But it felt it needed to do something with the image, so it did.

There's no new value-add here, at least not yet. Maybe there would be if users started requesting changes to the sheet, but that's not what's going on.

replies(1): >>44496737 #
8. JimDabell ◴[] No.44496650[source]
Did you mean to post a different link? The article you linked isn’t a detailed counterargument to my position and your summary of it does not match its contents either.

I also don’t see the relevance of Clarke’s third law.

9. JimDabell ◴[] No.44496737{3}[source]
> the feature in OP isn't something that people would have found useful

This doesn’t seem likely. The utility is pretty obvious.

> chatgpt could have just directed the user to the site and told them to upload that image to OP's site.

What image? Did you think the first image shown is what is being entered into ChatGPT? It's not; that's what the site expects to be uploaded to it. There's no indication that the ChatGPT users are scanning tabs. ChatGPT is producing ASCII tabs, but we aren't shown what input it's responding to.