
371 points | ulrischa | 1 comment
1. noodletheworld | No.43238471
If you want to use LLMs for code, use them.

If you don't, don't.

However, this 'let's move past hallucinations' discourse is just disingenuous.

The OP is conflating hallucinations, which are a real, undisputed failure mode of LLMs that no one has a solution for...

...with people not spending enough time and effort learning to use the tools.

I don't like it. It feels bad. It feels like a rage-bait piece, cast out of frustration that the OP doesn't have an answer for hallucinations, because there isn't one.

> Hallucinated methods are such a tiny roadblock that when people complain about them I assume they’ve spent minimal time learning how to effectively use these systems—they dropped them at the first hurdle.

People aren't stupid.

If they use a tool and it sucks, they'll stop using it and say "this sucks".

If people are saying "this sucks" about AI, it's because the LLM tool they're using sucks, not because they're idiots, or there's a grand 'anti-AI' conspiracy.

People are lazy; if the tool is good (e.g. Cursor), people will use it.

If they use it, and the first thing it does is hallucinate some BS (e.g. IntelliJ full line completion), then you'll get people uninstalling it and leaving reviews like "blah blah hallucination blah blah. This sucks".
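
For concreteness, a minimal, hypothetical sketch of what that first impression looks like, in Python against the real requests library: the completion confidently invents a method that doesn't exist.

    import requests

    # Hallucinated completion: requests has no fetch_json() function,
    # so this line would raise AttributeError if uncommented.
    # data = requests.fetch_json("https://example.com/api")

    # The call that actually exists in the requests API:
    data = requests.get("https://example.com/api", timeout=10).json()
    print(data)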

Which is literally what is happening. Right. Now.

To be fair, 'blah blah hallucinations suck' is a common 'anti-AI' trope that gets rolled out.

...but that's because it is a real problem.

Pretending 'hallucinations are fine, people are the problem' is... it's just disingenuous and embarrassing from someone of this caliber.