
371 points | ulrischa | 1 comment | source
al2o3cr No.43234929

    My cynical side suspects they may have been looking for
    a reason to dismiss the technology and jumped at the first
    one they found.
My cynical side suggests the author is an LLM fanboi who prefers not to think that hallucinating easy stuff strongly implies hallucinating harder stuff, and therefore jumps at the first reason to dismiss the criticism.
replies(2): >>43235138 >>43237917
1. simonw No.43237917
I find it a bit surprising that I'm being called an "LLM fanboy" for writing an article titled "Hallucinations in code are the least dangerous form of LLM mistakes," where the bulk of the article is about how you can't trust LLMs not to make far more serious and hard-to-spot logic errors.