3337 points keepamovin | 2 comments
jll29 ◴[] No.46216933[source]
AI professor here. I know this page is a joke, but in the interest of accuracy, a terminological comment: we don't call it a "hallucination" when a model complies with what a prompt asked for and produces a prediction exactly as requested.

Rater, "hallucinations" are spurious replacements of factual knowledge with fictional material caused by the use of statistical process (the pseudo random number generator used with the "temperature" parameter of neural transformers): token prediction without meaning representation.

[typo fixed]
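
To make the mechanism concrete, here is a minimal sketch of temperature sampling, assuming a made-up three-token logit vector and numpy; real decoders layer top-k/top-p filtering on top of this, but the PRNG-driven draw is the same idea:

    import numpy as np

    def sample_with_temperature(logits, temperature=1.0, rng=None):
        # Scale logits by 1/temperature, softmax, then draw one token index.
        if rng is None:
            rng = np.random.default_rng()  # the pseudo-random generator mentioned above
        scaled = np.asarray(logits, dtype=np.float64) / temperature
        probs = np.exp(scaled - scaled.max())  # numerically stable softmax
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs)

    # Toy 3-token vocabulary (values invented for illustration).
    # Higher temperature flattens the distribution, so low-probability
    # ("spurious") tokens get sampled more often.
    logits = [3.0, 1.0, 0.2]
    for t in (0.2, 1.0, 2.0):
        draws = [sample_with_temperature(logits, t) for _ in range(1000)]
        print(t, np.bincount(draws, minlength=3) / 1000)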

replies(12): >>46217033 #>>46217061 #>>46217166 #>>46217410 #>>46217456 #>>46217758 #>>46218070 #>>46218282 #>>46218393 #>>46218588 #>>46219018 #>>46219935 #
1. jotaen ◴[] No.46218393[source]
To me, “imagine” would have been a more fitting term here.

(“Generate”, while correct, sounds too technical, and “confabulate” is a bit obscure.)

replies(1): >>46218431 #
2. tangwwwei ◴[] No.46218431[source]
"imagine" gives too much credence of humanity to this action which will continue the cognitive mistake we make of anthropomorphzing llms