
3338 points by keepamovin | 1 comment
jll29 No.46216933
AI professor here. I know this page is a joke, but in the interest of accuracy, a terminological comment: we don't call it a "hallucination" if a model complies exactly with what a prompt asked for and produces a prediction, exactly as requested.

Rater, "hallucinations" are spurious replacements of factual knowledge with fictional material caused by the use of statistical process (the pseudo random number generator used with the "temperature" parameter of neural transformers): token prediction without meaning representation.

[typo fixed]
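The mechanism the comment points at — scaling the token distribution by a "temperature" before pseudo-random sampling — can be sketched roughly as below. This is a minimal illustration only, not any particular model's implementation, and the logits are made-up values:

```python
import math
import random

def sample_with_temperature(logits, temperature, rng=None):
    """Pick a token index from raw logits after temperature scaling.

    Higher temperature flattens the softmax distribution, so lower-probability
    (possibly non-factual) tokens are chosen more often; lower temperature
    concentrates mass on the most likely token.
    """
    rng = rng or random.Random(0)  # fixed seed for a reproducible sketch
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # inverse-CDF sampling with a pseudo-random draw
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# With a very low temperature the sampler almost always returns the argmax;
# raising the temperature makes other tokens increasingly likely.
print(sample_with_temperature([0.1, 5.0, 0.2], temperature=0.01))
```

At temperature near zero this behaves like greedy decoding; the stochasticity the comment attributes to hallucination only enters as the temperature rises.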

ayewo No.46217410
Terminology-wise, does this read like a better title instead?

Show HN: Gemini Pro 3 generates the HN front page 10 years from now

tim333 No.46218366
I'd vote for "imagines".