3338 points keepamovin | 1 comment
jll29 | No.46216933
AI professor here. I know this page is a joke, but in the interest of accuracy, a terminological comment: we don't call it a "hallucination" when a model complies with what a prompt asked for and produces a prediction exactly as requested.

Rather, "hallucinations" are spurious replacements of factual knowledge with fictional material, caused by the use of a statistical process (the pseudo-random number generator used with the "temperature" parameter of neural transformers): token prediction without meaning representation.

[typo fixed]
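
To make the "temperature" mechanism concrete, here is a minimal sketch of temperature-based token sampling. The function name, the toy logits, and the PRNG usage are illustrative assumptions for this sketch, not details of any particular model:

    import math
    import random

    def sample_token(logits, temperature=1.0, rng=None):
        # Illustrative sketch: scale logits by 1/temperature, apply
        # softmax, then draw one token index with a pseudo-random
        # number generator. Higher temperature flattens the
        # distribution, so less likely tokens get picked more often.
        rng = rng or random.Random()
        scaled = [x / temperature for x in logits]
        m = max(scaled)  # subtract the max for numerical stability
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        r = rng.random() * total
        cum = 0.0
        for i, e in enumerate(exps):
            cum += e
            if r < cum:
                return i
        return len(exps) - 1

    # Toy scores for three candidate next tokens.
    logits = [2.0, 1.0, 0.1]
    print(sample_token(logits, temperature=0.2))  # near-greedy
    print(sample_token(logits, temperature=2.0))  # much more random

As temperature approaches zero this degenerates into argmax (always the top token); the randomness described above only enters once a temperature above zero lets the PRNG select lower-probability tokens.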

saberience | No.46217758
The OP clearly didn't mean "hallucination" as a bug or error in the AI, in the way you're suggesting. Words can have many different meanings!

You can easily say, "Johnny had some wild hallucinations about a future where Elon Musk ruled the world." It just means some wild speculative thinking. I read the title in this sense of the word.

Not everything has to be nit-picked or overanalysed. This is an amusing article with an amusing title.