
423 points sohkamyung | 1 comment
1. Pocomon No.45674613
Large Language Models (LLMs) have no true comprehension of the underlying concepts: they split text into units called tokens, map each token to a numerical vector (an embedding), and then, conditioned on the user's input, repeatedly predict the most likely next token in the sequence. As such, it's all hallucinations.
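
For a concrete picture of that next-token step, here's a minimal sketch in Python, assuming the Hugging Face transformers library and the public gpt2 checkpoint (the prompt and variable names are just for illustration):

    # Text -> token ids -> predicted next token, greedily.
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Tokenize the prompt into integer token ids.
    input_ids = tokenizer.encode("The capital of France is", return_tensors="pt")

    with torch.no_grad():
        logits = model(input_ids).logits  # scores over the whole vocabulary

    # Pick the single most probable next token (greedy decoding).
    next_id = int(torch.argmax(logits[0, -1]))
    print(tokenizer.decode([next_id]))  # likely " Paris"

Generation just repeats this step, appending each predicted token to the input; the model never verifies a claim, it only scores continuations.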