
765 points MindBreaker2605 | 1 comment
1. joegibbs No.45898374
Right choice IMO. LLMs aren’t going to reach AGI by themselves, because language is a system unto itself: very good at encoding concepts into compact representations, but not necessarily tied to reality. A human being gets years of binocular vision of real things, sound, and all sorts of other sensations, and still far less data than we’re training these models with. And we think of language in terms of sounds and pictures, not as abstract symbols.
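
For a sense of scale on that data gap, here's a rough back-of-envelope sketch in Python. Every figure is a ballpark assumption I'm supplying for illustration (words heard per day, tokenizer ratio, corpus size), not a number from the comment or a measured fact:

    # Back-of-envelope: a human's lifetime language exposure vs. an
    # LLM's training corpus. All constants are rough assumed ballparks.

    WORDS_PER_DAY = 15_000    # assumed words heard/read by a person daily
    YEARS = 20                # assumed years of exposure before adulthood
    TOKENS_PER_WORD = 1.3     # typical English tokenizer ratio (assumed)

    human_tokens = WORDS_PER_DAY * 365 * YEARS * TOKENS_PER_WORD
    llm_tokens = 15e12        # assumed order of magnitude for a frontier
                              # model's training corpus (~15T tokens)

    print(f"human exposure: ~{human_tokens:.1e} tokens")
    print(f"LLM training:   ~{llm_tokens:.0e} tokens")
    print(f"ratio:          ~{llm_tokens / human_tokens:,.0f}x more for the LLM")

Under these assumptions the model sees on the order of 100,000x more language than a person ever does, which is the asymmetry the comment is pointing at: humans close the gap with grounded multimodal input rather than more text.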