
63 points | tejonutella | 1 comment
1. ninetyninenine No.43304560
>Humans don't need to learn from 1 trillion words to reach human intelligence. What are LLMs missing?

A Yann LeCun quote from the page.

LLMs are blank slates. Humans have millions of years of pretraining recorded not as weights but as the structure of the neural network itself. Our physical bodies are biased toward living in an environment with oxygen, light, and ground, and our brains are biased in that same direction.

If you put humans in a completely different context, an utterly brand-new one, say a 50-dimensional space where all our senses become useless and logic and common sense are overridden by new rules...

the human will perform WORSE than the LLM when trained on the same text.