
56 points by trott | 1 comment | source
makapuf ◴[] No.40714795[source]
Funny that it doesn't take that much data to train your average 20th-century human genius. I'd say that if we're dreaming about the future of AI, learning and reasoning are the bigger issue, not data. That said, the article title is about LLMs, so I guess that's what will need changing.
replies(3): >>40715430 #>>40715643 #>>40716666 #
jstanley ◴[] No.40715430[source]
Humans aren't just text interfaces, though. Most of your input isn't textual; it's sights, sounds, feelings, etc., which LLMs don't (yet?) have access to.

Humans receive an enormous amount of training data in forms not currently available to LLMs.

If you locked baby Einstein in a room with the collected works of humanity and left him there for a lifetime, I doubt he'd have even learnt to read on his own.

replies(6): >>40715609 #>>40715647 #>>40715822 #>>40715950 #>>40716247 #>>40716485 #
trott ◴[] No.40715822[source]
The stream of data from vision does NOT explain why humans learn 1000x faster than LLMs: children who lost their sight early on can still grow up to be intelligent. They can learn English, for example, without needing to hear 200B words the way GPT-3 did.
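
A rough back-of-envelope sketch of that gap (the words-per-day figure is an assumption, not a measurement; the GPT-3 number is the one cited above):

    # Back-of-envelope sketch, not a measurement: words a person might hear
    # by adulthood vs. the ~200B words cited above for GPT-3.
    words_per_day = 15_000                  # assumed daily exposure for a child
    human_words = words_per_day * 365 * 18  # ~0.1B words by age 18
    gpt3_words = 200e9                      # figure from the comment above
    print(f"human: ~{human_words / 1e6:.0f}M words heard")
    print(f"GPT-3: ~{gpt3_words / 1e9:.0f}B words")
    print(f"ratio: ~{gpt3_words / human_words:.0f}x")  # roughly 2000x
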
replies(3): >>40716628 #>>40716999 #>>40720531 #
bhickey ◴[] No.40716628[source]
The human brain isn't randomly initialized. It's undergone 500m years of pretraining.
replies(2): >>40717032 #>>40719440 #
trott ◴[] No.40719440[source]
> The human brain isn't randomly initialized. It's undergone 500m years of pretraining.

All of the information accumulated by evolution gets passed on through DNA. For humans, that's well under 1GB, and probably only a tiny fraction of it determines how the brain works at the algorithmic level. You should think of this information as the brain's "software", not as pretrained LLM weights (350GB for GPT-3).
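
For scale, a sketch of the two sizes under assumed encodings (2 bits per DNA base, fp16 for GPT-3's 175B parameters):

    # Sketch of the size comparison under assumed encodings.
    genome_bases = 3.1e9                     # ~3.1B base pairs in the human genome
    genome_gb = genome_bases * 2 / 8 / 1e9   # 2 bits per base -> ~0.78 GB
    gpt3_gb = 175e9 * 2 / 1e9                # 175B params at 2 bytes (fp16) -> 350 GB
    print(f"human genome: ~{genome_gb:.2f} GB uncompressed")
    print(f"GPT-3 weights: ~{gpt3_gb:.0f} GB")
    print(f"ratio: ~{gpt3_gb / genome_gb:.0f}x")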