
251 points slyall | 6 comments
aithrowawaycomm ◴[] No.42060762[source]
I think there is a slight disconnect here between making AI systems which are smart and AI systems which are useful. It’s a very old fallacy in AI: pretending that tools which assist human intelligence by solving human problems must themselves be intelligent.

The utility of big datasets was indeed surprising, but the skepticism came from recognizing that the scaling paradigm must be a dead end: vertebrates across the board require less data to learn new things, by several orders of magnitude. Methods for giving ANNs “common sense” are essentially identical to the old Lisp expert systems: hard-wiring the answers to specific common-sense questions into either code or training data, even though fish and lizards can rapidly make common-sense deductions about manmade objects they couldn’t possibly have seen in their evolutionary histories. Even spiders have generalization abilities seemingly absent in transformers: they spin webs inside human homes, despite their unnatural geometry.

Again, it is surprising that the ImageNet stuff worked as well as it did. Deep learning is undoubtedly a useful way to build applications, just like Lisp was. But I think we are about as close to AGI as we were in the 80s, since we have made zero progress on common sense: in the 80s we knew Big Data could only poorly emulate common sense, and that’s where we are today.

replies(5): >>42061007 #>>42061232 #>>42068100 #>>42068802 #>>42070712 #
j_bum ◴[] No.42061007[source]
> vertebrates across the board require less data to learn new things, by several orders of magnitude.

Sometimes I wonder if it’s fair to say this.

Organisms have had billions of years of training. We might come online and succeed in our environments with very little data, but we can’t ignore the information that’s been trained into our DNA, so to speak.

What’s billions of years of sensory information that drove behavior and selection, if not training data?

replies(7): >>42062463 #>>42064030 #>>42064183 #>>42064895 #>>42068159 #>>42070063 #>>42071450 #
1. marcosdumay ◴[] No.42070063[source]
> but we can’t ignore the information that’s been trained into our DNA

There's around 600MB of information in our DNA. Subtract that from the size of any LLM out there and see how much you get.
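
For scale, a rough back-of-envelope in Python, assuming ~3 billion base pairs at 2 bits each (which lands near that 600MB figure) and a 7B-parameter model at 2 bytes per weight; both numbers are my assumptions, not established in this thread:

    # Rough information content of the genome vs. LLM weights.
    # Assumptions: ~3e9 base pairs at 2 bits each; a 7B-parameter
    # model stored in fp16/bf16 (2 bytes per weight).
    base_pairs = 3e9
    genome_bytes = base_pairs * 2 / 8   # 2 bits per base pair
    llm_bytes = 7e9 * 2                 # 7B params * 2 bytes each

    print(f"genome  ~ {genome_bytes / 1e6:.0f} MB")      # ~750 MB
    print(f"weights ~ {llm_bytes / 1e9:.1f} GB")         # ~14.0 GB
    print(f"ratio   ~ {llm_bytes / genome_bytes:.0f}x")  # ~19x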

replies(1): >>42072096 #
2. myownpetard ◴[] No.42072096[source]
A fairer comparison would be to subtract it from the size of the source code required to represent the LLM.
replies(2): >>42072476 #>>42072730 #
3. nick3443 ◴[] No.42072476[source]
More like the source code AND the complete design for a 200+ degree-of-freedom robot, with batteries, etc. Pretty amazing.

It's like a 600MB demoscene demo for Conway's Game of Life!

replies(1): >>42073032 #
4. marcosdumay ◴[] No.42072730[source]
The source code is the weights. That's what they learn.
replies(1): >>42073049 #
5. Terr_ ◴[] No.42073032{3}[source]
That's underselling the product: a swarm of nanobots that is (literally, at present) beyond human understanding, and that is also the only way to construct certain materials and systems.

An inheritor of the Gray Goo apocalypse that covered the planet, this kind constructs an enormous mobile mega-fortress with a literal hive-mind, scouring the environment for raw materials and fending off hacking attempts by other nanobots. It even simulates other hive-minds to gain an advantage.

6. myownpetard ◴[] No.42073049{3}[source]
I disagree. A neural network is not learning its source code. The source code specifies the model structure and hyperparameters. Then it is compiled and instantiated into some physical medium, usually a bunch of GPUs, and the weights are learned.

Our DNA specifies the model structure and hyperparameters for our brains. Then it is compiled and instantiated into a physical medium, our bodies, and our connectome is trained.

If you want to make a comparison about the quantity of information contained in different components of an artificial and a biological system, then it only makes sense if you compare apples to apples. DNA:Code :: Connectome:Weights
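
To make the analogy concrete: the code that specifies a network's structure and hyperparameters is tiny compared to the learned state it produces. A minimal sketch, assuming PyTorch and arbitrary illustrative layer sizes:

    # "DNA": a few lines of code specifying structure + hyperparameters.
    import torch.nn as nn

    model = nn.Sequential(
        nn.Linear(1024, 4096),
        nn.ReLU(),
        nn.Linear(4096, 1024),
    )

    # "Connectome": the learned state, orders of magnitude larger.
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{n_params:,} params ~ {n_params * 2 / 1e6:.0f} MB in fp16")
    # ~8.4M params ~ 17 MB of weights, from ~10 lines of specification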