aithrowawaycomm
I think there is a slight disconnect here between making AI systems which are smart and AI systems which are useful. It’s a very old fallacy in AI: assuming that tools which assist human intelligence by solving human problems must themselves be intelligent.

The utility of big datasets was indeed surprising, but that skepticism came from recognizing that the scaling paradigm must be a dead end: vertebrates across the board require less data to learn new things, by several orders of magnitude. Methods to give ANNs “common sense” are essentially identical to the old Lisp expert systems: hard-wiring the answers to specific common-sense questions into either code or training data, even though fish and lizards can rapidly make common-sense deductions about man-made objects they couldn’t possibly have seen in their evolutionary histories. Even spiders have generalization abilities seemingly absent in transformers: they spin webs inside human homes, with unnatural geometry.
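To make the parallel concrete, here is a toy sketch (illustrative only; no real system is written this way) of the two places the answer can be baked in, code or data:

    # Toy illustration: "common sense" hard-wired either as explicit
    # rules (expert-system style) or as memorized examples
    # (training-data style). Both bake the answer in rather than
    # deriving it.
    from typing import Optional

    # Expert-system style: the answer lives in code.
    RULES = {
        ("water", "wet"): True,
        ("fire", "cold"): False,
    }

    def rule_lookup(subject: str, attribute: str) -> Optional[bool]:
        """Return a hand-written judgment, or None if nobody wrote one."""
        return RULES.get((subject, attribute))

    # Big-data style: the same answers live in the training set instead.
    TRAINING_PAIRS = [
        ("Is water wet?", "yes"),
        ("Is fire cold?", "no"),
    ]

    print(rule_lookup("water", "wet"))       # True
    print(rule_lookup("lava", "dangerous"))  # None: never hard-wired

Either way, a question outside the enumerated cases comes back empty, which is the failure mode fish and lizards don’t seem to have.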

Again, it is surprising that the ImageNet stuff worked as well as it did. Deep learning is undoubtedly a useful way to build applications, just as Lisp was. But I think we are about as close to AGI as we were in the 80s, since we have made zero progress on common sense: in the 80s we knew big data could emulate common sense only poorly, and that’s where we are today.

j_bum
> vertebrates across the board require less data to learn new things, by several orders of magnitude.

Sometimes I wonder if it’s fair to say this.

Organisms have had billions of years of training. We might come online and succeed in our environments with very little data, but we can’t ignore the information that’s been trained into our DNA, so to speak.

What’s billions of years of sensory information that drove behavior and selection, if not training data?
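One way to make the framing concrete is a two-level optimization toy: an outer loop of selection over genomes wrapped around a cheap inner loop of lifetime learning. Everything below is invented for illustration, not a model of biology:

    import random

    def lifetime_learning(genome: float, experiences: list) -> float:
        # Inner loop: one individual learns from a handful of samples,
        # starting from the prior encoded in its "genome".
        skill = genome
        for x in experiences:
            skill += 0.1 * (x - skill)
        return skill

    def evolve(generations: int = 200, population: int = 50) -> float:
        # Outer loop: selection is itself an optimizer, and it consumes
        # vastly more "data" than any single lifetime does.
        target = 5.0  # stand-in for environmental pressure
        genomes = [random.gauss(0.0, 1.0) for _ in range(population)]
        for _ in range(generations):
            genomes.sort(
                key=lambda g: abs(lifetime_learning(g, [target] * 3) - target))
            survivors = genomes[: population // 2]
            genomes = [g + random.gauss(0.0, 0.1)
                       for g in survivors for _ in range(2)]
        return genomes[0]

    # An individual from the evolved lineage needs only three samples;
    # the real sample cost was paid in the outer loop.
    print(lifetime_learning(evolve(), [5.0] * 3))

On this view, the “orders of magnitude” comparison is charging the animal only for the inner loop.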

loa_in_
I also think this is a lazy claim. We have so many internal sources of information: the feeling of temperature, or the vestibular system reacting to everything from a change in inclination to the heart’s effective power output, in real time, every second of the day.
j_bum
That’s a fair point. But to push back, how many sources of sensory information are needed for cognition to arise in humans?

I would be willing to bet that hearing or vision alone would be sufficient to develop cognition. Many of these extra senses are beneficial for survival, but not required for cognition. E.g., we don’t need smell/touch/taste/pain to think.

Thoughts?

krschacht
I think we need the other senses for cognition. The other senses are part of the reward function that the cognitive learning algorithms optimize. Pleasure and pain, joy and suffering, guide the cognitive development process.
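A minimal sketch of what I mean, with invented names and weights; the point is that the affective senses appear as the objective itself, not as extra input channels:

    from dataclasses import dataclass

    @dataclass
    class SenseReadings:
        pain: float                    # 0..1 noxious-stimulus intensity
        pleasure: float                # 0..1
        temperature_discomfort: float  # 0..1 deviation from set point

    def reward(s: SenseReadings) -> float:
        # Remove these senses and a learner keeps its inputs but loses
        # its objective; that is the crux of the disagreement.
        return s.pleasure - 2.0 * s.pain - 0.5 * s.temperature_discomfort

    print(reward(SenseReadings(pain=0.1, pleasure=0.8,
                               temperature_discomfort=0.2)))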
j_bum
I think you’re starting to conflate emotion with senses.

Yes, pain is a form of sensory experience, but it also has affective/emotional components that can be experienced even without the presence of noxious stimuli.

However, there are people who don’t experience pain (congenital insensitivity to pain), caused by mutations in the NaV1.7 sodium channel, or in one or more of the thermo-, chemo-, or mechanotransducers that encode noxious stimuli into neural activity.

And obviously, these people, who don’t experience the sensory-discriminative components of pain, are still capable of cognition.

To steelman your argument, I do agree that lacking all but one of what I would call the sufficient senses for cognition would dramatically slow the rate of cognitive development. But I don’t think it would prevent cognition outright.