
549 points | orcul | 4 comments
Animats No.41890003
This is an important result.

The actual paper [1] says that functional MRI (which is measuring which parts of the brain are active by sensing blood flow) indicates that different brain hardware is used for non-language and language functions. This has been suspected for years, but now there's an experimental result.

What this tells us for AI is that we need something else besides LLMs. It's not clear what that something else is. But, as the paper mentions, the low-end mammals and the corvids lack language but have substantial problem-solving capability. That's seen down at squirrel and crow size, where the brains are tiny. So if someone figures out how to do this, it will probably take less hardware than an LLM.

This is the next big piece we need for AI. No idea how to do this, but it's the right question to work on.

[1] https://www.nature.com/articles/s41586-024-07522-w.epdf?shar...

jebarker No.41891228
> What this tells us for AI is that we need something else besides LLMs

Not to over-hype LLMs, but I don't see why this result says that. AI doesn't need to do things the same way that evolved intelligence does.

weard_beard No.41891277
To a point. If you drill down this far into the fundamentals of cognition, you begin to define it. Otherwise you may as well call a cantaloupe sentient.
jebarker No.41891297
I don't think anyone defines AI as "doing the thing that biological brains do," though; we define it in terms of the capabilities of the system.
weard_beard No.41892416
I think if you gave it the same biological inputs as a biological brain, you would quickly see the lack of capabilities in any man-made system.
Dylan16807 No.41893199
Okay, but does that help us reach any meaningful conclusions? For example, say some AI system doesn't have the capabilities of an auditory cortex or somatosensory cortex. Is there a reason to think it needs them?
weard_beard No.41895128
Name a creature on earth without one.

Imagine trying to limit, control, or explain a being without familiar cognitive structures.

Is there a reason to care about such unfamiliar modalities of cognition?

Dylan16807 No.41898190
> Name a creature on earth without one.

Anything that doesn't have a spine, I'm pretty sure.

Also, if we look at just the auditory side, tons of creatures are deaf and don't need it.

> Imagine trying to limit, control, or explain a being without familiar cognitive structures.

I don't see why any of that affects whether it's intelligent.

weard_beard No.41899935
Agreed: perhaps we ought to be studying the cognition of creatures without spines before we claim to replicate or understand the cognition of creatures with them.

Presumably they have some sort of biological input processing or sensory inputs. They don't eat data.