
Animats No.41890003
This is an important result.

The actual paper [1] says that functional MRI (which measures which parts of the brain are active by sensing blood flow) indicates that different brain hardware is used for language and non-language functions. This has been suspected for years, but now there's an experimental result.

What this tells us for AI is that we need something else besides LLMs. It's not clear what that something else is. But, as the paper mentions, low-end mammals and the corvids lack language yet have substantial problem-solving capability. That's seen down at squirrel and crow size, where the brains are tiny. So if someone figures out how to do this, it will probably take less hardware than an LLM.

This is the next big piece we need for AI. No idea how to do this, but it's the right question to work on.

[1] https://www.nature.com/articles/s41586-024-07522-w.epdf?shar...

1. ddingus No.41893732
We should look to the animals.

Higher order faculties aside, animals seem like us, just simpler.

The higher-functioning ones appear to have this missing thing too; we can see it in action. Perhaps all of them do, and it is just harder for us to recognize when an animal thinks very differently, or perhaps does not think as much and feels more instead.

----

Now, about that thing... and the controversy:

If an organism, or for this discussion a machine, is of sufficiently robust design and complexity that it can precisely differentiate itself from everything else, it is a being.

This thing we are missing is an emergent property, an artifact that can, or maybe always does, present itself when a state of being is also present.

We have not created a machine of this degree yet.

Mother nature has.

The reason for the emergence is that a being can differentiate sensory input as coming from within, such as pain or touch, and from without, such as light or motion.

Another way to express this is closed loop vs open loop.

A being is a closed loop system. It can experience cause and effect. It can be the cause. It can be the effect.

A lot comes from this closed loop.

There can be a concept of the self, and it has real meaning because the being knows what is of itself and what is everything else.

This may be what forms consciousness. Consciousness may require a closed loop and an organism of sufficient complexity to be able to perceive itself.

That is the gist of it.

These systems we make are fantastic pieces. They can pattern match and identify relationships between the data given in amazing ways.

But they are open loop. They are not beings. They cannot determine what is part of them, what they even are, or anything, really.
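
To make the open loop vs. closed loop distinction concrete, here is a minimal toy sketch (my own illustration in Python; the names open_loop, ToyWorld, and ClosedLoopAgent are made up for this example, not any real library). The open-loop function maps input to output with nothing feeding back; the closed-loop agent acts, sees the consequences fold back into its observations, and can split each observation into the part it caused and the part the world caused:

    import random

    def open_loop(prompt: str) -> str:
        # Open loop: input goes in, output comes out, nothing ever feeds back,
        # so there is no basis for a self/other distinction.
        return f"response to: {prompt}"

    class ToyWorld:
        # A trivial environment: its state reacts to actions and also drifts on its own.
        def __init__(self) -> None:
            self.state = 0.0

        def step(self, action: float) -> float:
            self.state += action + random.uniform(-0.1, 0.1)  # action effect + outside noise
            return self.state

    class ClosedLoopAgent:
        # Closed loop: act, observe the consequences, and compare them with what
        # the agent predicted its own action would cause. The predicted part is
        # "from within"; the residual is "from without".
        def __init__(self) -> None:
            self.expected = 0.0

        def act(self) -> float:
            action = 1.0
            self.expected += action  # prediction of its own effect on the world
            return action

        def observe(self, observation: float) -> tuple[float, float]:
            self_part = self.expected
            world_part = observation - self.expected
            return self_part, world_part

    if __name__ == "__main__":
        print(open_loop("hello"))  # no loop to close

        world, agent = ToyWorld(), ClosedLoopAgent()
        for _ in range(3):
            obs = world.step(agent.act())
            mine, not_mine = agent.observe(obs)
            print(f"caused by me: {mine:+.2f}, from outside: {not_mine:+.2f}")

Obviously this toy does not perceive anything; the point is only the structural difference: the second system has a channel through which it can attribute parts of its input to itself.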

I am both consistently amazed and dismayed at what we can get LLM systems to do.

They are tantalizingly close!

We found a piece of how all this works and we are exploiting the crap out of it. OK, fine. Humans are really good at that.

But it will all taper off. There are real limits, because the end goal of that approach amounts to mapping out the whole problem space.

Who has tried computing that? It is basically all possible human thought. Not going to happen.

More is needed.

And that "more" can arrive at thoughts without having first seen a few bazillion to choose from.