
549 points orcul | 1 comment | HN request time: 0.202s | source
Animats ◴[] No.41890003[source]
This is an important result.

The actual paper [1] says that functional MRI (which is measuring which parts of the brain are active by sensing blood flow) indicates that different brain hardware is used for non-language and language functions. This has been suspected for years, but now there's an experimental result.

What this tells us for AI is that we need something else besides LLMs. It's not clear what that something else is. But, as the paper mentions, the low-end mammals and the corvids lack language yet have substantial problem-solving capability. That's seen down at squirrel and crow size, where the brains are tiny. So if someone figures out how to do this, it will probably take less hardware than an LLM.

This is the next big piece we need for AI. No idea how to do this, but it's the right question to work on.

[1] https://www.nature.com/articles/s41586-024-07522-w.epdf?shar...

danielmarkbruce ◴[] No.41891063[source]
Is it important? To whom? Anyone with half a brain is aware that language isn't the only way to think. I can think my way through all kinds of things in 3-D space without a single word uttered in any internal monologue, and I'm not remotely unique: this kind of thing shows up in all kinds of math and IQ-style tests one takes as a child.
voxl ◴[] No.41891439[source]
Before you say things this patiently dumb you should probably wonder what question the researchers are actually interested in and why your average experience isn't sufficient proof.
danielmarkbruce ◴[] No.41892583[source]
It's "patently" and maybe understand the definition of "average" before using it.

Once you've figured out how to use language, explain why this is important and to whom. Then maybe what the upshot will be. The fact that someone has proven something to be true doesn't make it important.

The comment I replied to made it sound like it's important to the field of AI. It is not. Almost zero serious researchers think LLMs all by themselves are "enough". People are working on all manner of models and systems incorporating all kinds of things that are "not LLM". Practically no one who actually works in AI will read this paper and change anything, because it only proves something they already believed to be true and were acting on accordingly.