Animats:
This is an important result.

The actual paper [1] says that functional MRI (which measures which parts of the brain are active by sensing blood flow) indicates that different brain hardware is used for non-language and language functions. This has been suspected for years, but now there's an experimental result.

What this tells us for AI is that we need something else besides LLMs. It's not clear what that something else is. But, as the paper mentions, small mammals and corvids lack language yet have substantial problem-solving capability. That's seen down at squirrel and crow size, where the brains are tiny. So if someone figures out how to do this, it will probably take less hardware than an LLM.

This is the next big piece we need for AI. No idea how to do this, but it's the right question to work on.

[1] https://www.nature.com/articles/s41586-024-07522-w.epdf?shar...

necovek:
You seem to be conflating "different hardware" with proof that the "language hardware" runs "software" equivalent to an LLM.

LLMs basically became practical once compute was scaled up; maybe both regions are "general compute", and language just ends up on the "GPU" out of pure necessity.
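
For the "scale compute up" point, here's a minimal sketch of the Chinchilla scaling law (Hoffmann et al., 2022), under which predicted loss falls smoothly as parameters N and training tokens D grow. The constants are that paper's published fit; the model sizes are illustrative, not ChatGPT's:

    # Chinchilla fit: L(N, D) = E + A/N^alpha + B/D^beta
    def chinchilla_loss(n_params: float, n_tokens: float) -> float:
        E, A, B, alpha, beta = 1.69, 406.4, 410.7, 0.34, 0.28
        return E + A / n_params**alpha + B / n_tokens**beta

    # Illustrative (N, D) pairs, roughly compute-optimal at ~20 tokens/param
    for n, d in [(1e9, 20e9), (10e9, 200e9), (100e9, 2e12)]:
        print(f"N={n:.0e}, D={d:.0e} -> predicted loss ~ {chinchilla_loss(n, d):.2f}")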

So to me, these are entirely distinct questions: is the language region able to do general cognitive operations? What happens when you need to spell out "ubiquitous", or decline a foreign word in a language with declension (for which you have no memorized patterns)?
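
On the spelling question, a minimal sketch (assuming the tiktoken package; cl100k_base is the tokenizer behind recent OpenAI models) shows why character-level tasks are awkward for LLMs: the model is fed subword tokens, not letters, so spelling out "ubiquitous" means reconstructing characters it never directly sees:

    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode("ubiquitous")
    print(tokens)                             # token ids the model actually sees
    print([enc.decode([t]) for t in tokens])  # the subword pieces they stand for

Whether that comes out as one token or several, none of them is a letter.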

I agree it seems obvious that, for better efficiency (size of training data, parameter count, compute), human brains use a different approach than today's LLMs (in a sibling comment, I give the example of my kids at 2yo having a better grasp of language rules than ChatGPT trained on 100x more data).

But let's dive deeper into understanding what each of these regions can do before we compare them to, or apply ideas from, AI/CS.