Animats No.41890003
This is an important result.

The actual paper [1] says that functional MRI (which measures which parts of the brain are active by sensing blood flow) indicates that different brain hardware is used for non-language and language functions. This has been suspected for years, but now there's an experimental result.

What this tells us for AI is that we need something else besides LLMs. It's not clear what that something else is. But, as the paper mentions, low-end mammals and corvids lack language yet have substantial problem-solving capability. That's seen down at squirrel and crow size, where the brains are tiny. So if someone figures out how to do this, it will probably take less hardware than an LLM.

This is the next big piece we need for AI. No idea how to do this, but it's the right question to work on.

[1] https://www.nature.com/articles/s41586-024-07522-w.epdf?shar...

KoolKat23 No.41890470
> What this tells us for AI is that we need something else besides LLMs.

Basically we need multimodal LLMs (terrible naming, as it's not really an LLM at that point, but still).

Animats No.41890645
I don't know what we need. Nor does anybody else, yet. But we know what it has to do. Basically what a small mammal or a corvid does.

There's been progress. Look at this 2020 work on neural-net-controlled drone acrobatics [1]. That's going in the right direction.

[1] https://rpg.ifi.uzh.ch/docs/RSS20_Kaufmann.pdf

fuzzfactor No.41890769
You could say language is just the "communication module," but there has to be a whole underlying interface where non-verbal thoughts are modulated/demodulated to conform to whatever language is expected, whether or not communication is even on the agenda.
bbor No.41891260
Well said! This is a great restatement of the core setup of the Chomskyan "Generative Grammar" school, and I think it's an undeniably productive one. I haven't read this researcher's full paper, but I would be sad (tho not shocked…) if it didn't cite Chomsky up front. Beyond your specific point re: interfaces (the OG Syntactic Structures has more commentary on that), he's been saying what she's saying here for about half a century. He's too humble/empirical to ever say it without qualifiers, but IMO the truth is clear when viewed holistically: language is a byproduct of hierarchical thought, not the progenitor.

This (awesome!) researcher would likely disagree with what I’ve just said based on this early reference:

  In the early 2000s I really was drawn to the hypothesis that maybe humans have some special machinery that is especially well suited for computing hierarchical structures.
…with the implication that they're not, actually. But I think that's an absurd overcorrection for anthropocentric bias: humans are uniquely capable of a whole host of tasks, and the difference is clearly a qualitative one, not a gradation. No ape has ever asked a question, just like no plant has ever conceptualized a goal, and no rock has ever computed indirect reactions to stimuli.
slibhb No.41891350
Chomsky is shockingly unhumble. I admire him, but he's a jerk who treats people who disagree with him with contempt. It's fun to read him doing this, but it's uncollegial (to say the least).

Also, calling "generative grammar" productive seems wrong to me. It's been around for half a century -- what tools has it produced? At some point theory needs to come into contact with empirical reality. As far as I know, generative grammar has just never gotten to this point.

1. bbor No.41895665
Well, it’s the basis of how we specify and parse programming languages (see the sketch below). That seems pretty helpful :) Otherwise it’s hard to measure what exactly “real world utility” looks like. What have the other branches of linguistics brought us? What has any human science brought us, really? Even the most empirical one, behavioral psychology, seems hard to correlate with concrete benefits. I guess the best case would be “helps us analyze psychiatric drug efficacy”?
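
To make that concrete, here's a minimal sketch: programming languages are specified with context-free grammars (BNF is essentially Chomsky's Type-2 rewrite rules), and a parser is just a program that recognizes such a grammar. The toy grammar and recursive-descent parser below are illustrative, not taken from any particular compiler:

  # A context-free grammar for arithmetic, written in BNF
  # (the notation Backus and Naur adapted for ALGOL):
  #
  #   expr   ::= term (("+" | "-") term)*
  #   term   ::= factor (("*" | "/") factor)*
  #   factor ::= NUMBER | "(" expr ")"
  
  import re
  
  def tokenize(src):
      # Split input into numbers, operators, and parentheses.
      return re.findall(r"\d+|[-+*/()]", src)
  
  def parse(tokens):
      pos = 0
  
      def peek():
          return tokens[pos] if pos < len(tokens) else None
  
      def eat(tok=None):
          # Consume one token, optionally checking it's the expected one.
          nonlocal pos
          cur = peek()
          if tok is not None and cur != tok:
              raise SyntaxError(f"expected {tok!r}, got {cur!r}")
          pos += 1
          return cur
  
      def expr():
          # expr ::= term (("+" | "-") term)*
          node = term()
          while peek() in ("+", "-"):
              node = (eat(), node, term())
          return node
  
      def term():
          # term ::= factor (("*" | "/") factor)*
          node = factor()
          while peek() in ("*", "/"):
              node = (eat(), node, factor())
          return node
  
      def factor():
          # factor ::= NUMBER | "(" expr ")"
          if peek() == "(":
              eat("(")
              node = expr()
              eat(")")
              return node
          return int(eat())
  
      tree = expr()
      if peek() is not None:
          raise SyntaxError(f"trailing input at {peek()!r}")
      return tree
  
  print(parse(tokenize("2 * (3 + 4)")))   # ('*', 2, ('+', 3, 4))

The nesting of expr/term/factor is exactly the kind of hierarchical structure generative grammar is about: operator precedence falls out of the grammar itself rather than being special-cased.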

Generally, I absolutely agree that he is not humble in the sense of expressing doubt about his strongly held beliefs. He’s been saying pretty much the same things for decades, and does not give much room for disagreement (and ofc this is all ratcheted up in intensity in his political stances). I’m using humble in a slightly different way, tho: he insists on qualifying basically all of his statements about archaeological anthropology with “we don’t have proof yet” and “this seems likely”, because of his fundamental belief that we’re in a “pre-Galilean” (read: shitty) era of cognitive science.

In other words: he’s absolutely arrogant about his core structural findings and the utility of his program, but he’s humble about the final application of those findings to humanity.

2. slibhb No.41903909
It's a fair point that Chomsky's ideas about grammars are used in parsing programming languages. But linguistics is supposed to deal with natural languages -- what has Chomskyan linguistics accomplished there?

Contrast that with the statistical approach: it's easy to point to something like Google Translate. If Chomsky's approach gave us a tool like that, I'd have no complaint. But my sense is that it just hasn't panned out.