
549 points | orcul | 1 comments
1. necovek
While getting confirmation of this relationship (or lack of it) is exciting, none of this is surprising: language is a tool we "developed" further through our cognitive processes, but ultimately other primates use language as well.

The one thing I wonder is whether it's mostly "code duplication": would we be able to develop language using a different region of the brain, or do we actually do cognitive processing in the language region too?

In other words, is this simply deciding to send language processing to the GPU even if we could do it with the CPU (to illustrate my point)?

How would one even devise an experiment to prove or disprove this?

To me it seems obvious that our language generation and processing regions really involve cognition as well, since languages are very much rule-based (even if they came about in reverse: first the language, then the rules). Could we get both regions to light up in brain imaging when we hit tricky words that we aren't sure how to spell, or that we have to adapt to context, like declensions of foreign words?

> But you can build these models that are trained on only particular kinds of linguistic input or are trained on speech inputs as opposed to textual inputs.

As someone from this side of the "fence" (mathematics and CS, though currently only a practicing software engineer), I don't think LLMs provide this opportunity in any way comparable to human minds.

Comparing the performance of small kids developing their language skills (I've only had two, but one is enough for a proof by contradiction) to LLMs (in particular for Serbian): LLMs like ChatGPT had a much broader vocabulary, but the kids were much better at figuring out complex language rules from a very limited number of inputs (visible when they made mistakes on exceptions by following a "rule" at age 2 or younger).

The amount of training input GenAI needs is multiple orders of magnitude larger compared to young kids.
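To put "multiple orders of magnitude" in rough numbers, a minimal back-of-the-envelope sketch (both figures below are my own ballpark assumptions, not measurements from the article: roughly 10 million words heard by a young child, and roughly 10 trillion training tokens for a large modern LLM):

```python
import math

# Ballpark assumptions (not measured values):
child_words = 10_000_000          # ~1e7 words heard by a young child
llm_tokens = 10_000_000_000_000   # ~1e13 tokens in a large LLM's training set

# How many orders of magnitude more input the LLM consumed
orders_of_magnitude = math.log10(llm_tokens / child_words)
print(f"LLM training input exceeds a child's by ~{orders_of_magnitude:.0f} orders of magnitude")
```

Even if the assumed figures are off by a factor of ten in either direction, the gap stays at several orders of magnitude, which is the point of the comparison.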

Though it's not a fair comparison: kids learn language through listening, imitation, watching and smelling, all in context (you'll talk about bread at breakfast).

So let's be careful in considering LLMs a model of a human language process.