
A non-anthropomorphized view of LLMs

(addxorrol.blogspot.com)
475 points | zdw | 2 comments
mewpmewp2 No.44485205
My question: how do we know this is not similar to how human brains work? What seems intuitively logical to me is that our brains evolved through random mutations, yielding a structure shaped by its own evolutionary reward algorithms, a structure that at any point is trying to predict the next actions that maximise survival and procreation, with many subgoals in between. It ultimately became very complex machinery, yet in theory it should be easy to simulate, given enough compute and permissive physical constraints.

Because morals, values, consciousness, etc. could just be subgoals that arose through evolution because they support the main goals of survival and procreation.

And if it is baffling to think that such a system could arise, how do you think life and humans came to exist in the first place? How could that be possible? It already happened from a far unlikelier and stranger starting point. And wouldn't you think that, in theory, the whole world and its timeline could be represented as a deterministic function? And if not, why should "randomness" or anything else be what brings life into existence?

replies(4): >>44485240 #>>44485258 #>>44485273 #>>44488508 #
1. latexr No.44488508
> how do we know that this is not similar to how human brains work.

Do you forget every conversation as soon as you have them? When speaking to another person, do they need to repeat literally everything they said and that you said, in order, for you to retain context?

If not, your brain does not work like an LLM. If yes, please stop what you’re doing right now and call a doctor with this knowledge. I hope Memento (2000) was part of your training data; you’re going to need it.

replies(1): >>44491005 #
2. mewpmewp2 No.44491005
Knowledge of every conversation must be some form of state in our minds, just as for an LLM it could be something retrieved from a database, no? I don't think information storage or retrieval is necessarily the most important achievement here in the first place. It's the emergent abilities that you wouldn't have expected to occur.
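The statelessness point both comments circle can be made concrete with a minimal sketch. This is not any real model API; `fake_llm` is a hypothetical, deterministic stand-in for a model call. The idea it illustrates is standard, though: a model call is a pure function of its input, so "memory" across turns only exists if something outside the model (a chat frontend resending the history, or a database lookup) puts it back into the prompt.

```python
def fake_llm(prompt: str) -> str:
    """Hypothetical stand-in for a stateless model call.

    It has no memory between calls; it can only react to
    whatever text is in the prompt it receives right now.
    """
    if "Alice" in prompt:
        return "You said your name is Alice."
    return "I don't know your name."

# Turn 1: the user states a fact.
history = ["My name is Alice."]

# Turn 2, sent alone: the fact from turn 1 is simply gone.
reply_stateless = fake_llm("What is my name?")

# Turn 2, with the prior conversation prepended (what chat
# frontends do behind the scenes on every request):
reply_with_context = fake_llm("\n".join(history + ["What is my name?"]))
```

Whether that external store is the full transcript resent verbatim or a retrieved snippet from a database changes the engineering, but not the underlying fact that the model itself carries no state between calls.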