A non-anthropomorphized view of LLMs

(addxorrol.blogspot.com)
475 points zdw | 2 comments
dtj1123 ◴[] No.44488004[source]
It's possible to construct a similar description of whatever it is the human brain is doing that clearly fails to capture the fact that we're conscious. Take a cross section of every nerve feeding into the human brain at a given time T; the action potentials across those cross sections can be embedded in R^n. Take the history of those action potentials across the lifetime of the brain and you get a continuous path through R^n that maps roughly onto your subjectively experienced personal history, since your brain necessarily builds your experienced reality from this signal data moment to moment.

Now take the cross sections of every nerve feeding OUT of your brain at time T: you get another set of action potentials, embeddable in R^m, which partially determines the state of the R^n embedding at time T + delta. This is not meaningfully different from the higher-dimensional game of snake described in the article, more or less reducing the experience of being a human to 'next nerve impulse prediction', but it obviously fails to capture the significance of the computation that determines what that next output should be.
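The framing above can be sketched as a toy dynamical system. This is purely illustrative (not a brain model): the dimensions n and m and the linear maps W_out and W_env are arbitrary placeholders for whatever the brain and environment actually compute; the point is just that the input history forms a path in R^n whose next point is partially determined by the R^m output.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, T = 8, 3, 5  # input dim, output dim, number of time steps (arbitrary)

# Placeholder maps: "brain" takes the R^n input state to an R^m output;
# "environment" feeds that output back into the next R^n input state.
W_out = rng.normal(size=(m, n))
W_env = rng.normal(size=(n, m))

x = rng.normal(size=n)   # input cross-section (action potentials) at time T
path = [x]               # the path through R^n over time
for _ in range(T):
    y = W_out @ path[-1]              # R^m output at time T
    external = rng.normal(size=n)     # input not determined by our own output
    path.append(W_env @ y + external) # output *partially* determines T + delta

print(len(path))  # T + 1 points in R^n
```

The `external` term is what makes the determination only partial, matching the comment's claim: the output at time T influences, but does not fix, the next input state.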
replies(2): >>44488152 #>>44488197 #
1. bravesoul2 ◴[] No.44488152[source]
My suspicion is that the brain is better modelled with natural or rational numbers than with the reals. The reals just hold too much information.
replies(1): >>44488589 #
2. dtj1123 ◴[] No.44488589[source]
Inclined to agree, but most thermal physics uses the reals because they're simpler to work with, so I think they're OK here for the purposes of argument.