
A non-anthropomorphized view of LLMs

(addxorrol.blogspot.com)
475 points | zdw | 1 comment | source
elliotto ◴[] No.44485762[source]
To claim that LLMs do not experience consciousness requires a model of how consciousness works. The author has not presented a model, and instead relied on emotive language leaning on the absurdity of the claim. I would say that any model one presents of consciousness often comes off as just as absurd as the claim that LLMs experience it. It's a great exercise to sit down and write out your own perspective on how consciousness works, to feel out where the holes are.

The author also claims that a function (R^n)^c -> (R^n)^c is dramatically different from the human experience of consciousness. Yet the author's text that I am reading, and any information they can communicate to me, exist entirely within (R^n)^c.
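
For concreteness, here is a minimal sketch of the shape that notation describes, under my own reading (not stated in the thread) that n is an embedding dimension and c a context length, so an element of (R^n)^c is a sequence of c vectors in R^n. The toy "model" is just a placeholder nonlinearity, not a real transformer; it only illustrates that one deterministic forward pass has the type (R^n)^c -> (R^n)^c.

    # Sketch of the "(R^n)^c -> (R^n)^c" reading of an LLM forward pass.
    # Assumptions (mine, not the thread's): n = embedding dimension,
    # c = context length; an input is a sequence of c vectors in R^n,
    # and the output is another sequence of c vectors in R^n.
    import numpy as np

    n = 8  # embedding dimension (illustrative)
    c = 4  # context length (illustrative)

    def llm_step(x: np.ndarray, W: np.ndarray) -> np.ndarray:
        """Toy stand-in for one forward pass: shape (c, n) -> (c, n).

        A real transformer (attention + MLPs) is far more involved, but it
        shares this type: a pure function from a sequence of vectors to a
        sequence of vectors with the same dimensions.
        """
        assert x.shape == (c, n)
        return np.tanh(x @ W)  # placeholder nonlinearity, same output shape

    W = np.random.default_rng(0).normal(size=(n, n))
    context = np.random.default_rng(1).normal(size=(c, n))
    out = llm_step(context, W)
    print(out.shape)  # (4, 8): still an element of (R^n)^c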

replies(4): >>44485798 #>>44487957 #>>44488208 #>>44490162 #
1. seadan83 ◴[] No.44487957[source]
I believe the author is instead drawing this distinction:

LLMs: (R^n)^c -> (R^n)^c

Humans: [set of potentially many and complicated inputs that we effectively do not understand at all] -> (R^n)^c

The point is that how consciousness works is unknown, so the author would not present such a model; the absence of a model is the point.