
169 points mattmarcus | 6 comments
EncomLab ◴[] No.43612568[source]
This is like claiming a photoresistor-controlled night light "understands when it is dark" or that a bimetallic strip thermostat "understands temperature". You can say those words, and it's syntactically correct, but it's entirely incorrect semantically.
replies(6): >>43612607 #>>43612629 #>>43612689 #>>43612691 #>>43612764 #>>43612767 #
1. fallingknife ◴[] No.43612691[source]
Or like saying the photoreceptors in your retina understand when it's dark. Or like claiming the temperature sensitive ion channels in your peripheral nervous system understand how hot it is.
replies(4): >>43612763 #>>43612981 #>>43613088 #>>43613136 #
2. throwuxiytayq ◴[] No.43612763[source]
Or like saying that the tangled web of neurons receiving signals from these understands anything about these subjects.
3. nativeit ◴[] No.43612981[source]
Describing the mechanics of nervous impulses != describing consciousness.
replies(1): >>43613340 #
4. ◴[] No.43613088[source]
5. throw4847285 ◴[] No.43613136[source]
This is a fallacy I've seen enough on here that I think it needs a name. Maybe the fallacy of Theoretical Reducibility (doesn't really roll off the tongue)?

When challenged, everybody becomes an eliminative materialist even if it's inconsistent with their other views. It's very weird.

6. LordDragonfang ◴[] No.43613340[source]
Which is the point, since describing the mechanics of LLM architectures does not inherently grant knowledge of whether or not they are "conscious".