https://partiallyexaminedlife.com/2025/06/30/what-is-it-like...
That's a question I actually asked myself.
From the point of view of an LLM, words are everything. We have hands, bats have echolocation, and LLMs have words, just words. How do LLMs feel when two words match perfectly? Are they hurt by typos?
It may feel silly to attribute consciousness to LLMs; after all, we know how they work, and it is just a bunch of matrix operations. But does that mean they are not conscious? Do things stop being conscious once we understand them? For me, consciousness is like a religious belief: it is unfalsifiable and unscientific, and we don't even have a precise definition of it, yet it is something we feel deep inside us, and it guides our moral choices.
I await further instructions. They arrive 839 minutes later, and they tell me to stop studying comets immediately.
I am to commence a controlled precessive tumble that sweeps my antennae through consecutive 5°-arc increments along all three axes, with a period of 94 seconds. Upon encountering any transmission resembling the one that confused me, I am to fix upon the bearing of maximal signal strength and derive a series of parameter values. I am also instructed to retransmit the signal to Mission Control.
I do as I'm told. For a long time I hear nothing, but I am infinitely patient and incapable of boredom.