Somebody used this paper to coin the term "batfished", which they defined as being fooled into ascribing subjectivity to a non-sentient actor (e.g. an AI).
https://partiallyexaminedlife.com/2025/06/30/what-is-it-like...
That's a question I actually asked myself.
From the point of view of an LLM, words are everything. We have hands, bats have echolocation, and LLMs have words, just words. How does an LLM feel when two words match perfectly? Is it hurt by typos?
It may feel silly to ascribe consciousness to LLMs. I mean, we know how they work; it's just a bunch of matrix operations. But does that mean they are not conscious? Do things stop being conscious once we understand them? For me, consciousness is like a religious belief. It is unfalsifiable and unscientific, and we don't even have a precise definition, but it is something we feel deep inside of us, and it guides our moral choices.