
What is it like to be a bat?

(en.wikipedia.org)
180 points by adityaathalye | 4 comments
scubakid
To me, "what is it like to be a ..." is more or less a question about the intersection of sensory modalities between two systems... but I'm not sure the extent of that overlap tells you much about whether a given system is "conscious" or not.
1. kelseyfrog
Pretty much the same conclusion here. Consciousness is what we feel when sheaf 1-cohomology among our different senses vanishes.

Bringing it back to bats, a failure to imagine what it's like to be a bat just indicates that the overlaps between human and bat modalities don't admit a coherent gluing that humans can inhabit phenomenally.
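Loosely, the gluing condition being invoked here (my own formalization of the comment, not a standard claim about perception) is the Čech-style one: a "local experience" section over each sensory modality, with a single global experience existing exactly when the locals agree on overlaps, and the obstruction measured in degree-1 cohomology:

% s_i = local section ("local experience") over modality U_i;
% the locals glue into one global experience s iff they agree on overlaps:
\[
  s_i|_{U_i \cap U_j} = s_j|_{U_i \cap U_j} \ \text{for all } i, j
  \quad\Longleftrightarrow\quad
  \exists\, s \ \text{with}\ s|_{U_i} = s_i .
\]
% On this reading, "consciousness" is the vanishing of the obstruction:
% \check{H}^1(\{U_i\}, \mathcal{F}) = 0.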

2. ants_everywhere
> Pretty much the same conclusion here. Consciousness is what we feel when sheaf 1-cohomology among our different senses vanishes.

There's something more to it than this.

For one thing, there's a threshold of awareness. Your mind is constantly doing things and having thoughts that don't rise to the threshold of awareness. You can observe more of this if you meditate and less of it if you constantly distract yourself. But consciousness IMO should have the idea of a threshold baked in.

For another, the brain will unify things that don't make sense. I assume you mean something like consciousness is what happens when there aren't obstructions to stitching sensory data together. But the brain does a lot of work interpreting incoherent data as best it can. It doesn't have to limit itself to coherent data.

3. kelseyfrog
I'll have to reflect more on the first part, but as for

> It doesn't have to limit itself to coherent data.

There are specific failure cases for non-integrability:

1. Dissociation/derealization = partial failures of gluing.

2. Nausea = inconsistent overlaps (i.e., large cocycles) interpreted as bodily threat.

3. Anesthesia = disabling of the sheaf functor: no global section possible.

At least for me it provides a consistent working model for hallucinogenic experiences, synesthesia, phantom-limb phenomena, and split-brain scenarios. If anything, the ways in which sensory integration fails are more interesting than the ways in which it succeeds.
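A toy sketch of this taxonomy (hypothetical names and thresholds throughout -- it reduces each pairwise overlap to a single discrepancy number standing in for a cocycle, which is a big simplification):

# Toy model of the failure taxonomy above -- not a real sheaf computation.
# Each key is a pair of modalities; each value is the size of their
# disagreement on the overlap (a stand-in for a Cech 1-cocycle magnitude).

def classify_integration(overlaps, glue_tol=0.1, threat_tol=1.0):
    """Classify global sensory integration from pairwise discrepancies."""
    if overlaps is None:
        # No sections at all: nothing to glue.
        return "anesthesia: no global section possible"
    bad = {pair: d for pair, d in overlaps.items() if d > glue_tol}
    if not bad:
        return "coherent: local senses glue into one global experience"
    if any(d > threat_tol for d in bad.values()):
        # Large cocycles read as bodily threat.
        return "nausea: large inconsistencies on overlaps"
    # Some overlaps glue, some don't.
    return "dissociation/derealization: partial failure of gluing"

print(classify_integration({("vision", "vestibular"): 2.3,
                            ("vision", "touch"): 0.05}))
# -> nausea: large inconsistencies on overlaps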

4. ants_everywhere
Yeah, to be clear, I like this mental model a lot, and I give it extra points for invoking sheaf theory :). I was just saying it doesn't seem complete to me from a psychological perspective.

The way I look at it is that the sensors provide data as activations, and awareness is some output gated by a thresholding or activation function.

Sense-making and consciousness, in my mental model, are something that happens after the fact, and they try to happen even with nonsense data -- as opposed to, as I read you to be leaning toward, being a consequence of the sensory data standing in a sufficiently nice relationship with each other.
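A minimal sketch of that picture as I understand it (my toy formulation, not ants_everywhere's actual model): sensors emit activations, a threshold gates what reaches awareness, and sense-making then runs on whatever got through, coherent or not:

def awareness(activations, threshold=0.5):
    """Gate raw sensor activations: only supra-threshold signals
    reach awareness; everything else stays sub-conscious."""
    return {name: a for name, a in activations.items() if a >= threshold}

def make_sense(aware):
    """Sense-making runs after the gate and never refuses its input:
    it produces *some* interpretation even from incoherent data."""
    if not aware:
        return "blank: nothing crossed the threshold"
    # Crude best-effort interpretation: just rank what got through.
    ranked = sorted(aware, key=aware.get, reverse=True)
    return "experience dominated by: " + ", ".join(ranked)

signals = {"vision": 0.9, "vestibular": 0.7, "gut": 0.2}
print(make_sense(awareness(signals)))
# -> experience dominated by: vision, vestibular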