
What is it like to be a bat?

(en.wikipedia.org)
180 points adityaathalye | 20 comments
mistidoi ◴[] No.45119208[source]
Somebody used this paper to coin the term "batfished", which they defined as being fooled into ascribing subjectivity to a non-sentient actor (i.e. an AI).

https://partiallyexaminedlife.com/2025/06/30/what-is-it-like...

replies(3): >>45120663 #>>45121301 #>>45126585 #
HarHarVeryFunny ◴[] No.45121301[source]
Nagel's "What is it like to be a bat?" assumes that bats are conscious, and that the question of what is the subjective experience of being a bat (e.g. what does the sense of echolocation feel like) is therefore a meaningful question to ask.

The author who invented "batfished" also believes bats to be conscious, so it seems a very poorly conceived word, and anyway unnecessary, since "anthropomorphize" works just fine... "You've just gaslit yourself by anthropomorphizing the AI".

replies(2): >>45121791 #>>45122543 #
1. glenstein ◴[] No.45122543[source]
I understand that we may not have demonstrated to a level of absolutely provable certainty that bats are definitely conscious, but there are very powerful intuitive reasons for believing they are, to the point that I'm not particularly concerned about this being a weak link in any philosophical musing on consciousness.
replies(3): >>45123783 #>>45124839 #>>45130914 #
2. NoMoreNicksLeft ◴[] No.45123783[source]
>I understand that we may not have demonstrated to a level of absolutely provable certainty that bats are definitely conscious, but there

We haven't demonstrated even modest evidence that humans are conscious. No one has bothered to put in the effort to define consciousness in a way that is empirically/objectively testable. It is a null concept.

replies(3): >>45123820 #>>45123890 #>>45131107 #
3. goatlover ◴[] No.45123820[source]
Qualia is the philosophical term for subjective sensations and feelings. It's what our experiences consist of. Why must a concept be empirical and objective? Logical positivism is flawed because the principle of verification cannot be verified.

Nagel's paper deals with the fundamental divide between subjectivity and objectivity. That's the point of the bat example. We know there are animals that have sensory capabilities we don't. But we don't know what the resulting sensations are for those creatures.

replies(3): >>45124576 #>>45124698 #>>45130935 #
4. phreeza ◴[] No.45123890[source]
There are attempts at a quantitative definition of consciousness, for example https://en.m.wikipedia.org/wiki/Integrated_information_theor...
5. GoblinSlayer ◴[] No.45124576{3}[source]
>Logical positivism is flawed because the principle of verification cannot be verified.

Why not? It works, thus it verifies itself.

replies(2): >>45126335 #>>45128813 #
6. scott_w ◴[] No.45124698{3}[source]
> Why must a concept be empirical and objective?

Because otherwise it's your word against mine and, since we both probably have different definitions of consciousness, it's hard to have a meaningful debate about whether bats, cats, or AI have consciousness.

I'm reminded of a conversation last year where I was accused of "moving the goalposts" in a discussion on AI because I kept pointing out differences between artificial and human intelligence. Such an accusation is harder to make when we have a clearly defined and measurable understanding of what things like consciousness and intelligence are.

7. visarga ◴[] No.45124839[source]
> I understand that we may not have demonstrated to a level of absolutely provable certainty that bats are definitely conscious

We have not proven "to a level of absolutely provable certainty" that other humans are also conscious. You can only tell you are conscious yourself, not others. The whole field of consciousness is based on analyzing something for which we have sample size n=1.

They say "because of similar structure and behavior" we infer others are also conscious. But that is a copout, we are supposed to reject behavioral and structural arguments (from 3rd person) in discussion about consciousness.

Not only that, but what would be an alternative to "it feels like something?" - we can't imagine non-experience, or define it without negation. We are supposed to use consciousness to prove consciousness while we can't even imagine non-consciousness except in an abstract, negation-based manner.

Another issue I have with the qualia framing is that nobody talks about costs. It costs oxygen and glucose to run the brain. It costs work, time, energy, materials, opportunity and social debt to run it. It does not sit in a platonic world.

replies(2): >>45130497 #>>45152686 #
8. rcxdude ◴[] No.45126335{4}[source]
So do an infinite number of sets of statements which include a false one. Circular arguments are obviously not reliable.
replies(1): >>45127397 #
9. GoblinSlayer ◴[] No.45127397{5}[source]
That's a hypothesis of a counterexample, though, not a fact of a counterexample.
10. goatlover ◴[] No.45128813{4}[source]
It's not working if it excludes subjective experience by definition. Makes it useless for the consciousness debate.
replies(1): >>45129074 #
11. GoblinSlayer ◴[] No.45129074{5}[source]
It doesn't exclude subjective experience.
12. glenstein ◴[] No.45130497[source]
>We have not proven "to a level of absolutely provable certainty" that other humans are also conscious

Sure, it's not proven; it just has overwhelmingly strong empirical and intuitive reasons for being most likely true, which is the most we can say while still showing the necessary humility about the limits of knowledge.

You seem to treat this like it presents a crisis of uncertainty, whereas I think it's exactly the opposite, and in fact I already said as much with respect to bats. Restating the case in human terms, from my perspective, is reaffirming that there's no problem here.

>we are supposed to reject behavioral and structural arguments (from 3rd person) in discussion about consciousness.

Says who? That presupposes that consciousness is already of a specific character before the investigation is even started, which is not an empirical attitude. And as I noted in a different comment, we have mountains of empirical evidence from the outside about the necessary physical conditions for consciousness, to the point of being able to successfully predict internal mental states. Everything from psychedelic drugs to sleep to concussions to brain-machine interfaces to hearing aids to lobotomies to face recognition research gives us evidence of the empirical world interfacing with conscious states in important ways that rely on physical mechanisms.

Similarity in structure and behavior is an excellent reason for taking a provisional attitude in favor of the consciousness of other creatures, for all the usual reasons that empirical attitudes work and are capable of being predictive, which we're familiar with from their application elsewhere.

"But consciousness is different" you say. Well it could be, that that's a matter for investigating, not something to be definitionally pre-supposed based on vibes.

>Not only that, but what would be an alternative to "it feels like something?"

It not feeling like something, for one. So, inert objects that aren't alive, possibly vegetative states, blackouts from concussions or drugs, p-zombies, notions of mind that attempt to define away qualia and say it's all "information processing" (with no specific commitment to that feeling like something), possibly some variations of psychedelic feeling that emphasize a transcendent sense of oneness with the universe. But fundamentally, it's an affirmative assertion of it feeling like something, in contrast to noncommittal positions on the question, which is a meaningful point rather than something trivially true by definitional necessity.

>Another issue I have with the qualia framing is that nobody talks about costs. It costs oxygen and glucose to run the brain. It costs work, time, energy, materials, opportunity and social debt to run it. It does not sit in a platonic world.

That would seem to run contrary to the point you were making above about it not being inferable from phenomena characterized in the third person. You can't argue that third-person descriptions of structures that seem necessary for consciousness are a "cop-out" and then turn around and say you know it "costs" things expressed in those same third-person terms. Like you said before, your position seems to be that you only know you are conscious, so you don't even know if other people are conscious at all, let alone that they need such things as work, time, oxygen, or glucose. Pointing to those is a cop-out, right?

13. NoMoreNicksLeft ◴[] No.45130935{3}[source]
>Why must a concept be empirical and objective?

You are an LLM that is gibbering up hallucinations. I have no need for those.

>Nagel's paper deals with the fundamental divide between subjectivity and objectivity. That's the point of the bat example.

There is no point to it. It is devoid of insight. This happens when someone spends too many years in the philosophy department of the university: they train themselves to believe the absurd proposition that they think profound thoughts. You live in an objective universe, and any appearance to the contrary is an illusion caused by imperfect cognition.

>But we don't know what the resulting sensations are for those creatures.

Not that it would offer any secret truths, but the ability to "sense" where objects are roughly, in 3D space, with low resolution, large margins of error, and narrow directionality... most of the people reading this comment would agree that they know what that feels like if they thought about it for a few seconds. That's just not insightful. Only a dimwit with little imagination could bother to ask the question "what is it like to be a bat", but it takes a special kind of grandiosity to think that the dimwit question marks them as a genius.

replies(1): >>45132018 #
14. glenstein ◴[] No.45131107[source]
I said this in a different comment but I want to paste it here as well, since a lot of people seem to think "we don't even have a definition" is a show-stopping smackdown. But it isn't.

You can't, and honestly don't need to, start from definitions to be able to do meaningful research and have meaningful conversations about consciousness (though it certainly would be preferable to have one rather than not have one).

There are many research areas where the object of research is to know something well enough that you could converge on such a thing as a definition, e.g. dark matter, intelligence, colony collapse disorder, SIDS. We can nevertheless progress in our understanding of them in a whole motley of strategic ways: studying the cases that best exhibit salient properties, tracing the outer boundaries of the problem space, tracking the central cluster of "family resemblances" that seem to characterize the problem, entertaining candidate explanations that are closer or further away, etc. Essentially a practical attitude.

I don't doubt in principle that we could arrive at such a thing as a definition that satisfies most people, but I suspect you're more likely to have that at the end than the beginning.

replies(1): >>45138203 #
15. glenstein ◴[] No.45132018{4}[source]
>Not that it would offer any secret truths, but the ability to "sense" where objects are roughly, in 3d space, with low resolution and large margins of error, and narrow directionality... most of the people reading this comment would agree that they know what that feels like if they thought about it for a few seconds.

I don't think that's quite right. It's convenient that bats are the example here, because they build out their spatial sense of the world primarily via echolocation whereas humans (well, with some exceptions) do it visually. Snakes can infer directionality from chemical gradients sampled by their forked tongue (and pit vipers from heat signatures via pit organs), and people can do it with a fascinating automatic mechanism built into the brain that compares subtle differences in timing and intensity between the left and right ears, keeping the data to itself but kicking the sense of direction "upstairs" into conscious awareness. There are different sensory paths to the same information, and evolution may be capable of any number of qualitative states unlike the ones we're familiar with.

Some people here even seem to think that consciousness is "basic" in a way that maps onto nothing empirical at all, which, if true, opens a Pandora's box of any number of modes of being. But the point of the essay is to contrast this idea with other approaches to consciousness that either (1) are non-committal, (2) emphasize something else like "self-awareness" or abstract reasoning, or (3) are ambiently appreciative of qualitative states but don't elevate them to a fundamental or definitional necessity the way it's argued for in the essay.

The whole notion of a "hard" problem can probably be traced to this essay, which stresses that explanations need to be more than pointing to empirical correlates. In a sense I think the point is obvious, but I also think it's a real argument, because it contrasts that necessity with a non-committal stance that I think is kind of a default attitude.

16. NoMoreNicksLeft ◴[] No.45138203{3}[source]
Dark matter is easily defined as "mass that cannot be detected by the current technology except that it affects the gravitation of galaxies". It is a detectable phenomenon. It is a measurable phenomenon.

Not having a definition is the show-stopping smackdown you say it is not. You are not a conscious being, there is no such thing as consciousness. You believe in an uninteresting illusion that you cannot detect or measure.

replies(1): >>45139020 #
17. glenstein ◴[] No.45139020{4}[source]
That's not a real definition; that's a placeholder for effects downstream of the real thing that isn't yet defined, the very kind of working definition I was talking about to begin with. We still don't know if it's WIMPs, axions, modifications of gravity, or something else entirely. If we do figure out that it's something like those, that would be the definition, and you would be able to tell the difference between that and the thing you are presently calling a definition.

And, thankfully, a future physicist would not dismiss that out of hand, because they would appreciate its utility as a working definition while research was ongoing.

replies(1): >>45163161 #
18. HarHarVeryFunny ◴[] No.45152686[source]
> We have not proven "to a level of absolutely provable certainty" that other humans are also conscious. You can only tell you are conscious yourself, not others. The whole field of consciousness is based on analyzing something for which we have sample size n=1.

That sounds like you are talking about subjective experience, the qualia of senses and being, rather than consciousness (the ability to self-observe), unless you are using "consciousness" as a catch-all term to refer to all of the above (which is the problem with discussing consciousness - it's an overloaded, ill-defined word, and people don't typically define what they are actually talking about).

If we make this distinction between consciousness, defined as the ability to self-observe, and subjective qualia (what something feels like), then it seems there is little reason to doubt that others reporting conscious awareness really are aware of what they are reporting, and anyway, given common genetics and brain anatomy, it'd be massively unexpected if one (healthy) person had access to parts of their internal state and others didn't.

> Not only that, but what would be an alternative to "it feels like something?" - we can't imagine non-experience, or define it without negation. We are supposed to use consciousness to prove consciousness while we can't even imagine non-consciousness except in an abstract, negation-based manner.

Perhaps the medical condition of "blindsight" gives some insight - where damage to the visual cortex can result in people having some proven visual ability but no conscious awareness of it. They report themselves as blind, but can be tasked with walking down a cluttered corridor and manage to navigate the obstacles nonetheless. They have lost visual consciousness due to brain damage, but retain at least some level of vision.

replies(1): >>45162499 #
19. glenstein ◴[] No.45162499{3}[source]
>That sounds like you are talking about subjective experience, qualia of senses and being, rather than consciousness (ability to self-observe), unless you are using "consciousness" as catch-all term

While I have a lot of problems with their comment (which I elaborated on in a reply of my own), I don't think that using it as a catch-all term is a problem (to the extent that they would agree with that characterization). In fact, I think it's truer to the spirit of the problem than the definition you're offering. I think a lot of the time when people make the objection that we haven't defined it, they're not just saying we haven't selected from one of several available permutations; I take it to mean that there's a fundamental sense in which the idea itself hasn't agreeably crystallized into a definition, which, among other things, is a meta question about which of the competing definitions is the right one to use.

I do think there is a tension in that position, because it creates a chicken-and-egg problem where you can't research it until you define it, but you can't define it until you research it. But I think there's a way out of it: treating them as integrally related, and taking a practical attitude of believing in the possibility of progress without yet having a final answer in hand.

I understand that for some people this notion of self-reflection is key, but I think choosing to prioritize other things can be done for good reasons rather than, as you seem to be contending, from having accidentally skipped the step of selecting a preferred definition from a handful of alternatives and not having selected the best one. My feeling is much closer to that of the article, at least in a certain way, which is about the fact that there's "something it's like to be" at all, prior to the question of whether there's self-reflection.

In fact, I'd be curious to know what you call the mental state of being for such things as creatures with a kind of outwardly directed awareness of the world, with qualia, with "something it's like to be", but which fall short of having self-reflective mental states. Because if your term for such things is that they don't involve consciousness, I think it's not the GP who is departing from appropriate definitions. And if self-reflection is necessarily implied in the having of such things as qualia, then you could say it's implicitly accounted for by someone who wants to talk about qualia.

20. NoMoreNicksLeft ◴[] No.45163161{5}[source]
>That's not a real definition, that's a placeholder for

Blah blah blah blahblah. If you can give me a definition even as poor as the one I gave for dark matter, that's all we're asking for. We don't need an explanation of the mechanism; we only need a way to measure the phenomenon. But you can't even do that.