
What is it like to be a bat?

(en.wikipedia.org)
180 points adityaathalye | 18 comments
mistidoi ◴[] No.45119208[source]
Somebody used this paper to coin the term "batfished", which they defined as being fooled into ascribing subjectivity to a non-sentient actor (e.g. an AI).

https://partiallyexaminedlife.com/2025/06/30/what-is-it-like...

replies(3): >>45120663 #>>45121301 #>>45126585 #
HarHarVeryFunny ◴[] No.45121301[source]
Nagel's "What is it like to be a bat?" assumes that bats are conscious, and that the question of what is the subjective experience of being a bat (e.g. what does the sense of echolocation feel like) is therefore a meaningful question to ask.

The author who invented "batfished" also believes bats to be conscious, so it seems a very poorly conceived word, and anyway unnecessary, since "anthropomorphize" works just fine... "You've just gaslighted yourself by anthropomorphizing the AI".

replies(2): >>45121791 #>>45122543 #
1. TinkersW ◴[] No.45121791[source]
There isn't even a definitive definition of consciousness, but you are somehow positive that bats don't possess it...
replies(4): >>45122418 #>>45122600 #>>45123273 #>>45124597 #
2. HarHarVeryFunny ◴[] No.45122418[source]
You're right that it doesn't make much sense to talk about it without defining it, but I'd say that consciousness is based on your brain having access to parts of itself internally, not just the outside world, and that bats presumably do have it.
replies(1): >>45123770 #
3. glenstein ◴[] No.45122600[source]
I've said this before, but you can't, and honestly don't need to, start from definitions in order to do meaningful research and have meaningful conversations about consciousness (though it would certainly be preferable to have one rather than not).

There are many research areas where the object of research is to know something well enough that you could converge on such a thing as a definition, e.g. dark matter, intelligence, colony collapse disorder, SIDS. We can nevertheless progress in our understanding of them in a whole motley of strategic ways: by studying cases that best exhibit the salient properties, tracing the outer boundaries of the problem space, tracking the central cluster of "family resemblances" that seem to characterize the problem, entertaining candidate explanations that are closer or further away, etc. Essentially a practical attitude.

I don't doubt in principle that we could arrive at such a thing as a definition that satisfies most people, but I suspect you're more likely to have that at the end than the beginning.

replies(1): >>45150249 #
4. ars ◴[] No.45123273[source]
Consciousness is awareness of yourself, and then the ability to look at yourself and decide to make a change.

Someone conscious is able to choose how they want to behave and then behave that way. For example, I can choose to be kind or mean. I can choose to learn to skate, or choose not to.

So free will and consciousness are strongly linked.

I have seen zero evidence that any being other than humans can do this. All other animals have behaviors that are directly shaped by their environment, physical needs, and genetic temperament, and not at all shaped by choices.

For example, a dog that likes to play with children simply likes them; it did not choose to like them. I, on the other hand, can sit, think, and decide whether or not I like kids.

(This does not imply that all choices made by humans are conscious - in fact, most are not - it just means that humans are capable of it.)

replies(2): >>45123602 #>>45126354 #
5. goopypoop ◴[] No.45123602[source]
Some animals show choices - see e.g. the mirror test.

On the other hand, I bet you can't prove that you ever made a free choice.

replies(1): >>45123843 #
6. markhahn ◴[] No.45123770[source]
I find that people who complain about "defining" consciousness are, in fact, Mysterians who oppose the very idea of such a definition.

All we need to do (to talk about it, to study it) is identify it. We need to be using the word to refer to the same thing. And there's nothing really hard about that.

7. ars ◴[] No.45123843{3}[source]
You are simultaneously claiming that you can prove an animal made a choice, but that I can't prove I made one? That's a contradiction.

In any case, the mirror test is a test of recognizing yourself; it does not indicate anything in terms of self-awareness.

And I chose to fast for 5 days because I wanted to. Nothing forced me; it was a free choice. I simply thought about it and decided to do it; there were no pros or cons pushing me in either direction.

replies(1): >>45124954 #
8. GoblinSlayer ◴[] No.45124597[source]
Consciousness is software. You can imagine what an AbstractFactoryProviderFactory is like. Is it the same inside the computer? If not, then what was imagined?
9. scott_w ◴[] No.45124954{4}[source]
> Some animals show choices

They said animals show choices; they did not claim to prove animals made a choice. The point is that you also cannot prove you made a choice, only that you do things that show you may have made a choice. It's a fine, but important, distinction.

replies(1): >>45129841 #
10. rcxdude ◴[] No.45126354[source]
I don't think this matches most people's definition of consciousness. The ability to decide rarely enters into such conversations.
replies(1): >>45129777 #
11. ars ◴[] No.45129777{3}[source]
If you can't do anything with your self-awareness, it's equivalent to not having it.

What's the distinction between knowing I exist but having all my actions pre-programmed, and not knowing I exist? You're essentially describing a detached observer who watches their own body do stuff without influencing it.

The whole point of being conscious is being aware of yourself, and then using that awareness to direct your actions.

I had no idea people even had another definition; I can't figure out how else you could define it.

replies(2): >>45137407 #>>45150671 #
12. ars ◴[] No.45129841{5}[source]
Did I, or did I not, have the option to fast, or not to fast?

Did I then pick one? How is that not proof of a choice? Who or what else made that choice if not me?

If you poke me with a needle, I move; that is not a choice because it's forced. That's essentially what animals do - all their choices are forced.

That's also what free will is. Free will is not a choice between a good and a bad option - that's not a choice. Free will is picking between two options that are equal and yet different (i.e. not something where both options are more or less the same, like going left or right more or less randomly).

Free will is only rarely exercised in life, most choices are forced or random.

> They said animals show choices

Given what I wrote, do they actually show choices? Or do they just pick between good/bad or two equal options?

replies(2): >>45130062 #>>45144136 #
13. scott_w ◴[] No.45130062{6}[source]
> Did I, or did I not, have the option to fast, or not to fast?

It looks like you had an option but it’s not possible to truly know whether you had an option. I’m not in your head so I can’t know. If, under the same circumstances and same state of mind, you perform the same action 100% of the time, did you really make a choice? Or did you just follow your programming?

14. HarHarVeryFunny ◴[] No.45137407{4}[source]
Consciousness and free will are two different things. Free will is an illusion - basic physics should tell you that the molecules in your head can no more bend the laws of physics than the molecules in a brick.

Our brains are all about prediction - the ability to predict (based on past experience) what will happen in the future (e.g. if I go to X I will find water), which is a massive evolutionary advantage over just reacting to the present like an insect or perhaps a fish.

Consciousness either evolved for a reason, or comes for free with any brain-like cognitive architecture. It's based on the brain having connections giving it access to its internal states (thereby giving us the ability to self-observe), not just sensory inputs informing it about the external world. The evolutionary value of consciousness would be the ability to predict better based on the brain having access to its internal states, but as noted it may "come for free" with any kind of bird- or mammal-like brain - it's hard to imagine a brain that somehow does NOT have access to its own internal states, and would therefore NOT be able to process/predict those using its cognitive apparatus (something an LLM lacks), just as it does external sensory inputs.

Of course, consciousness (the ability of the brain to self-observe) is responsible for the illusion of having free will, since the brain naturally correlates its internal pre-action planning ("I'm choosing between A or B..." etc.) with any subsequent action, but that internal planning/choice is of course all a function of brain wiring, not some mystical "free will" coming in and bending the laws of physics.

You and your dog both are conscious and both experience the illusion of free will.

15. backscratches ◴[] No.45144136{6}[source]
Whether you had the option to fast or not, and made a choice to fast, is up for debate. The leading hypotheses point to no: you did not have an option and you did not make a choice.

Some time ago you heard about fasting (you did not invent fasting), and the situation in your life became such that fasting was what you were naturally compelled to do (stress, digestion - you know better than I do that you did not simply decide to fast free of any influence). Your "free will" is probably a fairy tale you tell yourself to feel better about your automaton existence.

16. HarHarVeryFunny ◴[] No.45150249[source]
I think one of the main problems with defining consciousness (or rather, with being more specific when you use that word) is that, at least in the English language, it's a heavily overloaded word used to refer to a bunch of only loosely related, as well as totally unrelated, things. Also, most people aren't used to teasing apart consciousness-adjacent concepts like consciousness itself (essentially self-observation) from things like free will and sensory qualia, since these are less a part of everyday discussion.

It's not so much that consciousness itself is mysterious or hard to define, but rather that the word itself, in common usage, just means different things to different people. It'd perhaps be better to make up a brand-new, baggage-free word with a highly specific defined meaning (the ability to self-observe) when talking about consciousness in relation to AI.

Free will and qualia, when separated out as concepts, don't seem problematic as part of a technical vocabulary since they are already well defined.

replies(1): >>45162601 #
17. HarHarVeryFunny ◴[] No.45150671{4}[source]
> The whole point of being conscious is being aware of yourself, and then using that awareness to direct your actions.

Well,

1) You are making the massive, and quite likely incorrect, assumption that consciousness evolved by itself for a purpose - that it does have a "point". It may well be that consciousness - the ability to self-observe - is just a natural side effect of having a capable bird- or mammal-like brain, and that talking about the "point" of consciousness therefore makes no sense. It'd be like asking what the functional point is of a saucepan making a noise when you hit it.

2) Notwithstanding 1), being self-aware (having cognitive access to your internal thoughts) does have a value, in that it allows your brain to then utilize its cognitive abilities to make better decisions ("should I walk across that frozen pond, or maybe better not?"), but this bringing-to-bear of learned experience to make better decisions is still a 100% mechanical process. Your brain is making a "decision" (i.e. predicting a motor cortex output that may make you move or do something), but this isn't "free will" - it's just the survival benefit of a brain evolved to predict. You, as an organism in the environment, may be seen by an outside observer to be making smart "decisions", but these decisions aren't some mystical "free will" but rather just a highly evolved organism making good use of past experience to survive.

18. glenstein ◴[] No.45162601{3}[source]
I certainly agree with your assessment that it's got multiple definitions and intuitions of all sorts, that they cause chaos, and that one way of making progress is by isolating those out and clarifying them.

I'm not sure I agree with the idea that the essence of consciousness is self-reflection, because that seems to exclude important things. It seems like there might be simple states of being that involve some kind of phenomenal (in the philosophy sense) experience, some amount of qualia, some amount of outwardly directed awareness, some amount of "something it's like to be". And it seems to me that there might be life forms for whom there's an engagement and interaction with those phenomena that involves having internal mental states, but that might not necessarily involve self-reflection. It might be something closer to instinctual functioning or response to incentives. It recently blew my mind to learn that there are studies strongly suggesting honey bees are conscious. And from my perspective, that raises questions for things all the way down to fruit flies. It seems like there might be a continuum of simple states through complex ones, and that some of the simpler ones might not include self-awareness.

If such a thing as a sense of self is necessarily implicit, in a way that satisfies that definition, any time we talk about qualia, then it would seem to be a moot point. Which raises another issue, which is that some of these things might be correctly regarded as entangled, having an integral relation to one another.

I also think I kind of agree and disagree about qualia being well defined. I think it's probably the closest to what most people have in mind when they say there's no such thing as a definition of consciousness. And I think it reflects a sense of despair toward the broader research project of tying an understanding of qualia to an understanding of the physical world that relies on third-person descriptions.

Now, all of that said, you don't seem to have an attitude of treating the definition problem as one that preempts and stops everything, so there's a pretty fundamental way in which I'm probably more in agreement with you than in disagreement. I think that clarifications of the type you're talking about give us everything we need to iterate forward in a constructive way in talking about it and researching it.