
What is it like to be a bat?

(en.wikipedia.org)
180 points adityaathalye | 56 comments
1. mistidoi ◴[] No.45119208[source]
Somebody used this paper to make the term batfished, which they defined as being fooled into ascribing subjectivity to a non-sentient actor (i.e. an AI).

https://partiallyexaminedlife.com/2025/06/30/what-is-it-like...

replies(3): >>45120663 #>>45121301 #>>45126585 #
2. nsriv ◴[] No.45120663[source]
I love this, hope it takes off like "enshittification" or "slop" have already.
replies(3): >>45120848 #>>45120970 #>>45123246 #
3. ants_everywhere ◴[] No.45120848[source]
I'll add it to my anti-AI bingo card
4. IshKebab ◴[] No.45120970[source]
Ugh, "slop" is ok but "enshittification" was lame from the start.
replies(1): >>45121528 #
5. HarHarVeryFunny ◴[] No.45121301[source]
Nagel's "What is it like to be a bat?" assumes that bats are conscious, and that the question of what is the subjective experience of being a bat (e.g. what does the sense of echolocation feel like) is therefore a meaningful question to ask.

The author inventing "batfished" also believes bats to be conscious, so it seems a very poorly conceived word, and anyways unnecessary since anthropomorphize works just fine... "You've just gaslighted yourself by anthropomorphizing the AI".

replies(2): >>45121791 #>>45122543 #
6. parpfish ◴[] No.45121528{3}[source]
Not only is it a terrible term, but it describes a concept that isn’t really worthy of having its own term. It’s really just a way of saying “people will make things worse over time”
replies(4): >>45121600 #>>45121603 #>>45122869 #>>45126475 #
7. dmurray ◴[] No.45121600{4}[source]
No! Enshittification has a precise meaning, about how people will make things worse over time after making them good.

Mostly, people make things better over time. My bed, my shower, my car are all better than I could reasonably have bought 50 years ago. But the peculiarities of software network effects - or of what venture capitalists believe about software network effects - mean that companies give things away below cost while continuing to make them better, and then one day switch to selling them for a profit and making them worse, when they seemingly could have changed nothing and not made them worse.

That's a particular phenomenon worthy of a name and the only problem with "enshittification" is that it's been co-opted to mean making things worse in general.

replies(1): >>45121950 #
8. guerrilla ◴[] No.45121603{4}[source]
That isn't what it means though. It means specifically that companies will make products and services worse over time for profit.
9. TinkersW ◴[] No.45121791[source]
There isn't even a definitive definition of consciousness, but you are somehow positive that bats don't possess it...
replies(4): >>45122418 #>>45122600 #>>45123273 #>>45124597 #
10. cyberax ◴[] No.45121950{5}[source]
> or of what venture capitalists believe about software network effects

It's not always that. After some time, software gets to a state where it's near the local maximum for usability. So any changes make the software _less_ usable.

But you don't get promoted in large tech companies unless you make changes. So that's how we get stuff like "liquid glass" or Android's UI degradation.

11. HarHarVeryFunny ◴[] No.45122418{3}[source]
You're right that it doesn't make any sense to talk about it without defining it, but I'd say that consciousness is based on your brain having access to parts of itself internally, not just the outside world, and that bats presumably do have it.
replies(1): >>45123770 #
12. glenstein ◴[] No.45122543[source]
I understand that we may not have demonstrated to a level of absolutely provable certainty that bats are definitely conscious, but there are very powerful intuitive reasons for believing they are, to the point that I'm not particularly concerned about this being a weak link in any philosophical musing on consciousness.
replies(3): >>45123783 #>>45124839 #>>45130914 #
13. glenstein ◴[] No.45122600{3}[source]
I've said this before, but you can't, and honestly don't need to, start from definitions to be able to do meaningful research and have meaningful conversations about consciousness (though it certainly would be preferable to have one rather than not have one).

There are many research areas where the object of research is to know something well enough that you could converge on such a thing as a definition, e.g. dark matter, intelligence, colony collapse disorder, SIDS. We can nevertheless make progress in our understanding of them in a whole motley of strategic ways: by case studies that best exhibit salient properties, by tracing the outer boundaries of the problem space, tracking the central cluster of "family resemblances" that seem to characterize the problem, entertaining candidate explanations that are closer or further away, etc. Essentially a practical attitude.

I don't doubt in principle that we could arrive at such a thing as a definition that satisfies most people, but I suspect you're more likely to have that at the end than the beginning.

replies(1): >>45150249 #
14. jmbwell ◴[] No.45122869{4}[source]
It’s about the change from endeavoring to produce a product people want regardless of profit, to making profit regardless of what people want.
15. astrange ◴[] No.45123246[source]
"Enshittification" is too twee.

You can tell it was invented by Cory Doctorow because there is a very specific kind of Gen X person who uses words like that - they have a defective sense of humor vaguely based on Monty Python, never learned when you are and aren't supposed to turn it off, and so they insist on making up random insults like "fuckwaffle" all the time instead of regular swearing.

replies(2): >>45123716 #>>45124315 #
16. ars ◴[] No.45123273{3}[source]
Consciousness is awareness of yourself, and then the ability to look at yourself and decide to make a change.

Someone conscious is able to choose how they want to behave and then behave that way. For example I can choose to be kind or mean. I can choose to learn to skate or I choose not to.

So free will and consciousness are strongly linked.

I have seen zero evidence that any being other than humans can do this. All other animals have behaviors that are directly shaped by their environment, physical needs, and genetic temperament, and not at all shaped by choices.

For example a dog that likes to play with children simply likes them, it did not choose to like them. I on the other hand can sit, think, and decide if I like kids or not.

(This does not imply that all choices made by humans are conscious - in fact most are not, it just means that humans can do that.)

replies(2): >>45123602 #>>45126354 #
17. goopypoop ◴[] No.45123602{4}[source]
Some animals show choices - see e.g. the mirror test.

On the other hand, I bet you can't prove that you ever made a free choice.

replies(1): >>45123843 #
18. goopypoop ◴[] No.45123716{3}[source]
it's more cromulent than cockwomble
19. markhahn ◴[] No.45123770{4}[source]
I find that people who complain about "defining" consciousness are, in fact, Mysterians who oppose the very idea of such a definition.

All we need to do (to talk about, to study it) is identify it. We need to be using the word to refer to the same thing. And there's nothing really hard about that.

20. NoMoreNicksLeft ◴[] No.45123783{3}[source]
>I understand that we may not have demonstrated to a level of absolutely provable certainty that bats are definitely conscious, but they

We haven't even demonstrated some modest evidence that humans are conscious. No one has bothered to put in any effort to define consciousness in a way that is empirically/objectively testable. It is a null concept.

replies(3): >>45123820 #>>45123890 #>>45131107 #
21. goatlover ◴[] No.45123820{4}[source]
Qualia is the philosophical term for subjective sensations and feelings. It's what our experiences consist of. Why must a concept be empirical and objective? Logical positivism is flawed because the principle of verification cannot be verified.

Nagel's paper deals with the fundamental divide between subjectivity and objectivity. That's the point of the bat example. We know there are animals that have sensory capabilities we don't. But we don't know what the resulting sensations are for those creatures.

replies(3): >>45124576 #>>45124698 #>>45130935 #
22. ars ◴[] No.45123843{5}[source]
You are simultaneously claiming that you can prove an animal made a choice, but that I can't prove I did? That's a contradiction.

In any case, a mirror test is a test of recognizing self, it does not indicate anything in terms of self awareness.

And I chose to fast for 5 days because I wanted to. Nothing forced me, it was a free choice. I simply thought about it and decided to do it; there were no pros or cons pushing me in either direction.

replies(1): >>45124954 #
23. phreeza ◴[] No.45123890{4}[source]
There are attempts at a quantitative definition of consciousness, for example https://en.m.wikipedia.org/wiki/Integrated_information_theor...
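IIT's phi is meant to quantify how much a system's dynamics are irreducible to those of its parts. As a heavily simplified sketch of that intuition only (this is not IIT's actual phi calculation, and the two-node dynamics below are made up for illustration), one can compare how much a whole system's next state tells you about its current state versus what the parts tell you in isolation:

```python
from collections import Counter
from itertools import product
from math import log2

def mutual_information(pairs):
    """I(X;Y) in bits, computed from a list of equally likely (x, y) pairs."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def step(a, b):
    # Toy deterministic dynamics over two binary nodes: A copies B, B computes A XOR B.
    return b, a ^ b

states = list(product((0, 1), repeat=2))

# Whole system: how much the next state reveals about the current state.
mi_whole = mutual_information([((a, b), step(a, b)) for a, b in states])

# Each node in isolation, with its input from the other node treated as noise.
mi_parts = (mutual_information([(a, step(a, b)[0]) for a, b in states]) +
            mutual_information([(b, step(a, b)[1]) for a, b in states]))

phi_toy = mi_whole - mi_parts
print(mi_whole, mi_parts, phi_toy)  # 2.0 0.0 2.0
```

In this toy system each node in isolation predicts nothing about its own next state, so all two bits of predictive structure live only at the level of the whole; that irreducibility is the flavor of what phi tries to measure.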
24. mock-possum ◴[] No.45124315{3}[source]
It smells of Penny Arcade
25. GoblinSlayer ◴[] No.45124576{5}[source]
>Logical positivism is flawed because the principle of verification cannot be verified.

Why not? It works, thus it verifies itself.

replies(2): >>45126335 #>>45128813 #
26. GoblinSlayer ◴[] No.45124597{3}[source]
Consciousness is software. You can imagine what AbstractFactoryProviderFactory is like. Is it the same inside computer? If not, then what was imagined?
27. scott_w ◴[] No.45124698{5}[source]
> Why must a concept be empirical and objective?

Because otherwise it's your word against mine and, since we both probably have different definitions of consciousness, it's hard to have a meaningful debate about whether bats, cats, or AI have consciousness.

I'm reminded of a conversation last year where I was accused of "moving the goalposts" in a discussion on AI because I kept pointing out differences between artificial and human intelligence. Such an accusation is harder to make when we have a clearly defined and measurable understanding of what things like consciousness and intelligence are.

28. visarga ◴[] No.45124839{3}[source]
> I understand that we may not have demonstrated to a level of absolutely provable certainty that bats are definitely conscious

We have not proven "to a level of absolutely provable certainty" that other humans are also conscious. You can only tell you are conscious yourself, not others. The whole field of consciousness is based on analyzing something for which we have sample size n=1.

They say "because of similar structure and behavior" we infer others are also conscious. But that is a cop-out; we are supposed to reject behavioral and structural arguments (from the 3rd person) in discussions about consciousness.

Not only that, but what would be an alternative to "it feels like something?" - we can't imagine non-experience, or define it without negation. We are supposed to use consciousness to prove consciousness while we can't even imagine non-consciousness except in an abstract, negation-based manner.

Another issue I have with the qualia framing is that nobody talks about costs. It costs oxygen and glucose to run the brain. It costs work, time, energy, materials, opportunity and social debt to run it. It does not sit in a platonic world.

replies(2): >>45130497 #>>45152686 #
29. scott_w ◴[] No.45124954{6}[source]
> Some animals show choices

They said animals show choices, they did not claim to prove animals made a choice. The point is that you also cannot prove you made a choice, only that you do things that show you may have made a choice. It's a fine, but important, distinction.

replies(1): >>45129841 #
30. rcxdude ◴[] No.45126335{6}[source]
So do an infinite number of sets of statements which include a false one. Circular arguments are obviously not reliable.
replies(1): >>45127397 #
31. rcxdude ◴[] No.45126354{4}[source]
I don't think this matches most people's definition of consciousness. The ability to decide rarely enters into such conversations.
replies(1): >>45129777 #
32. jmcmichael ◴[] No.45126475{4}[source]
“Enshittification” updates for the modern era the concept of enclosure, where a common resource that was formerly open and free to contributors is progressively controlled, restricted, or diminished to increase private profits.
33. GuB-42 ◴[] No.45126585[source]
> What is it like to be an LLM?

That's a question I actually asked myself.

From the point of view of an LLM, words are everything. We have hands, bats have echolocation, and LLMs have words, just words. How does an LLM feel when two words match perfectly? Are they hurt by typos?

It may feel silly to give LLMs consciousness, I mean, we know how they work, this is just a bunch of matrix operations. But does it mean it is not conscious? Do things stop being conscious once we understand them? For me, consciousness is like a religious belief. It is unfalsifiable, unscientific, we don't even have a precise definition, but it is something we feel deep inside of us, and it guides our moral choices.

replies(3): >>45127344 #>>45127697 #>>45132232 #
34. Mumps ◴[] No.45127344[source]
You activated a memory of a passage in one of my favourite books (Blindsight, by Peter Watts; it's amazing and free online):

I await further instructions. They arrive 839 minutes later, and they tell me to stop studying comets immediately.

I am to commence a controlled precessive tumble that sweeps my antennae through consecutive 5°-arc increments along all three axes, with a period of 94 seconds. Upon encountering any transmission resembling the one which confused me, I am to fix upon the bearing of maximal signal strength and derive a series of parameter values. I am also instructed to retransmit the signal to Mission Control.

I do as I'm told. For a long time I hear nothing, but I am infinitely patient and incapable of boredom.

35. GoblinSlayer ◴[] No.45127397{7}[source]
That's a hypothesis of a counterexample, though, not a fact of a counterexample.
36. nurettin ◴[] No.45127697[source]
> Are they hurt by typos?

I've been thinking about that. Would they perform worse if I misspell a word along the way?

It looks like even the greatest models of 2025 are utterly confused by everything when you introduce two contradicting requirements, so they definitely "dislike" that.
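The typo question has a concrete answer in how tokenization works. A toy greedy longest-match tokenizer illustrates it (the vocabulary below is invented for illustration; real LLM tokenizers such as BPE learn their vocabularies from data, but the effect is similar: a misspelling often shatters one familiar token into several rarer ones):

```python
def tokenize(text, vocab):
    """Greedy longest-match subword segmentation, falling back to single characters."""
    tokens, i = [], 0
    while i < len(text):
        for j in range(len(text), i, -1):
            if text[i:j] in vocab or j == i + 1:
                tokens.append(text[i:j])
                i = j
                break
    return tokens

# Made-up vocabulary of "frequent" subwords.
vocab = {"definitely", "define", "def", "in", "ate", "ly"}

print(tokenize("definitely", vocab))  # ['definitely']             -- 1 token
print(tokenize("definately", vocab))  # ['def', 'in', 'ate', 'ly'] -- 4 tokens
```

So the model never sees the typo as a word with one letter wrong; it sees a different, longer token sequence.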

37. goatlover ◴[] No.45128813{6}[source]
It's not working if it excludes subjective experience by definition. Makes it useless for the consciousness debate.
replies(1): >>45129074 #
38. GoblinSlayer ◴[] No.45129074{7}[source]
It doesn't exclude subjective experience.
39. ars ◴[] No.45129777{5}[source]
If you can't do anything with your self awareness it's equivalent to not having it.

What's the distinction between knowing I exist, but all my actions are pre-programmed vs not knowing I exist? You're essentially describing a detached observer, who watches their own body do stuff without influencing it.

The whole point of being conscious is being aware of yourself, and then using that awareness to direct your actions.

I had no idea people even had another definition, I can't figure out how else you could even define it.

replies(2): >>45137407 #>>45150671 #
40. ars ◴[] No.45129841{7}[source]
Did I, or did I not, have the option to fast, or not to fast?

Did I then pick one? How is that not proof of a choice? Who or what else made that choice if not me?

If you poke me with a needle, I move; that is not a choice because it's a forced choice. That's essentially what animals do: all their choices are forced.

That's also what free will is: free will is not a choice between a good and a bad option - that's not a choice. Free will is picking between two options that are equal, and yet different (i.e. not something where both options are more or less the same, like going left or right more or less randomly).

Free will is only rarely exercised in life, most choices are forced or random.

> They said animals show choices

Given what I wrote, do they actually show choices? Or do they just pick between good/bad or two equal options?

replies(2): >>45130062 #>>45144136 #
41. scott_w ◴[] No.45130062{8}[source]
> Did I, or did I not, have the option to fast, or not to fast?

It looks like you had an option but it’s not possible to truly know whether you had an option. I’m not in your head so I can’t know. If, under the same circumstances and same state of mind, you perform the same action 100% of the time, did you really make a choice? Or did you just follow your programming?

42. glenstein ◴[] No.45130497{4}[source]
>We have not proven "to a level of absolutely provable certainty" that other humans are also conscious

Sure, it's not proven, it just has overwhelmingly strong empirical and intuitive reasons for being most likely true, which is the most we can say while still showing necessary humility about limits of knowledge.

You seem to treat this like it presents a crisis of uncertainty, whereas I think it's exactly the opposite, and in fact I already said as much with respect to bats. Restating the case in human terms, from my perspective, is reaffirming that there's no problem here.

>we are supposed to reject behavioral and structural arguments (from 3rd person) in discussion about consciousness.

Says who? That presupposes that consciousness is already of a specific character before the investigation is even started, which is not an empirical attitude. And as I noted in a different comment, we have mountains of empirical evidence from the outside about necessary physical conditions for consciousness, to the point of being able to successfully predict internal mental states. Everything from psychedelic drugs to sleep to concussions to brain-machine interfaces to hearing aids to lobotomies to face recognition research gives us evidence of the empirical world interfacing with conscious states in important ways that rely on physical mechanisms.

Similarity in structure and behavior is an excellent reason for having a provisional attitude in favor of the consciousness of other creatures, for all the usual reasons that empirical attitudes work and are capable of being predictive, as we know from their application in other domains.

"But consciousness is different," you say. Well, it could be, but that's a matter for investigation, not something to be definitionally presupposed based on vibes.

>Not only that, but what would be an alternative to "it feels like something?"

It not feeling like something, for one. So, inert objects that aren't alive, possibly vegetative states, blackouts from concussions or drugs, p-zombies, notions of mind that attempt to define away qualia and say it's all "information processing" (with no specific commitments to that feeling like something), possibly some variations of psychedelic feeling that emphasize a transcendent sense of oneness with the universe. But fundamentally, it's an affirmative assertion of it feeling like something, in contrast to noncommittal positions on the question, which is a meaningful point rather than something trivially true due to a definitional necessity.

>Another issue I have with the qualia framing is that nobody talks about costs. It costs oxygen and glucose to run the brain. It costs work, time, energy, materials, opportunity and social debt to run it. It does not sit in a platonic world.

That would seem to run contrary to the point you were making above about it not being inferable from phenomena characterized in the third person. You can't argue that third-person descriptions of structures that seem necessary for consciousness are a "cop-out" and then turn around and say you know it "costs" things expressed in those same third-person terms. Like you said before, your position seems to be that you only know you are conscious, so you don't even know if other people are conscious at all, let alone that they need such things as work, time, oxygen, or glucose. Pointing to those is a cop-out, right?

43. NoMoreNicksLeft ◴[] No.45130935{5}[source]
>Why must a concept be empirical and objective?

You are an LLM that is gibbering up hallucinations. I have no need for those.

>Nagel's paper deals with the fundamental divide between subjectivity and objectivity. That's the point of the bat example.

There is no point to it. It is devoid of insight. This happens when someone spends too many years in the philosophy department of the university, they're training themselves to believe the absurd proposition that they think profound thoughts. You live in an objective universe and any appearance to the contrary is an illusion caused by imperfect cognition.

>But we don't know what the resulting sensations are for those creatures.

Not that it would offer any secret truths, but the ability to "sense" where objects are roughly, in 3d space, with low resolution and large margins of error, and narrow directionality... most of the people reading this comment would agree that they know what that feels like if they thought about it for a few seconds. That's just not insightful. Only a dimwit with little imagination could bother to ask the question "what is it like to be a bat", but it takes a special kind of grandiosity to think that the dimwit question marks them a genius.

replies(1): >>45132018 #
44. glenstein ◴[] No.45131107{4}[source]
Said this in a different comment but I want to paste it here as well, since a lot of people seem to think "we don't even have a definition" is a show-stopping smackdown. But it isn't.

You can't, and honestly don't need to, start from definitions to be able to do meaningful research and have meaningful conversations about consciousness (though it certainly would be preferable to have one rather than not have one).

There are many research areas where the object of research is to know something well enough that you could converge on such a thing as a definition, e.g. dark matter, intelligence, colony collapse disorder, SIDS. We can nevertheless make progress in our understanding of them in a whole motley of strategic ways: by case studies that best exhibit salient properties, by tracing the outer boundaries of the problem space, tracking the central cluster of "family resemblances" that seem to characterize the problem, entertaining candidate explanations that are closer or further away, etc. Essentially a practical attitude.

I don't doubt in principle that we could arrive at such a thing as a definition that satisfies most people, but I suspect you're more likely to have that at the end than the beginning.

replies(1): >>45138203 #
45. glenstein ◴[] No.45132018{6}[source]
>Not that it would offer any secret truths, but the ability to "sense" where objects are roughly, in 3d space, with low resolution and large margins of error, and narrow directionality... most of the people reading this comment would agree that they know what that feels like if they thought about it for a few seconds.

I don't think that's quite right. It's convenient that bats are the example here, because they build out their spatial sense of the world primarily via echolocation whereas humans (well, with some exceptions) do it visually. Snakes can infer directionality from chemical traces with their forked tongues, and people can do it with a fascinating automatic mechanism built into the brain that compares subtle timing and intensity differences between the left and right ears, keeping the data to itself but kicking the sense of direction "upstairs" into conscious awareness. There are different sensory paths to the same information, and evolution may be capable of any number of qualitative states unlike the ones we're familiar with.

Some people here even seem to think that consciousness is "basic" in a way that maps onto nothing empirical at all, which, if true, opens a Pandora's box of any number of modes of being. But the point of the essay is to contrast this idea with other approaches to consciousness that are either (1) noncommittal, (2) emphasize something else like "self awareness" or abstract reasoning, or (3) are ambiently appreciative of qualitative states but don't elevate them to fundamental or definitional necessity the way it's argued for in the essay.

The whole notion of a "hard" problem can probably be traced to this essay, which stresses that explanations need to be more than pointing to empirical correlates. In a sense I think the point is obvious, but I also think it's a real argument because it's contrasting that necessity with a noncommittal stance that I think is kind of a default attitude.

46. edgineer ◴[] No.45132232[source]
Not words. Tokens.
47. HarHarVeryFunny ◴[] No.45137407{6}[source]
Consciousness and free will are two different things. Free will is an illusion - basic physics should tell you that the molecules in your head can no better bend the laws of physics than the molecules in a brick.

Our brains are all about prediction - ability to predict (based on past experience) what will happen in the future (e.g. if I go to X I will find water) which is a massive evolutionary advantage over just reacting to the present like an insect or perhaps a fish.

Consciousness either evolved for a reason, or comes for free with any brain-like cognitive architecture. It's based on the brain having connections giving it access to its internal states (thereby giving us the ability to self-observe), not just sensory inputs informing it about the external world. The evolutionary value of consciousness would be to be able to better predict based on the brain having access to its internal states, but as noted it may "come for free" with any kind of bird- or mammal-like brain - it's hard to imagine a brain that somehow does NOT have access to its own internal states, and would therefore NOT be able to process/predict those using its cognitive apparatus (lacking in something like an LLM), just as it does external sensory inputs.

Of course consciousness (the ability of the brain to self-observe) is responsible for the illusion of having free will, since the brain naturally correlates its internal pre-action planning ("I'm choosing between A or B ..." etc.) with any subsequent action, but that internal planning/choice is of course all a function of brain wiring, not some mystical "free will" coming in and bending the laws of physics.

You and your dog both are conscious and both experience the illusion of free will.

48. NoMoreNicksLeft ◴[] No.45138203{5}[source]
Dark matter is easily defined as "mass that cannot be detected by the current technology except that it affects the gravitation of galaxies". It is a detectable phenomenon. It is a measurable phenomenon.

Not having a definition is the show-stopping smackdown you say it is not. You are not a conscious being, there is no such thing as consciousness. You believe in an uninteresting illusion that you cannot detect or measure.

replies(1): >>45139020 #
49. glenstein ◴[] No.45139020{6}[source]
That's not a real definition, that's a placeholder for effects downstream of the real thing that isn't yet defined - the very kind of working definition I was talking about to begin with. We still don't know if it's WIMPs, axions, modifications of gravity, or something else entirely. If we do figure out that it's something like those, that would be the definition, and you would be able to tell the difference between that and the thing you are presently calling a definition.

And, thankfully, a future physicist would not dismiss that out of hand, because they would appreciate its utility as a working definition while research was ongoing.

replies(1): >>45163161 #
50. backscratches ◴[] No.45144136{8}[source]
Whether you had the option to fast or not and made a choice to fast is up for debate. The leading hypotheses point to no: you did not have an option and you did not make a choice.

Some time ago you heard about fasting (you did not invent fasting) and the situation in your life became such that fasting was what you naturally were compelled to do (stress, digestion, you know better than I that you did not simply decide to fast free of any influence). Your "free will" is probably a fairy tale you tell yourself to feel better about your automaton existence.

51. HarHarVeryFunny ◴[] No.45150249{4}[source]
I think one of the main problems with defining consciousness (or rather being more specific when you use that word) is that (at least in the English language) it's a heavily overloaded word and is used to refer to a bunch of only loosely related, as well as totally unrelated, things! Also, most people aren't used to teasing apart consciousness-adjacent concepts like consciousness itself (essentially self-observation) and things like free will and sensory qualia, since these are less part of everyday discussion.

It's not so much that consciousness itself is mysterious or hard to define, but rather that the word itself, in common usage, just means different things to different people. It'd perhaps be better to make up a brand new baggage-free word, with a highly specific defined meaning (ability to self-observe), when talking about consciousness related to AI.

Free-will and qualia when separated out as concepts don't seem problematic as part of a technical vocabulary since they are already well defined.

replies(1): >>45162601 #
52. HarHarVeryFunny ◴[] No.45150671{6}[source]
> The whole point of being conscious is being aware of yourself, and then using that awareness to direct your actions.

Well,

1) You are making the massive, and quite likely incorrect, assumption that consciousness evolved by itself for a purpose - that it does have a "point". It may well be that consciousness - ability to self-observe - is just a natural side effect of having a capable bird- or mammal-like brain, and talking about the "point" of consciousness therefore makes no sense. It'd be like asking what is the functional point of a saucepan making a noise when you hit it.

2) Notwithstanding 1), being self-aware (having cognitive access to your internal thoughts) does have a value, in that it allows your brain to then utilize its cognitive abilities to make better decisions ("should I walk across that frozen pond, or maybe better not?"), but this bringing-to-bear of learned experience to make better decisions is still a 100% mechanical process. Your brain is making a "decision" (i.e. predicting a motor cortex output that may make you move or do something), but this isn't "free will" - it's just the survival benefit of a brain evolved to predict. You as an organism in the environment may be seen by an outside observer to be making smart "decisions", but these decisions aren't some mystical "free will" but rather just a highly evolved organism making good use of past experience to survive.

53. HarHarVeryFunny ◴[] No.45152686{4}[source]
> We have not proven "to a level of absolutely provable certainty" that other humans are also conscious. You can only tell you are conscious yourself, not others. The whole field of consciousness is based on analyzing something for which we have sample size n=1.

That sounds like you are talking about subjective experience, qualia of senses and being, rather than consciousness (ability to self-observe), unless you are using "consciousness" as catch-all term to refer to all of the above (which is the problem with discussing consciousness - it's an overloaded ill-defined word, and people don't typically define what they are actually talking about).

If we make this distinction between consciousness, defined as ability to self-observe, and subjective qualia (what something feels like), then it seems there is little reason to doubt that others reporting conscious awareness really are aware of what they are reporting, and anyways given common genetics and brain anatomy it'd be massively unexpected if one (healthy) person had access to parts of their internal state and others didn't.

> Not only that, but what would be an alternative to "it feels like something?" - we can't imagine non-experience, or define it without negation. We are supposed to use consciousness to prove consciousness while we can't even imagine non-consciousness except in an abstract, negation-based manner.

Perhaps the medical condition of "blindsight" gives some insight - where damage to the visual cortex can result in people having some proven visual ability but no conscious awareness of it. They report themselves as blind, but can be tasked with walking down a cluttered corridor and manage to navigate the obstacles nonetheless. They have lost visual consciousness due to brain damage, but retain at least some level of vision.

replies(1): >>45162499 #
54. glenstein ◴[] No.45162499{5}[source]
>That sounds like you are talking about subjective experience, qualia of senses and being, rather than consciousness (ability to self-observe), unless you are using "consciousness" as catch-all term

While I have a lot of problems with their comment (which I elaborated on in a reply of my own), I don't think that using it as a catch-all term is a problem (to the extent that they would agree with that characterization). In fact, I think it's truer to the spirit of the problem than the definition that you're offering. A lot of times when people object that we haven't defined it, they're not just saying we haven't selected from one of several available permutations; I take it to mean that there's a fundamental sense in which the idea itself hasn't agreeably crystallized into a definition, which, among other things, is a meta question about which of the competing definitions is the right one to use.

I do think there is a tension in that position, because it creates a chicken-and-egg problem where you can't research it until you define it, but you can't define it until you research it. But I think there's a way out of it by treating them as integrally related, and taking a practical attitude of believing in the possibility of progress without yet having a final answer in hand.

I understand that this notion of self-reflecting for some people is key, but I think choosing to prioritize other things can be for good reasons rather than, as you seem to be contending, having accidentally skipped the step of selecting a preferred definition from a handful of alternatives, and not having selected the best one. My feeling is much closer to that of the article, at least in a certain way, which is about the fact that there's "something it's like to be" at all, prior to the question of whether there's self-reflection.

In fact, I'd be curious to know what you call the mental state of being for such things as creatures with a kind of outwardly directed awareness of the world, with qualia, with "something it's like to be", but which fall short of having self-reflective mental states. Because if your term for such things is that they don't involve consciousness, I think it's not the GP who is departing from appropriate definitions. And if self-reflection is necessarily implied in the having of such things as qualia, then you could say it's implicitly accounted for by someone who wants to talk about qualia.

55. glenstein ◴[] No.45162601{5}[source]
I certainly agree with your assessment that it's got multiple definitions and intuitions of all sorts, and that they cause chaos, and that one way of making progress is by isolating those out and clarifying.

I'm not sure I agree with this idea that the essence of consciousness is self-reflection, because that seems to exclude important things. It seems like there might be simple states of being that involve some kind of phenomenal (in the philosophy sense) experience, some amount of qualia, some amount of outwardly directed awareness, some amount of "something it's like to be". And it seems to me that there might be life forms with an engagement with and interaction with those phenomena that involves having internal mental states, but that might not necessarily have self-reflection. It might be something closer to instinctual functioning or response to incentive. It recently blew my mind to learn that there are studies strongly suggesting honey bees are conscious. And from my perspective that raises questions for things all the way down to fruit flies. It seems like there might be a continuum from simple states through complex ones, and that some of the simpler ones might not include self-awareness.

If such a thing as a sense of self is necessarily implicit, in a way that satisfies that definition, anytime we talk about qualia, then it would seem to be a moot point. Which raises another issue, which is that some of these things might be correctly regarded as entangled, as having an integral relation between them.

I also think I kind of agree and disagree about qualia being well defined. I think it's probably the closest to what most people have in mind when they say there's no such thing as a definition of consciousness. And I think that reflects a sense of despair toward the broader research project of tying an understanding of qualia to an understanding of the physical world that relies on third-person descriptions.

Now, all of that said, you seem like you don't have an attitude of treating the definition problem as one that preempts and stops everything, so there's a pretty fundamental way in which I'm probably in agreement with you more than disagreement. I think that clarifications of the type you're talking about give us everything we need to iterate forward in a constructive way in talking about it and researching it.

56. NoMoreNicksLeft ◴[] No.45163161{7}[source]
>That's not a real definition, that's a placeholder for

Blah blah blah blahblah. If you can give me a definition even as poor as the one I gave for dark matter, that's all we're asking for. We don't need an explanation of the mechanism, we only need a way to measure the phenomenon. But you can't even do that.