
The shrimp welfare project

(benthams.substack.com)
81 points by 0xDEAFBEAD | 25 comments
n4r9 ◴[] No.42173011[source]
Apologies for focusing on just one sentence of this article, but I feel like it's crucial to the overall argument:

> ... if [shrimp] suffer only 3% as intensely as we do ...

Does this proposition make sense? It's not obvious to me that we can assign percentage values to suffering, or compare it to human suffering, or treat the values in a linear fashion.

It reminds me of that vaguely absurd thought experiment where you compare one person undergoing a lifetime of intense torture vs billions upon billions of humans getting a fleck of dust in their eyes. I just cannot square choosing the former with my conscience. Maybe I'm too unimaginative to comprehend so many billions of bits of dust.

replies(10): >>42173107 #>>42173149 #>>42173164 #>>42173244 #>>42173255 #>>42173304 #>>42173441 #>>42175565 #>>42175936 #>>42177306 #
1. mistercow ◴[] No.42173304[source]
I don’t really doubt that it’s in principle possible to assign percentage values to suffering intensity, but the 3% value (which the source admits is a “placeholder”) seems completely unhinged for an animal with 0.05% as many neurons as a chicken, and the source’s justification for largely discounting neuron counts seems pretty arbitrary, at least as presented in their FAQ.
replies(3): >>42173750 #>>42173861 #>>42175438 #
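A quick back-of-the-envelope sketch of the gap this comment points at, using rough neuron counts as placeholders (about 1e5 for a shrimp, matching the "100k neuron system" figure cited further down the thread, about 2e8 for a chicken, and about 8.6e10 for a human; all three are ballpark assumptions, not numbers from the article):

```python
# Ballpark neuron counts (assumptions for illustration, not from the article).
shrimp_neurons  = 1e5      # the "100k neuron system" figure used later in the thread
chicken_neurons = 2e8      # ~200 million, consistent with the 0.05% ratio quoted above
human_neurons   = 8.6e10   # ~86 billion, the commonly cited human estimate

print(f"shrimp / chicken neurons: {shrimp_neurons / chicken_neurons:.3%}")  # ~0.050%
print(f"shrimp / human neurons:   {shrimp_neurons / human_neurons:.5%}")    # ~0.00012%

# The article's placeholder puts shrimp suffering at 3% of human suffering,
# which is several orders of magnitude above naive neuron-count scaling.
print(f"3% vs. naive scaling: {0.03 / (shrimp_neurons / human_neurons):,.0f}x higher")  # ~25,800x
```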
2. adrian_b ◴[] No.42173750[source]
The ratio of the neuron numbers may be somewhat meaningful when comparing vertebrates with vertebrates and arthropods with arthropods, but it is almost completely meaningless when comparing vertebrates with arthropods.

The reason is that the structure of the nervous systems of arthropods is quite different from that of the vertebrates. Comparing them is like comparing analog circuits and digital circuits that implement the same function, e.g. a number multiplier. The analog circuit may have a dozen transistors and the digital circuit may have hundreds of transistors, but they do the same thing (with different performance characteristics).

The analogy with comparing analog and digital circuits is quite appropriate, because parts of the nervous systems that have the same function, e.g. controlling a leg muscle, may have hundreds or thousands of neurons in a vertebrate, which function in an all-or-nothing manner, while in an arthropod the equivalent part may have only a few neurons that function in a much more complex manner in order to achieve fine control of the leg movement.

So typically one arthropod neuron is equivalent to many more vertebrate neurons, e.g. hundreds or even thousands.

This does not mean that the nervous system of arthropods is better than that of vertebrates. They are optimized for different criteria. A vertebrate cannot become as small as the smallest arthropods, nor can an arthropod become as big as the largest vertebrates: the systems that integrate the organs of a body into a single living organism, i.e. the nervous, circulatory, and respiratory systems, are optimized for small size in arthropods and for large size in vertebrates.

replies(2): >>42173967 #>>42174406 #
3. sodality2 ◴[] No.42173861[source]
> “Shouldn’t you give neuron counts more weight in your estimates?”

Rethink Priorities [0] has a FAQ entry on this [1].

[0]: https://rethinkpriorities.org/research-area/welfare-range-es...

[1]: https://forum.effectivealtruism.org/posts/Mfq7KxQRvkeLnJvoB/...

replies(1): >>42174342 #
4. 0xDEAFBEAD ◴[] No.42173967[source]
Interesting.

I'm fairly puzzled by sensation/qualia. The idea that there's some chemical reaction in my brain which produces sensation as a side effect is very weird. In principle it seems like you ought to be able to pare things down in order to produce a "minimal chemical reaction" for suffering, and do "suffering chemistry" in a beaker (if you were feeling unethical). That's really trippy.

People often talk about suffering in conjunction with consciousness, but in my mind information processing and suffering are just different phenomena:

* Children aren't as good at information processing, but they are even more capable of suffering.

* I wouldn't like to be kicked while sleeping or blackout drunk, even if I were incapable of information processing at the time and had no memory of the event.

So intuitively it seems like more neurons = more "suffering chemistry" = greater moral weight. However, I imagine that perhaps the amount of "suffering chemistry" required to motivate an organism is actually fairly constant regardless of its size. Same way a gigantic cargo ship and a small children's toy could in principle be controlled by the same tiny microchip. That could explain the moral weight result.

Interested to hear any thoughts.

replies(2): >>42174304 #>>42183853 #
5. adrian_b ◴[] No.42174304{3}[source]
While animals with complex nervous systems, like humans and many mammals and birds, may also suffer for psychological reasons, such as the absence or death of someone beloved, suffering from physical pain is present in most, if not all, animals.

The sensation of pain is provided by dedicated sensory neurons, just as other sensory neurons are specialized for sensing light, sound, smell, taste, temperature, tactile pressure, gravity, force in the muscles/tendons, electric currents, magnetic fields, radiant heat a.k.a. infrared light, and so on (some of these sensors exist only in some non-human animals).

The pain-sensing neurons, a.k.a. nociceptors, can be identified anatomically in some of the better studied animals, including humans, but it is likely that they also exist in most other animals, with the possible exception of some parasitic or sedentary animals, where all the sense organs are strongly reduced.

So all animals that have such pain-sensing neurons are certain to suffer.

The nociceptors are activated by various stimuli: either by otherwise normal stimuli that exceed some pain threshold, e.g. overly intense light or noise, or by substances released by damaged cells in their neighborhood.

replies(1): >>42174507 #
6. mistercow ◴[] No.42174342[source]
Which I referenced and called arbitrary.
replies(1): >>42174793 #
7. mistercow ◴[] No.42174406[source]
I totally agree that you can’t just do a 1:1 comparison. My point is not to say that a shrimp suffers .05% as much as a chicken, but to use a chicken as a point of reference to illustrate just how simple the nervous system of a shrimp is.

We’re talking about a scale here where we have to question whether the notion of suffering is applicable at all before we try to put it on any kind of spectrum.

8. 0xDEAFBEAD ◴[] No.42174507{4}[source]
Interesting. So how about counting nociceptors for moral weight?

What specifically makes it so the pain neurons cause pain and the pleasure neurons cause pleasure? Supposing I invented a sort of hybrid neuron, with some features of a pain neuron and some features of a pleasure neuron -- is there any way a neuroscientist could look at its structure+chemistry and predict whether it will produce pleasure vs. pain?

replies(1): >>42176204 #
9. sodality2 ◴[] No.42174793{3}[source]
Your claim that it's arbitrary doesn't really have much weight without further reasoning.
replies(1): >>42175504 #
10. NoMoreNicksLeft ◴[] No.42175438[source]
> I don’t really doubt that it’s in principle possible to assign percentage values to suffering intensity, but the 3% value (which the source admits is a “placeholder”) seems completely unhinged for an animal with 0.05% as many neurons as a chicken,

There is a simple explanation for the confusion that this causes you and the other people in this thread: suffering's not real. It's a dumb gobbledygook term that in the most generous interpretation refers to a completely subjective experience that is not empirical or measurable.

The author uses the word "imagine" three times in the first two paragraphs for a reason. Then he follows up with a fake picture of anthropomorphic shrimp. This is some sort of con game. And you're all falling for it. He's not scamming money out of you, instead he wants to convert you to his religious-dietary-code-that-is-trying-to-become-a-religion.

Shrimp are food. They have zero moral weight.

replies(2): >>42175771 #>>42177293 #
11. mistercow ◴[] No.42175504{4}[source]
The problem is that the reasoning they give is so vague that there isn’t really anything to argue against. At best, they convincingly argue, in an extremely non-information-dense way, that neuron count isn’t everything, which is obviously true. They do not manage to argue convincingly that a 100k neuron system is something that we can even apply the word “suffering” to meaningfully.
replies(1): >>42178970 #
12. mistercow ◴[] No.42175771[source]
Denying the existence of something that you and everyone else has experienced is certainly an approach.

Look, I’m not going to defend the author here. The linked report reads to me like the output of a group of people who have become so insulated in their thinking on this subject that they’ve totally lost perspective. They give an 11% prior probability of earthworm sentience based on proxies like “avoiding noxious stimuli”, which is… really something.

But I’m not so confused by a bad set of arguments that I think suffering doesn’t exist.

replies(1): >>42176760 #
13. adrian_b ◴[] No.42176204{5}[source]
Even if this is not well understood, it is likely that any differences between the pain neurons and any other sensory neurons are not essential.

It is likely that what matters is only where they are connected in the sensory paths that carry information about sensations towards the central nervous system. Probably any signal coming into the central nervous system on the paths dedicated to pain is interpreted as pain, just as a signal coming through the optic nerves would be interpreted as light, even when it is caused by an impact on the head.

https://en.wikipedia.org/wiki/Nociception

14. NoMoreNicksLeft ◴[] No.42176760{3}[source]
> Denying the existence of something that you and everyone else has experienced is certainly an approach.

You've experienced this mystical thing, and so you know it's true?

> They give an 11% prior probability of earthworm sentience

I'm having trouble holding in the laughter. But you don't seem to understand how dangerously deranged these people are. They'll convert you to their religion by hook or crook.

replies(2): >>42179635 #>>42180467 #
15. abemiller ◴[] No.42177293[source]
Using some italics with an edgy claim doesn't allow you to cut through centuries of philosophy. It's almost as if, when philosophers coin a term like "subjective experience" and thousands of people use it coherently in discussion, it actually has semantic value. It exists in the intersubjective space between people who communicate with shared concepts.

I don't have much to say about the shrimp, but I find it deeply sad when people convince themselves that they don't really exist as a thinking, feeling thing. It's self-repression to the maximum, and it carries the implication that you and all other humans have no value.

If you don't have certain measurable proof either way, why would you choose to align with the most grim possible skeptical beliefs? Listen to some music or something - don't you hear the sounds?

replies(1): >>42177437 #
16. NoMoreNicksLeft ◴[] No.42177437{3}[source]
> Using some italics with an edgy claim

There is nothing edgy about it. You can't detect it, you can't measure it, and if the word had any applicability (to say, humans), then you're also misapplying it. If it is your contention that suffering is something-other-than-subjective, then you're the one trying to be edgy. Not I.

The way sane, reasonable people describe subjective phenomena that we can't detect or measure is "not real". When we're talking about decapods, it can't even be self-reported.

> but I find it deeply sad when people convince themselves that they don't really exist as a thinking, feeling thing. It's self repression to the maximum,

Says the guy agreeing with a faction that seeks to convince people that shrimp are anything other than food. That if for some reason we need to euthanize them, they must be laid down on a velvet pillow to listen to symphonic music and watch films of the beautiful Swiss mountain countryside until their last gasp.

"Sad" is letting yourself be manipulated so that some other religion can enforce its noodle-brained dietary laws on you.

> If you don't have certain measurable proof either way

I'm not obligated to prove the negative.

replies(2): >>42179987 #>>42180220 #
17. BenthamsBulldog ◴[] No.42178970{5}[source]
But if neuron counts, as they show, have no reliable correlation (and in some cases an inverse correlation) with valenced experience, why would we use them to rule out intense experiences?
replies(1): >>42180518 #
18. bulletsvshumans ◴[] No.42179635{4}[source]
Setting aside the shrimp, are you denying that any humans, including yourself, experience suffering?
replies(1): >>42184142 #
19. jhanschoo ◴[] No.42179987{4}[source]
> If it is your contention that suffering is something-other-than-subjective, then you're the one trying to be edgy.

You do feel pain and hunger, at least to the extent that you experience touch. You can in fact be more certain of that than of anything conventionally thought to be objective, such as physical models of the world, for it is only through your perception that you receive those models, or the evidence to build them.

The notion of suffering used in the paper is primarily with respect to pain and pleasure.

Now, you may deny that shrimp feel pain and pleasure. It's also possible to deny that other people feel pain and pleasure. But you do feel pain and pleasure, and you always engage in behaviors in response to these sensations; your senses also inform you secondarily that many other people abide by similar rules.

Many animals like us are fundamentally sympathetic to pain and pleasure. That is, observing behavior related to pain and pleasure impels a related feeling in ourselves, in certain contexts, though not necessarily an identical one. This mechanism is quite obvious when you observe parents caring for their young, herd behavior, etc. With this established, some people are in a context where they are sympathetic to the observed pain and pleasure of nonhuman animals, in this case shrimp rather than cats and dogs, and such a study helps one figure out this relationship in more detail.

20. holden_nelson ◴[] No.42180220{4}[source]
> You can’t detect it, you can’t measure it

Eh, perhaps we can’t detect it perfectly reliably, but we can absolutely detect it. Go to a funeral and observe a widow in anguish. Just because we haven’t (yet) built a machine to detect or measure it doesn’t mean it doesn’t exist.

replies(1): >>42184182 #
21. mistercow ◴[] No.42180467{4}[source]
> You've experienced this mystical thing, and so you know it's true?

Suffering is experience, and my own internal experiences are the things that I can be most certain of. So in this case, yes. I don’t know why you’re calling it “mystical” though.

> They'll convert you to their religion by hook or crook.

I have a lot more confidence in my ability to evaluate arguments than you seem to.

22. mistercow ◴[] No.42180518{6}[source]
This feels like a severe motte and bailey. The motte is “neuron count is not perfectly linearly correlated with sentience” and the bailey is “neuron count has no reliable correlation with sentience”.

The priors referenced in the “technical details” doc (and good lord, why does everything about this require me to dereference three or four layers of pointers to get to basic answers to methodological questions?) appear to be based entirely on proxies like:

> Responses such as moving away, escaping, and avoidance, that seem to account for noxious stimuli intensity and direction.

This is a proxy that applies to slime molds and Roombas, yet I notice that neither of those made the table. Why not?

I suspect that the answer is that at least when it comes to having zero neurons, the correlation suddenly becomes pretty obviously reliable after all.

23. dleary ◴[] No.42183853{3}[source]
> Children aren't as good at information processing, but they are even more capable of suffering.

That is too strong a statement to just toss out there like that. And I don’t even think it’s true.

I think children probably feel pain more intensely than adults. But there are many more dimensions to suffering than pain. And many of those dimensions are beyond the ken of children.

Children will not know the suffering that comes from realizing that you have ruined a relationship that you value, and it was entirely your fault.

They will not know the kind of suffering that’s behind Imposter Syndrome after getting into MIT.

Or the suffering that comes from realizing that your heroin addiction will never be as good as the first time you shot up. Or that same heroin addict knowing that they are betraying their family, stealing their mother’s jewelry to pawn, and doing it anyway.

Or the suffering of a once-great athlete coming to terms with the fact that they are washed up and that life is over now.

Or the suffering behind their favorite band splitting up.

Or the suffering behind winning Silver at the Olympics.

Or the agony of childbirth.

Perhaps most importantly, one of the greatest sorrows of all: losing your own child.

Et cetera

24. NoMoreNicksLeft ◴[] No.42184142{5}[source]
Humans self-report "suffering". Strangely, those who claim it the most enthusiastically don't seem to be experiencing pain from disease or injury.

I would hesitate to use that word myself, though my personal experiences have, at times, been somewhat similar to those who do use the word.

25. NoMoreNicksLeft ◴[] No.42184182{5}[source]
> Eh, perhaps we can’t detect it perfectly reliably, but we can absolutely detect it. Go to a funeral and observe a widow in anguish.

If your definition of suffering describes both the widow grieving a lost husband and a shrimp slowly slipping into whatever the shrimp equivalent of unconsciousness is in an ice-water bath... it doesn't seem to be a very useful word.

> Just because we haven’t (yet) built a machine to

Yes, because we haven't built the machine, we can't much tell if the widow is in "anguish" or is putting on a show for the public. Some widows are living their most joyous days, but they can't always show it.