The shrimp welfare project

(benthams.substack.com)
81 points by 0xDEAFBEAD | 6 comments
n4r9 No.42173011
Apologies for focusing on just one sentence of this article, but I feel like it's crucial to the overall argument:

> ... if [shrimp] suffer only 3% as intensely as we do ...

Does this proposition make sense? It's not obvious to me that we can assign percentage values to suffering, or compare it to human suffering, or treat the values in a linear fashion.

It reminds me of that vaguely absurd thought experiment where you compare one person undergoing a lifetime of intense torture vs. billions upon billions of humans each getting a fleck of dust in their eye. I just cannot square choosing the former with my conscience. Maybe I'm too unimaginative to comprehend so many billions of flecks of dust.
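
For concreteness, the aggregation step that experiment relies on is just this (a toy sketch; every number below is invented for illustration):

    # Toy model of the torture-vs-dust-specks comparison.
    # All disutility values are invented for illustration.
    torture = 1e9        # disutility of one lifetime of intense torture
    dust_speck = 1e-6    # disutility of one fleck of dust in one eye
    people = 1e18        # "billions upon billions" of people

    # Linear aggregation says the dust specks are collectively worse:
    print(people * dust_speck > torture)  # True

For a large enough population the specks always win under linear aggregation, and that is precisely the step I can't accept.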

replies(10): >>42173107 #>>42173149 #>>42173164 #>>42173244 #>>42173255 #>>42173304 #>>42173441 #>>42175565 #>>42175936 #>>42177306 #
mistercow No.42173304
I don’t really doubt that it’s in principle possible to assign percentage values to suffering intensity. But the 3% value (which the source admits is a “placeholder”) seems completely unhinged for an animal with 0.05% as many neurons as a chicken, and the source’s justification for largely discounting neuron counts seems pretty arbitrary, at least as presented in their FAQ.
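
For scale, here is the arithmetic those two percentages imply (using the ~100k shrimp neuron count that comes up later in this thread; the chicken figure is just the implied denominator, not a number from the source):

    # What "0.05% as many neurons as a chicken" implies, given the
    # ~100k shrimp neuron count mentioned later in the thread.
    shrimp_neurons = 100_000
    neuron_ratio = 0.0005                  # 0.05%
    chicken_neurons = shrimp_neurons / neuron_ratio
    print(int(chicken_neurons))            # 200000000, i.e. ~200 million

    # The placeholder puts shrimp suffering at 3% of a *human's*, a
    # fraction far larger than the shrimp-to-chicken neuron ratio:
    print(0.03 / neuron_ratio)             # ~60x

And that ~60x understates the gap, since the 3% is relative to humans, who have far more neurons than chickens.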
replies(3): >>42173750 #>>42173861 #>>42175438 #
1. sodality2 No.42173861
> “Shouldn’t you give neuron counts more weight in your estimates?”

Rethink Priorities [0] has a FAQ entry on this [1].

[0]: https://rethinkpriorities.org/research-area/welfare-range-es...

[1]: https://forum.effectivealtruism.org/posts/Mfq7KxQRvkeLnJvoB/...

replies(1): >>42174342 #
2. mistercow No.42174342
Which I referenced and called arbitrary.
replies(1): >>42174793 #
3. sodality2 No.42174793
Your claim that it's arbitrary doesn't really have much weight without further reasoning.
replies(1): >>42175504 #
4. mistercow No.42175504
The problem is that the reasoning they give is so vague that there isn’t really anything to argue against. At best, they convincingly argue, in an extremely non-information-dense way, that neuron count isn’t everything, which is obviously true. They do not manage to argue convincingly that a 100k-neuron system is something we can even meaningfully apply the word “suffering” to.
replies(1): >>42178970 #
5. BenthamsBulldog No.42178970
But if neuron counts, as they show, have no reliable correlation (and in some cases an inverse correlation) with valenced experience, why would we use them to rule out intense experiences?
replies(1): >>42180518 #
6. mistercow No.42180518
This feels like a severe motte and bailey. The motte is “neuron count is not perfectly linearly correlated with sentience” and the bailey is “neuron count has no reliable correlation with sentience”.

The priors referenced in the “technical details” doc (and good lord, why does everything about this require me to dereference three or four layers of pointers to get to basic answers to methodological questions?) appear to be based entirely on proxies like:

> Responses such as moving away, escaping, and avoidance, that seem to account for noxious stimuli intensity and direction.

This is a proxy that applies to slime molds and Roombas, yet I notice that neither of those made the table. Why not?
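
To make that concrete, here is roughly the entire control logic needed to satisfy that proxy (a toy sketch; the names are mine):

    # A toy controller that "moves away, escapes, and avoids" in a way
    # that accounts for stimulus intensity and direction -- i.e. it
    # satisfies the quoted proxy with zero neurons.
    def avoid(intensity: float, direction: float) -> float:
        """Return a turn command pointing away from the stimulus,
        scaled by how intense it is."""
        return -direction * intensity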

I suspect the answer is that, at least when it comes to having zero neurons, the correlation suddenly becomes pretty obviously reliable after all.