
The shrimp welfare project

(benthams.substack.com)
81 points by 0xDEAFBEAD | 63 comments
1. n4r9 ◴[] No.42173011[source]
Apologies for focusing on just one sentence of this article, but I feel like it's crucial to the overall argument:

> ... if [shrimp] suffer only 3% as intensely as we do ...

Does this proposition make sense? It's not obvious to me that we can assign percentage values to suffering, or compare it to human suffering, or treat the values in a linear fashion.

It reminds me of that vaguely absurd thought experiment where you compare one person undergoing a lifetime of intense torture vs billions upon billions of humans getting a fleck of dust in their eyes. I just cannot square choosing the former with my conscience. Maybe I'm too unimaginative to comprehend so many billions of bits of dust.

replies(10): >>42173107 #>>42173149 #>>42173164 #>>42173244 #>>42173255 #>>42173304 #>>42173441 #>>42175565 #>>42175936 #>>42177306 #
2. sodality2 ◴[] No.42173107[source]
Have you read the linked paper by Norcross? "Great harms from small benefits grow: how death can be outweighed by headaches" [0].

[0]: https://www.jstor.org/stable/3328486

replies(2): >>42173211 #>>42174422 #
3. InsideOutSanta ◴[] No.42173149[source]
The article mentions that issue in passing ("I reject the claim that no number of mild bads can add up to be as bad as a single thing that’s very bad, as do many philosophers"), but I don't understand the actual argument behind this assertion.

Personally, I believe that you can't just add up mildly bad things and create a very bad thing. For example, I'd rather get my finger pricked by a needle once a day for the rest of my life than have somebody amputate my legs without anesthesia just once, even though the "cumulative pain" of the former choice might be higher than that of the latter.

Having said that, I also believe that there is sufficient evidence that shrimp suffer greatly when they are killed in the manner described in the article, and that it is worthwhile to prevent that suffering.

replies(1): >>42173260 #
4. dfedbeef ◴[] No.42173164[source]
It is regular absurd.
5. n4r9 ◴[] No.42173211[source]
No; thanks for bringing it to my attention. The first page is intriguing... I'll see if I can locate a free copy somewhere.
replies(1): >>42173263 #
6. aithrowawaycomm ◴[] No.42173244[source]
Yeah this (along with the "billion headaches" inanity) rests on a fallacy: insisting an abstraction can be measured as a quantity when it clearly cannot. This trick is usually done by blindly averaging together some concrete quantities and claiming it represents the abstraction. The illusion is fostered by "local continuity" of these abstractions - if pulling your earlobe causes suffering, pulling harder causes more suffering. And of course the "mathiness" gives an aura of rigor and rationality. But a terrible error in quantitative reasoning occurs when you break locality: going from pulled earlobe to emotional loss, or pulled earlobe to pulled antennae, etc. The very nature of the abstraction - "suffering," "badness," - changes between entities and situations, so the one formula cannot possibly apply.

ETA: see also the McNamara fallacy https://en.wikipedia.org/wiki/McNamara_fallacy

replies(1): >>42177744 #
7. ◴[] No.42173255[source]
8. aithrowawaycomm ◴[] No.42173260[source]
Their point isn't that it's merely "worthwhile," but that donating to Sudanese refugees is a waste of money because 1 starving child = 80 starving shrimp, or whatever their ghoulish and horrific math says.
replies(2): >>42173577 #>>42175592 #
9. sodality2 ◴[] No.42173263{3}[source]
Here's a copy I found: https://philosophysmith.com/wp-content/uploads/2018/07/alist...

It's pretty short, I liked it. Was surprised to find myself agreeing with it at the end of my first read.

replies(1): >>42187128 #
10. mistercow ◴[] No.42173304[source]
I don’t really doubt that it’s in principle possible to assign percentage values to suffering intensity, but the 3% value (which the source admits is a “placeholder”) seems completely unhinged for an animal with 0.05% as many neurons as a chicken, and the source’s justification for largely discounting neuron counts seems pretty arbitrary, at least as presented in their FAQ.
replies(3): >>42173750 #>>42173861 #>>42175438 #
11. 0xDEAFBEAD ◴[] No.42173441[source]
The way I think about it is that we're already making decisions like this in our own lives. Imagine a teenager who gets a summer job so they can save for a PS5. The teenager is making an implicit moral judgement, with themselves as the only moral patient. They're judging that the negative utility from working the job is lower in magnitude than the positive utility that the PS5 would generate.

If the teenager gets a job offer, but the job only pays minimum wage, they may judge that the disutility for so many hours of work actually exceeds the positive utility from the PS5. There seems to be a capability to estimate the disutility from a single hour of work, and multiply it across all the hours which will be required to save enough.

It would be plausible for the teenager to argue that the disutility from the job exceeds the utility from the PS5, or vice versa. But I doubt many teenagers would tell you "I can't figure out if I want to get a job, because the utilities simply aren't comparable!" Incomparability just doesn't seem to be an issue in practice for people making decisions about their own lives.
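The implicit cost-benefit reasoning here can be written out as a toy calculation. Everything in this sketch is an invented placeholder (the utility numbers, the price, the function name), just to show the shape of the judgement:

```python
# Toy model of the teenager's implicit utility judgement.
# All numbers are invented placeholders, not empirical estimates.
PS5_UTILITY = 100.0         # positive utility of owning the PS5
DISUTILITY_PER_HOUR = 2.0   # unpleasantness of one hour of work
PS5_PRICE = 500.0           # dollars

def job_worth_taking(hourly_wage: float) -> bool:
    """Multiply per-hour disutility across the hours needed, then compare."""
    hours_needed = PS5_PRICE / hourly_wage
    return PS5_UTILITY > hours_needed * DISUTILITY_PER_HOUR

print(job_worth_taking(20.0))   # 25 hours of work: worth it on this model
print(job_worth_taking(7.25))   # ~69 hours: the disutility now dominates
```

The point is only that the comparison is routine, not that the numbers are knowable with precision.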

Here's another thought experiment. Imagine you get laid off from your job. Times are tough, and your budget is tight. Christmas is coming up. You have two children and a pet. You could get a fancy present for Child A, or a fancy present for Child B, but not both. If you do buy a fancy present, the only way to make room in the budget is to switch to a less tasty food brand for your pet.

This might be a tough decision if the utilities are really close. But if you think your children will mostly ignore their presents in order to play on their phones, and your pet gets incredibly excited every time you feed them the more expensive food brand, I doubt you'll hesitate on the basis of cross-species incomparability.

I would argue that the shrimp situation sits closer to these sorts of everyday "common sense" utility judgments than an exotic limiting case such as torture vs dust specks. I'm not sure dust specks have any negative utility at all, actually. Maybe they even have positive utility, if they trigger a blink which is infinitesimally pleasant. If I change it from specks to bee stings, it seems more intuitive that there's some astronomically large number of bee stings such that torture would be preferable.

It's also not clear to me what I should do when my intuitions and mathematical common sense come into conflict. As you suggest, maybe if I spent more time really trying to wrap my head around how astronomically large a number can get, my intuition would line up better with math.

Here's a question on the incomparability of excruciating pain. Back to the "moral judgements for oneself" theme... How many people would agree to get branded with a hot branding iron in exchange for a billion dollars? I'll bet at least a few would agree.

replies(2): >>42173720 #>>42174022 #
12. 0xDEAFBEAD ◴[] No.42173577{3}[source]
>donating to Sudanese refugees is a waste of money

Donating to Sudanese refugees sounds like a great use of money. Certainly not a waste.

Suboptimal isn't the same as wasteful. Suppose you sit down to eat a great meal at a restaurant. As you walk out, you realize that you could have gotten an even better meal for the same price at the restaurant next door. That doesn't mean you just wasted your money.

>ghoulish and horrific math

It's not the math that's horrific, it's the world we live in that's horrific. The math just helps us alleviate the horror better.

Researcher: "Here's my study which shows that a new medication reduces the incidence of incredibly painful kidney stones by 50%." Journal editorial board: "We refuse to publish this ghoulish and horrific math."

13. hansvm ◴[] No.42173720[source]
> How many people would agree to get branded with a hot branding iron in exchange for a billion dollars?

Temporary pain without any meaningful lasting injuries? I do worse long-term damage than that at my actual job just in neck and wrist damage and not being sufficiently active (on a good day I get 1-2hrs, but that doesn't leave much time for other things), and I'm definitely not getting paid a billion for it.

replies(1): >>42174000 #
14. adrian_b ◴[] No.42173750[source]
The ratio of the neuron numbers may be somewhat meaningful when comparing vertebrates with vertebrates and arthropods with arthropods, but it is almost completely meaningless when comparing vertebrates with arthropods.

The reason is that the structure of the nervous systems of arthropods is quite different from that of the vertebrates. Comparing them is like comparing analog circuits and digital circuits that implement the same function, e.g. a number multiplier. The analog circuit may have a dozen transistors and the digital circuit may have hundreds of transistors, but they do the same thing (with different performance characteristics).

The analogy with comparing analog and digital circuits is quite appropriate, because parts of the nervous systems that have the same function, e.g. controlling a leg muscle, may have hundreds or thousands of neurons in a vertebrate, which function in an all-or-nothing manner, while in an arthropod the equivalent part may have only a few neurons that function in a much more complex manner in order to achieve fine control of the leg movement.

So typically one arthropod neuron is equivalent to many vertebrate neurons, e.g. hundreds or even thousands.

This does not mean that the nervous system of arthropods is better than that of vertebrates; they are optimized for different criteria. A vertebrate cannot become as small as the smallest arthropods, nor can an arthropod become as big as the bigger vertebrates: the systems that integrate the organs of a body into a single living organism, i.e. the nervous, circulatory, and respiratory systems, are optimized for small size in arthropods and for large size in vertebrates.

replies(2): >>42173967 #>>42174406 #
15. sodality2 ◴[] No.42173861[source]
> “Shouldn’t you give neuron counts more weight in your estimates?”

Rethink Priorities [0] has a FAQ entry on this [1].

[0]: https://rethinkpriorities.org/research-area/welfare-range-es...

[1]: https://forum.effectivealtruism.org/posts/Mfq7KxQRvkeLnJvoB/...

replies(1): >>42174342 #
16. 0xDEAFBEAD ◴[] No.42173967{3}[source]
Interesting.

I'm fairly puzzled by sensation/qualia. The idea that there's some chemical reaction in my brain which produces sensation as a side effect is very weird. In principle it seems like you ought to be able to pare things down in order to produce a "minimal chemical reaction" for suffering, and do "suffering chemistry" in a beaker (if you were feeling unethical). That's really trippy.

People often talk about suffering in conjunction with consciousness, but in my mind information processing and suffering are just different phenomena:

* Children aren't as good at information processing, but they are even more capable of suffering.

* I wouldn't like to be kicked if I was sleeping, or blackout drunk, even if I was incapable of information processing at the time and had no memory of the event.

So intuitively it seems like more neurons = more "suffering chemistry" = greater moral weight. However, I imagine that perhaps the amount of "suffering chemistry" required to motivate an organism is actually fairly constant regardless of its size. Same way a gigantic cargo ship and a small children's toy could in principle be controlled by the same tiny microchip. That could explain the moral weight result.

Interested to hear any thoughts.

replies(2): >>42174304 #>>42183853 #
17. 0xDEAFBEAD ◴[] No.42174000{3}[source]
Sorry to hear about your neck and wrist. I like this site:

https://www.painscience.com/

This article was especially helpful:

https://www.painscience.com/tutorials/trigger-points.php

I suspect the damage you're concerned about is reversible, if you're sufficiently persistent with research and experimentation. That's been my experience with chronic pain.

18. sixo ◴[] No.42174022[source]
> The teenager is making an implicit moral judgement, with themselves as the only moral patient.

No they're not! You have made a claim of the form "these things are the same thing"—but it only seems that way if you can't think of a single plausible alternative. Here's one:

* Humans are motivated by two competing drives. The first drive we can call "fear", which aims to avoid suffering, either personally or in people you care about or identify with. This derives from our natural empathic instinct, but it can be extended by a social construction of group identity. So, the shrimp argument is saying "your avoiding-suffering instinct can and should be applied to crustaceans too", which is contrary to how most people feel. Fear also includes "fear of ostracization", this being equivalent to death in a prehistoric context.

* The second drive is "thriving" or "growing" or "becoming yourself", and leads you to glimpse the person you could be, things you could do, identities you could hold, etc, and to strive to transform yourself into those things. The teenager ultimately wants the PS5 because they've identified with it in some way—they see it as a way to express themself. Their "utilitarian" actions in this context are instrumental, not moral—towards the attainment of what-they-want. I think, in this simple model, I'd also broaden this drive to include "eating meat"—you don't do this for the animal or to abate suffering, you do it because you want to: your body's hungry, you desire the pleasure of satiation, and you act to realize that desire.

* The two drives are not the same, and in the case of eating meat are directly opposed. (You could perhaps devise a way to see either as, ultimately, an expression of the other.) Human nature, then, basically undertakes the "thriving" drive except when there's a threat of suffering, in which case we switch gears to "fear" until it's handled.

* Much utilitarian discourse seems to exist in a universe where the apparently-selfish "thriving" drive doesn't exist, or has been moralized out of existence—because it doesn't look good on paper. But, however it sounds, it in fact exists, and you will find that almost all living humans will defend their right to express themselves, sometimes to the death. This is at some level the essence of life, and the rejection of it leads many people to view EA-type utilitarianism as antithetical to life itself.

* One reason for this is that "fear-mode thinking" is cognitively expensive, and while people will maintain it for a while, they will eventually balk against it, no matter how reasonable it seems (probably this explains the last decade of American politics).

replies(1): >>42174453 #
19. adrian_b ◴[] No.42174304{4}[source]
While in animals with complex nervous systems like humans and also many mammals and birds there may be psychological reasons for suffering, like the absence or death of someone beloved, suffering from physical pain is present in most, if not all animals.

The sensation of pain is provided by dedicated sensory neurons, like other sensory neurons are specialized for sensing light, sound, smell, taste, temperature, tactile pressure, gravity, force in the muscles/tendons, electric currents, magnetic fields, radiant heat a.k.a. infrared light and so on (some of these sensors exist only in some non-human animals).

The pain-sensing neurons, a.k.a. nociceptors, can be identified anatomically in some of the better studied animals, including humans, but it is likely that they also exist in most other animals, with the possible exception of some parasitic or sedentary animals, where all the sense organs are strongly reduced.

So all animals with such sensory neurons that cause pain are certain to suffer.

The nociceptors are activated by various stimuli, e.g. either by otherwise normal stimuli that exceed some pain threshold, e.g. too intense light or noise, or by substances generated by damaged cells from their neighborhood.

replies(1): >>42174507 #
20. mistercow ◴[] No.42174342{3}[source]
Which I referenced and called arbitrary.
replies(1): >>42174793 #
21. mistercow ◴[] No.42174406{3}[source]
I totally agree that you can’t just do a 1:1 comparison. My point is not to say that a shrimp suffers .05% as much as a chicken, but to use a chicken as a point of reference to illustrate just how simple the nervous system of a shrimp is.

We’re talking about a scale here where we have to question whether the notion of suffering is applicable at all before we try to put it on any kind of spectrum.

22. probably_wrong ◴[] No.42174422[source]
I read the paper and I believe the same objection applies: the reasoning only works if you assume "pain" to be a constant number subject to the additive property.

If we have to use math, I'd say: the headaches are temporal - the effect of all the good you've done today is effectively gone tomorrow one way or another. But killing a person means, to quote "Unforgiven", that "you take away everything he's got and everything he's ever gonna have". So the calculation needs at least a temporal discount factor.

I also believe that the examples are too contrived to be actually useful. Comparing a room with one person to another with five million is like comparing the fine for a person traveling at twice the speed limit with that of someone traveling at 10% the speed of light - the results of such an analysis are entertaining to think about, but not actually useful.

replies(1): >>42175614 #
23. 0xDEAFBEAD ◴[] No.42174453{3}[source]
I find myself motivated to alleviate suffering in other beings. It feels good that a quarter million shrimp are better off because I donated a few hundred bucks. It makes me feel like my existence on this planet is worthwhile. I did my good deed for the day.

There was a time when my good deeds were more motivated by fear. I found that fear wasn't a good motivator. This has become the consensus view in the EA community. EAs generally think it's important to avoid burnout. After reworking my motivations, doing good now feels like a way to thrive, not a way to avoid fear. The part of me which was afraid feels good about this development, because my new motivational structure is more sustainable.

If you're not motivated to alleviate suffering in other beings, it is what it is. I'm not going to insult you or anything. However, if I notice you insulting others over moral trifles, I might privately think to myself that you are being hyperbolic. When I put my EA-type utilitarian hat on, almost all internet fighting seems to lack perspective.

I support your ability to express yourself. (I'm a little skeptical that's the main driver of the typical PS5 purchase, but that's beside the point.) I want you to thrive! I consume meat, so I can't condemn you for consuming meat. I did try going vegan for a bit, but a vegan diet was causing fatigue. I now make a mild effort to eat a low-suffering diet. I also donate to https://gfi.org/ to support research into alternative meats. (I think it's plausible that the utilitarian impact of my diet+donations is net positive, since the invention of viable alternative meats could have such a large impact.) And whenever I get the chance, I rant about the state of vegan nutrition online, in the hope that vegans will notice my rants and improve things.

(Note that I'm not a member of the EA community, but I agree with aspects of the philosophy. My issues with the community can go in another thread.)

(I appreciate you writing this reply. Specifically, I find myself wondering if utilitarian advocacy would be more effective if what I just wrote, about the value of rejecting fear-style motivation, was made explicit from the beginning. It could make utilitarianism both more appealing and more sustainable.)

replies(1): >>42180156 #
24. 0xDEAFBEAD ◴[] No.42174507{5}[source]
Interesting. So how about counting nociceptors for moral weight?

What specifically makes it so the pain neurons cause pain and the pleasure neurons cause pleasure? Supposing I invented a sort of hybrid neuron, with some features of a pain neuron and some features of a pleasure neuron -- is there any way a neuroscientist could look at its structure+chemistry and predict whether it will produce pleasures vs pain?

replies(1): >>42176204 #
25. sodality2 ◴[] No.42174793{4}[source]
Your claim that it's arbitrary doesn't really have much weight without further reasoning.
replies(1): >>42175504 #
26. NoMoreNicksLeft ◴[] No.42175438[source]
> I don’t really doubt that it’s in principle possible to assign percentage values to suffering intensity, but the 3% value (which the source admits is a “placeholder”) seems completely unhinged for an animal with 0.05% as many neurons as a chicken,

There is a simple explanation for the confusion that this causes you and the other people in this thread: suffering's not real. It's a dumb gobbledygook term that in the most generous interpretation refers to a completely subjective experience that is not empirical or measurable.

The author uses the word "imagine" three times in the first two paragraphs for a reason. Then he follows up with a fake picture of anthropomorphic shrimp. This is some sort of con game. And you're all falling for it. He's not scamming money out of you, instead he wants to convert you to his religious-dietary-code-that-is-trying-to-become-a-religion.

Shrimp are food. They have zero moral weight.

replies(2): >>42175771 #>>42177293 #
27. mistercow ◴[] No.42175504{5}[source]
The problem is that the reasoning they give is so vague that there isn’t really anything to argue against. At best, they convincingly argue, in an extremely non-information-dense way, that neuron count isn’t everything, which is obviously true. They do not manage to argue convincingly that a 100k neuron system is something that we can even apply the word “suffering” to meaningfully.
replies(1): >>42178970 #
28. BenthamsBulldog ◴[] No.42175565[source]
Seems possible in principle. Experiences can cause one to feel more or less pain--what's wrong with quantifying it? Sure it will be a bit handwavy and vague, but the alternative of doing no comparisons and just going based on vibes is worse https://www.goodthoughts.blog/p/refusing-to-quantify-is-refu.... But as I argue, given high uncertainty, you don't need any fine grained estimates to think giving to shrimp welfare is valuable. Like, if there was a dollar in front of you and you could use it to save 16,000 shrimp, seems like that's a good use of it.
replies(1): >>42175658 #
29. BenthamsBulldog ◴[] No.42175592{3}[source]
It's not a waste as another commenter noted, just probably not the best use of money.

I agree this is unintuitive, but I submit that's because of speciesism. What about shrimp makes it so that tens of millions of them painfully dying is less bad than a single human death? It doesn't seem like the fact that they aren't smart makes their extreme agony less bad (the badness of a headache doesn't depend on how smart you are).

replies(1): >>42176515 #
30. BenthamsBulldog ◴[] No.42175614{3}[source]
No, that isn't true. We can consider some metric like being at some temperature for an hour. Start with some truly torturous pain like being at 500 degrees for an hour (you'd die quickly, ofc). One person being at 500 degrees is less bad than 10 at 499 degrees which is less bad than 100 at 498 degrees...which is less bad than some number at 85 degrees (not torture, just a bit unpleasant).
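The chain being gestured at can be generated mechanically (a sketch only; the 1-degree step and factor-of-10 population growth are taken directly from the comment above):

```python
# Walk the chain from (500 degrees, 1 person) down to (85 degrees, N people),
# lowering the temperature by 1 degree and multiplying the headcount by 10
# at each step, as in the argument above.
temp, people = 500, 1
chain = []
while temp >= 85:
    chain.append((temp, people))
    temp -= 1
    people *= 10

# 415 steps later, the "just a bit unpleasant" end of the chain involves
# 10**415 people; the transitivity claim is that each link is a worsening.
```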
replies(1): >>42176288 #
31. kaashif ◴[] No.42175658[source]
> Like, if there was a dollar in front of you and you could use it to save 16,000 shrimp, seems like that's a good use of it.

Uhh, that's totally unintuitive and surely almost all people would disagree, right?

If not in words, people disagree in actions. Even within effective altruism there are a lot of people only giving to human centred causes.

32. mistercow ◴[] No.42175771{3}[source]
Denying the existence of something that you and everyone else has experienced is certainly an approach.

Look, I’m not going to defend the author here. The linked report reads to me like the output of a group of people who have become so insulated in their thinking on this subject that they’ve totally lost perspective. They give an 11% prior probability of earthworm sentience based on proxies like “avoiding noxious stimuli”, which is… really something.

But I’m not so confused by a bad set of arguments that I think suffering doesn’t exist.

replies(1): >>42176760 #
33. jjcm ◴[] No.42175936[source]
No the proposition doesn’t make sense. The 3% number comes from this: https://rethinkpriorities.org/research-area/welfare-range-es...

The page gives 3% to shrimp because their lifespan is 3% that of humans. It’s a terrible avenue for this estimate. By the same estimate, giant tortoises are less ethical to kill than humans. The heavens alone can judge you for the war crimes you’d be committing by killing a Turritopsis dohrnii.

Number of neurons is the least-bad objective measurement in my eyes. Arthropods famously have very few neurons, <100k compared to 86b in humans. That’s a 1:1000000 neuron ratio, which feels like a more appropriate ratio for suffering than a lifespan-based ratio, though both are terrible.
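A quick sanity check on the arithmetic, using the figures quoted above (86 billion human neurons, ~100k as an arthropod upper bound):

```python
# Ratio of the neuron counts quoted above: ~86 billion (human)
# to an upper bound of ~100k (arthropod).
human_neurons = 86_000_000_000
arthropod_neurons = 100_000

ratio = human_neurons // arthropod_neurons
print(ratio)  # 860000 -- i.e. on the order of the 1:1,000,000 quoted
```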

replies(2): >>42176353 #>>42178958 #
34. adrian_b ◴[] No.42176204{6}[source]
Even if this is not well understood, it is likely that any differences between the pain neurons and any other sensory neurons are not essential.

It is likely that it only matters where they are connected in the sensory paths that carry the information about sensations towards the central nervous system. Probably any signal coming into the central nervous system on those paths dedicated for pain is interpreted as pain, like a signal coming through the optical nerves would be interpreted as light, even when it would be caused by an impact on the head.

https://en.wikipedia.org/wiki/Nociception

35. n4r9 ◴[] No.42176288{4}[source]
I think OP's objection is that - even granting that a "badness value" can be assigned to headaches and that 3 people with headaches is worse than 2 - there's no clear reason to suppose that 3 is exactly half again as bad as 2. It may be that the function mapping headaches to badness is logarithmic, or even that it asymptotes towards some limit. In mathematical terms it can be both monotonic and bounded.

Thus, when comparing headaches to a man being tortured, there's no clear reason to suppose that there is a number of headaches that is worse than the torture.
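A monotonic-but-bounded badness function is easy to exhibit concretely. This is only a toy illustration of the mathematical possibility being described (the constants and the exponential form are arbitrary choices, not proposed moral weights):

```python
import math

TORTURE_BADNESS = 1000.0  # badness of the torture case (arbitrary units)
LIMIT = 500.0             # asymptotic cap on aggregated headache badness

def aggregated_badness(n_headaches: int) -> float:
    """Strictly increasing in n, yet bounded above by LIMIT."""
    return LIMIT * (1.0 - math.exp(-n_headaches / LIMIT))

# More headaches is always worse...
assert aggregated_badness(10**6) > aggregated_badness(10**3)
# ...but no number of headaches ever exceeds the torture:
assert all(aggregated_badness(10**k) < TORTURE_BADNESS for k in range(12))
```

Under a function like this, "3 headaches is worse than 2" holds everywhere, while "some number of headaches outweighs the torture" never does.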

replies(2): >>42177503 #>>42178980 #
36. aziaziazi ◴[] No.42176353[source]
Not only lifespan. From the link you quote:

> Capacity for welfare = welfare range × lifespan. An individual’s welfare range is the difference between the best and worst welfare states the individual can realize.

> we rely on indirect measures even in humans: behavior, physiological changes, and verbal reports. We can observe behavior and physiological changes in nonhumans, but most of them aren’t verbal. So, we have to rely on other indirect proxies, piecing together an understanding from animals’ cognitive and affective traits or capabilities.

First time I see this "welfare range" notion and it seems quite clever to me.

Also the original article says 3.1% is the median while the mean is 19%. I guess that may be caused by individuals having different experiences from each other.

replies(1): >>42178954 #
37. Vecr ◴[] No.42176515{4}[source]
How much of your posting is sophistry? I assume this isn't (I doubt it improves perceptions of EA), but the God stuff makes very close to no sense at all.

If it's sophistry anyway, can't you take Eliezer's position and say God doesn't exist, and some CEV like system is better than Bentham style utilitarianism because there's not an objective morality?

I don't think CEV makes much sense, but I think you're scoring far fewer points than you think you are, even relative to something like that.

38. NoMoreNicksLeft ◴[] No.42176760{4}[source]
> Denying the existence of something that you and everyone else has experienced is certainly an approach.

You've experienced this mystical thing, and so you know it's true?

> They give an 11% prior probability of earthworm sentience

I'm having trouble holding in the laughter. But you don't seem to understand how dangerously deranged these people are. They'll convert you to their religion by hook or crook.

replies(2): >>42179635 #>>42180467 #
39. abemiller ◴[] No.42177293{3}[source]
Using some italics with an edgy claim doesn't allow you to cut through centuries of philosophy. It's almost as if, when philosophers coined the term "subjective experience" and thousands have since used it in coherent discussion, it actually acquired semantic value. It exists in the intersubjective space between people who communicate with shared concepts.

I don't have much to say about the shrimp, but I find it deeply sad when people convince themselves that they don't really exist as a thinking, feeling thing. It's self repression to the maximum, and carries the implication that yourself and all humans have no value.

If you don't have certain measurable proof either way, why would you choose to align with the most grim possible skeptical beliefs? Listen to some music or something - don't you hear the sounds?

replies(1): >>42177437 #
40. hazbot ◴[] No.42177306[source]
I'm willing to run with the 3% figure... But I take issue with the linearity assumption that torturing 34 shrimp is thus worse than torturing a human!
41. NoMoreNicksLeft ◴[] No.42177437{4}[source]
> Using some italics with an edgy claim

There is nothing edgy about it. You can't detect it, you can't measure it, and if the word had any applicability (to, say, humans), then you're also misapplying it. If it is your contention that suffering is something-other-than-subjective, then you're the one trying to be edgy. Not I.

The way sane, reasonable people describe subjective phenomena that we can't detect or measure is "not real". When we're talking about decapods, it can't even be self-reported.

> but I find it deeply sad when people convince themselves that they don't really exist as a thinking, feeling thing. It's self repression to the maximum,

Says the guy agreeing with a faction that seeks to convince people shrimp are anything other than food. That if for some reason we need to euthanize them, that they must be laid down on a velvet pillow to listen to symphonic music and watch films of the beautiful Swiss mountain countryside until their last gasp.

"Sad" is letting yourself be manipulated so that some other religion can enforce its noodle-brained dietary laws on you.

> If you don't have certain measurable proof either way

I'm not obligated to prove the negative.

replies(2): >>42179987 #>>42180220 #
42. sdwr ◴[] No.42177503{5}[source]
That's reversed. The number of people can be mapped linearly, but not the intensity of the pain.

(Intuitively, it's hard to say saving 100 people is 100x as good as saving 1, because we can't have 100 best friends, but it doesn't affect the math at all)

43. sdwr ◴[] No.42177744[source]
It's not about the numbers. The core argument is:

- they suffer

- we are good people who care about reducing suffering

- so we spend our resources to reduce their suffering

And some (most!) people balk at one of those steps

But seriously, pain is the abstraction already. It's damage to the body represented as a feeling.

44. BenthamsBulldog ◴[] No.42178954{3}[source]
This is to model uncertainty, not differences across species. The 50th-percentile guess is that shrimp feel pain 3.1% as intensely as humans, while the mean is 19%.
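As a back-of-the-envelope illustration (a toy lognormal with parameters back-solved from those two targets, not the report's actual model), a right-skewed credence distribution can have both of those summary statistics at once:

```python
import math

# Toy illustration: a lognormal credence distribution whose median is ~3.1%
# and whose mean is ~19%. The heavy right tail drags the mean far above
# the median, which is how both figures can describe the same uncertainty.
median_target = 0.031   # 50th-percentile guess
mean_target = 0.19      # mean of the distribution

mu = math.log(median_target)  # lognormal median = exp(mu)
# mean = exp(mu + sigma^2 / 2), so solve for sigma:
sigma = math.sqrt(2 * math.log(mean_target / median_target))

median = math.exp(mu)
mean = math.exp(mu + sigma**2 / 2)
print(round(median, 3), round(mean, 2))  # 0.031 0.19
```

The point is just that "median 3.1%, mean 19%" is not a contradiction: it is what skewed uncertainty looks like.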
45. BenthamsBulldog ◴[] No.42178958[source]
No, it's not about lifespan. They have a very complicated report that tries to quantify the intensity of suffering, relying on a variety of proxies.
46. BenthamsBulldog ◴[] No.42178970{6}[source]
But if neuron counts, as they show, have no reliable correlation (and in some cases an inverse correlation) with valenced experience, why would we use them to rule out intense experiences?
replies(1): >>42180518 #
47. BenthamsBulldog ◴[] No.42178980{5}[source]
Right, but Norcross gives an argument against that. You either have to think that slightly reducing pain while making it 1000 times as prevalent doesn't always make for a worse state of affairs, or deny that "worse than" is transitive.
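A toy sketch of the chain (my own numbers, not Norcross's exact construction): each step makes the pain slightly milder but a thousand times more widespread. If each step yields a worse state of affairs, and "worse than" is transitive, the final state (vast numbers of mild headaches) comes out worse than the first (one person's intense pain).

```python
# Build the chain of states: (pain intensity, number of sufferers).
intensity, people = 1.0, 1
chain = []
for step in range(6):
    chain.append((round(intensity, 3), people))
    intensity *= 0.9   # slightly milder each step...
    people *= 1000     # ...but far more prevalent
print(chain[0], chain[-1])
```

To resist the conclusion you have to say that at some step the trade stops being worse, or give up transitivity.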
replies(1): >>42187038 #
48. bulletsvshumans ◴[] No.42179635{5}[source]
Setting aside the shrimp, are you denying that any humans, including yourself, experience suffering?
replies(1): >>42184142 #
49. jhanschoo ◴[] No.42179987{5}[source]
> If it is your contention that suffering is something-other-than-subjective, then you're the one trying to be edgy.

You do feel pain and hunger, at least to the extent you experience touch. You can in fact be more certain of that than of anything conventionally thought to be objective, such as physical models of the world, for it is only through your perception that you receive those models, or the evidence used to build them.

The notion of suffering used in the paper is primarily with respect to pain and pleasure.

Now, you may deny that shrimp feel pain and pleasure. It's also possible to deny that other people feel pain and pleasure. But you do feel pain and pleasure, and you always engage in behaviors in response to these sensations; your senses also inform you secondarily that many other people abide by similar rules.

Many animals like us are fundamentally sympathetic to pain and pleasure. That is, observing behavior related to pain and pleasure impels a related feeling in ourselves, in certain contexts, though not necessarily an identical one. This mechanism is quite obvious when you observe parents caring for their young, herd behavior, etc. With this established: some people are in a context where they are sympathetic to the observed pain and pleasure of nonhuman animals, in this case shrimp rather than cats and dogs, and such a study helps one figure out this relationship in more detail.

50. jhanschoo ◴[] No.42180156{4}[source]
Hi, I'm a flexitarian who finds any diet more restrictive than my current one unsustainable given my habits, so I'm interested in information on the topic. Can you direct me to what you've discussed regarding vegan nutrition?
replies(1): >>42180388 #
51. holden_nelson ◴[] No.42180220{5}[source]
> You can’t detect it, you can’t measure it

Eh, perhaps we can’t detect it perfectly reliably, but we can absolutely detect it. Go to a funeral and observe a widow in anguish. Just because we haven’t (yet) built a machine to detect or measure it doesn’t mean it doesn’t exist.

replies(1): >>42184182 #
52. 0xDEAFBEAD ◴[] No.42180388{5}[source]
https://news.ycombinator.com/item?id=41820828

https://forum.effectivealtruism.org/posts/dbw2mgSGSAKB45fAk/...

Really I want to see vegans do a comprehensive investigation of every last nutrient that's disproportionately found in animal products, including random stuff like beta-alanine, creatine, choline, etc., and take a "better safe than sorry" approach of inventing a veggie burger that contains all that stuff in abundance, and is palatable.

I suspect you could make a lot of money by inventing such a burger. Vegans are currently fixated on improving taste, and they seem to have a bit of a blind spot around nutrition. I expect a veggie burger which mysteriously makes vegans feel good, gives them energy, and makes them feel like they should eat more of it will tend to sell well.

replies(4): >>42181101 #>>42182886 #>>42187548 #>>42192690 #
53. mistercow ◴[] No.42180467{5}[source]
> You've experienced this mystical thing, and so you know it's true?

Suffering is experience, and my own internal experiences are the things that I can be most certain of. So in this case, yes. I don’t know why you’re calling it “mystical” though.

> They'll convert you to their religion by hook or crook.

I have a lot more confidence in my ability to evaluate arguments than you seem to.

54. mistercow ◴[] No.42180518{7}[source]
This feels like a severe motte and bailey. The motte is “neuron count is not perfectly linearly correlated with sentience” and the bailey is “neuron count has no reliable correlation with sentience”.

The priors referenced in the “technical details” doc (and good lord, why does everything about this require me to dereference three or four layers of pointers to get to basic answers to methodological questions?) appear to be based entirely on proxies like:

> Responses such as moving away, escaping, and avoidance, that seem to account for noxious stimuli intensity and direction.

This is a proxy that applies to slime molds and Roombas, yet I notice that neither of those made the table. Why not?

I suspect that the answer is that at least when it comes to having zero neurons, the correlation suddenly becomes pretty obviously reliable after all.

55. jhanschoo ◴[] No.42181101{6}[source]
Thanks!
56. aziaziazi ◴[] No.42182886{6}[source]
That would be a game changer. I also crave burgers, and the taste/texture of most of them has improved drastically, but nutrients don't seem to be the first objective, apart from protein and sometimes calcium/B12.

I often eat fortified processed food (~1 serving/day), primarily to turn lazy meals (fries, pizza, sandwiches…) into lazy-healthy meals. When I stopped eating flesh, those became even more useful.

I recommend the bars (no preparation) and the "pots" (savory meals). The shakes are good and cheap. All seem nutritionally balanced and complete.

http://jimmyjoy.com/ Not affiliated in any way, just a happy customer.

57. dleary ◴[] No.42183853{4}[source]
> Children aren't as good at information processing, but they are even more capable of suffering.

That is too strong a statement to just toss out there like that. And I don’t even think it’s true.

I think children probably feel pain more intensely than adults. But there are many more dimensions to suffering than pain. And many of those dimensions are beyond the ken of children.

Children will not know the suffering that comes from realizing that you have ruined a relationship that you value, and it was entirely your fault.

They will not know the kind of suffering that’s behind Imposter Syndrome after getting into MIT.

Or the suffering that comes from realizing that your heroin addiction will never be as good as the first time you shot up. Or that same heroin addict knowing that they are betraying their family, stealing their mother’s jewelry to pawn, and doing it anyway.

Or the suffering of a once-great athlete coming to terms with the fact that they are washed up and that life is over now.

Or the suffering behind their favorite band splitting up.

Or the suffering behind winning Silver at the Olympics.

Or the agony of childbirth.

Perhaps most importantly, one of the greatest sorrows of all: losing your own child.

Et cetera

58. NoMoreNicksLeft ◴[] No.42184142{6}[source]
Humans self-report "suffering". Strangely, those who claim it the most enthusiastically don't seem to be experiencing pain from disease or injury.

I would hesitate to use that word myself, though my personal experiences have, at times, been somewhat similar to those who do use the word.

59. NoMoreNicksLeft ◴[] No.42184182{6}[source]
> Eh, perhaps we can’t detect it perfectly reliably, but we can absolutely detect it. Go to a funeral and observe a widow in anguish.

If your definition of suffering describes both the widow grieving a lost husband and a shrimp slowly going whatever the equivalent is of unconscious in an icewater bath... it doesn't much seem to be a useful word.

> Just because we haven’t (yet) built a machine to

Yes, because we haven't built the machine, we can't much tell if the widow is in "anguish" or is putting on a show for the public. Some widows are living their most joyous days, but they can't always show it.

60. n4r9 ◴[] No.42187038{6}[source]
What are the arguments for "worse than" being transitive? It's not immediately clear to me that it ought to be.
61. n4r9 ◴[] No.42187128{4}[source]
Thanks. It's very clever how he uses probabilistic variants to move between scenarios. I've read to the end and whilst I'm not convinced, it's definitely given me food for thought. I'm stuck on two bits so far:

* He slides between talking about personal decisions vs decisions about someone else. The argument for Headache is couched in terms of whether an average person would drive to the chemist, whilst the argument for shifting from Headache to Many Headaches is couched in terms of decisions made by an external party. This feels problematic to me, though there may be some workaround.

* He describes rejecting transitivity as overwhelmingly implausible. Is that obvious? Ethical considerations ultimately boil down to subjective evaluations, and there seems to be no obvious reason why those evaluations would be transitive.

62. 0xDEAFBEAD ◴[] No.42187548{6}[source]
More suggestions, if someone wants to create a product that I personally would be enthusiastic to try:

* No weird yeast or fungal ingredients -- living or dead. I get fatigue after consuming that stuff. My body doesn't respond well to it. I suppose the most ordinary sort of mushrooms are probably OK.

* No eggs, they're pretty suffering-intensive: https://sandhoefner.github.io/animal_suffering_calculator

* Minimally processed ingredients preferred.

* Infuse with iodized salt, healthy fats, and spices for flavor. Overall good taste is much more important than replicating meat flavor/mouthfeel. I assume if you iterate on the spices enough, you can come up with something that tastes incredible.

I'm imagining a heavily fortified, compressed bean patty of some kind. I suppose with beans you'd have to account for phytates interfering with nutrient absorption, though.

63. n4r9 ◴[] No.42192690{6}[source]
I'm reminded of a video from Olympic weightlifter Clarence Kennedy. He's vegan but has done a lot of research on fuelling himself with the nutrients he needs to maintain physical abilities well beyond most people's. This is a man with insane numbers in both weightlifting and powerlifting, and he doesn't supplement with additional protein or creatine powder. To be fair, he's probably also on steroids, but even so, you need peak nutrition to reach those numbers.

There's an article here which discusses and embeds the video: https://barbend.com/clarence-kennedy-vegan-diet/