The whole idea of morality is arbitrary and exists only because we say it does. If you had been an Aztec, sacrificing people to Huitzilopochtli would have counted as moral, and if you had lived two thousand years ago, nobody would have batted an eye at owning slaves.
In reality, nobody would actually choose to save the lives of 34 crustaceans over the life of a human, even if killing the prawns supposedly produces 102% of the suffering of killing the human.
It's the same with all the EA stuff, like prioritising X trillion potential humans who may exist in the future over actual people who exist now: you can get as granular as you want and fiddle with the probabilities to justify anything. Maybe it's good to grow brains in vats and feed them heroin, since that would increase total happiness! Maybe we should judge someone who has swatted enough flies as harshly as a murderer! Maybe our goals for the future should be built around the quadrillion sentient creatures that will one day evolve from today's coconut crabs!