jmyeet ◴[] No.43712152[source]
This is your daily reminder that if we do indeed discover life so close (~120 LY) to Earth, it's an incredibly bad sign for us. This is an exercise in Bayesian reasoning.

Imagine there are 2 planets in the Milky Way where life has developed. The odds are incredibly low that they're right next to each other, assuming a random distribution, so it's way more likely that there are more than 2. Imagine a sphere of radius 60 LY (120/2). Our Earth is the center of one; this planet is the center of another. Each sphere has a volume of ~10^6 LY^3. The Milky Way's volume (from Google) is ~17T LY^3, so there'd be roughly 17M such spheres in our galaxy.
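
As a rough back-of-the-envelope check in Python (the variable names are just for illustration, and the round figures are the ones above):

  from math import pi

  sphere_volume = (4 / 3) * pi * 60**3      # ~9.0e5 LY^3, i.e. roughly 1e6
  milky_way_volume = 17e12                  # ~17 trillion LY^3
  print(milky_way_volume / sphere_volume)   # ~1.9e7, on the order of the ~17M figure above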

Now imagine the odds of simple life becoming intelligent life that we could detect and could become spacefaring are 1 in 1 million. There'd be ~17 such civilizations in the Milky Way.
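
Continuing the same sketch, the expected number of such civilizations is just the sphere count times that assumed probability:

  n_spheres = 17e6                    # rough sphere count from above
  p_spacefaring = 1e-6                # assumed odds per life-bearing sphere
  print(n_spheres * p_spacefaring)    # ~17 expected civilizations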

We have absolutely no evidence of this. So either simple life is a lot less common, intelligent life is a lot less likely, or, and this is the scary part, something tends to wipe out sentient civilizations, and that's likely in our future.

In Fermi Paradox terms, we call this a Great Filter.

replies(7): >>43712551 #>>43712584 #>>43712593 #>>43712957 #>>43713028 #>>43713560 #>>43714268 #
AIPedant ◴[] No.43713028[source]

  Now imagine the odds of simple life becoming intelligent life that we could detect and could become spacefaring are 1 in 1 million.
Why not assume the odds are 1 in 20 million, so that there's only one such planet in the whole galaxy and there's no Great Filter to worry about? It seems just as valid, except "20 million" is slightly less aesthetically pleasing.
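
In the same back-of-the-envelope terms, that tweak drives the expected count down to about one:

  n_spheres = 17e6                    # sphere count from the parent comment
  p_spacefaring = 1 / 20e6            # tweaked odds
  print(n_spheres * p_spacefaring)    # ~0.85, i.e. roughly one planet in the whole galaxy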

We already know that human-type technological civilization is extremely unlikely: the planet was full of somewhat intelligent bipedal dinosaurs for about 100 million years, yet none of them seem to have engaged in mining, large-scale construction, nuclear power, or caused severely disruptive extinctions... (humans will leave a thick geological layer of concrete and pollution, along with plentiful unique minerals coming from slag, plastic, etc.)

I really have no patience for this sort of p(doom) nonsense. If you make up numbers you can come to any quantitative conclusion you want. Mildly tweaking totally unfalsifiable odds makes the Fermi "paradox" go away entirely.

replies(1): >>43713245 #
vimax ◴[] No.43713245[source]
Anything that acts to lower the odds by 20x, so that we're the only likely intelligence in the galaxy, is the definition of a "great filter".
replies(1): >>43718947 #
1. aradox66 ◴[] No.43718947[source]
That's not right; the Great Filter comes after the development of intelligence.
replies(1): >>43720474 #
2. awb ◴[] No.43720474[source]
A Great Filter could come before or after intelligence.

For example, predation could be a Great Filter. We made it past the threat of predation, but perhaps most life forms in the universe don’t. In this example, life could be pervasive in the universe but it’s optimizing for defensive attributes like speed and armor rather than intelligence. That could explain why we don’t see aliens colonizing the galaxy. With the Great Filter behind us, nothing would be stopping us from colonizing the galaxy ourselves, we just might be among the first to do it.

Putting the Great Filter in front of us postulates that many life forms have achieved our same accomplishments but haven’t colonized the galaxy. It could be because of doom and gloom scenarios like war, aliens, cosmic catastrophes, etc. Or it could be a beneficial Great Filter like enlightenment and the lack of desire to propagate and consume endlessly, or the ability to survive in deep space, leaving stars and planets untouched.

replies(1): >>43740539 #
3. kulahan ◴[] No.43740539[source]
That last bit is the part I always think about. You could easily hide an entire civilization in space, and there’s a lot more “space” than “planet”.