
AI 2027

(ai-2027.com)
949 points | Tenoke
stego-tech ◴[] No.43578594[source]
It’s good science fiction, I’ll give it that. I think getting lost in the weeds over technicalities ignores the crux of the narrative: even if this doesn’t lead to AGI, at the very least it’s likely the final “warning shot” we’ll get before it’s suddenly and irreversibly here.

The problems it raises - alignment, geopolitics, lack of societal safeguards - are all real, and happening now (just replace “AGI” with “corporations”, and voila, you have a story about the climate crisis and regulatory capture). We should be solving these problems before AGI or job-replacing AI becomes commonplace, lest we run the very real risk of societal collapse or species extinction.

The point of these stories is to incite alarm, because they’re trying to provoke proactive responses while time is on our side, instead of trusting self-interested individuals in times of great crisis.

replies(10): >>43578747 #>>43579251 #>>43579927 #>>43580364 #>>43580681 #>>43581002 #>>43581238 #>>43581588 #>>43581940 #>>43582040 #
YetAnotherNick ◴[] No.43580364[source]
> very real risk of societal collapse or species extinction

No, there is no risk of species extinction in the near future due to climate change, and repeating that line will only deepen the divide and make people stop listening to others, including real climate scientists.

replies(2): >>43580459 #>>43580658 #
Aeolun ◴[] No.43580459[source]
Don’t say the things people don’t want to hear and everything will be fine?

That sounds like the height of folly.

replies(1): >>43586674 #
YetAnotherNick ◴[] No.43586674[source]
Don't say false things, especially when the claim is political and there isn't any way to debate it.