
AI 2027

(ai-2027.com)
949 points by Tenoke | 1 comment
zvitiate ◴[] No.43572810[source]
There's a lot to potentially unpack here, but idk, the idea that whether humanity enters hell (extermination) or heaven (brain uploading; a cure for aging) hinges on whether we listen to AI safety researchers for a few months makes me question whether it's really worth unpacking.
replies(2): >>43572822 #>>43576538 #
amelius ◴[] No.43572822[source]
If we don't do it, someone else will.
replies(4): >>43572955 #>>43573020 #>>43575752 #>>43575927 #
achierius ◴[] No.43575927[source]
That's obviously not true. Before OpenAI blew the field open, multiple labs -- e.g. Google -- were intentionally holding their research back from the public eye because they thought the world was not ready. Investors were not pouring billions into capabilities. China did not particularly care to focus on this one research area, among the many in which the US is still solidly ahead.

The only reason timelines are as short as they are is that people at OpenAI, and thereafter Anthropic, decided they "had no choice". They had a choice, and they took the one that shaved at the very least years off the time we would otherwise have had to handle all of this. I can barely begin to describe the magnitude of the crime they have committed -- so I suggest you consider that before propagating the same destructive lies that led us here in the first place.

replies(1): >>43576062 #
pixl97 ◴[] No.43576062{3}[source]
The simplicity of the statement "If we don't do it, someone else will", and the thinking behind it, means that eventually someone will do just that, unless prevented by some regulatory function.

Simply put, with the ever-increasing hardware speeds we were already churning out for other purposes, this day would have come sooner or later. We're talking about a difference of only a year or two, really.

replies(2): >>43578076 #>>43578418 #
HeatrayEnjoyer ◴[] No.43578076{4}[source]
Cloning? Bioweapons? Ever larger nuclear stockpiles? The world has collectively agreed not to do something more than once. AI would be easier to control than any of the above. GPUs can't be dug out of the ground.