AI 2027

(ai-2027.com)
949 points by Tenoke | 2 comments
ivraatiems ◴[] No.43577204[source]
Though I think it is probably mostly science-fiction, this is one of the more chillingly thorough descriptions of potential AGI takeoff scenarios that I've seen. I think part of the problem is that the world you get if you go with the "Slowdown"/somewhat more aligned world is still pretty rough for humans: What's the point of our existence if we have no way to meaningfully contribute to our own world?

I hope we're wrong about a lot of this, and AGI turns out to either be impossible, or much less useful than we think it will be. I hope we end up in a world where humans' value increases, instead of decreasing. At a minimum, if AGI is possible, I hope we can imbue it with ethics that allow it to make decisions that value other sentient life.

Do I think this will actually happen in two years, let alone five or ten or fifty? Not really. I think it is wildly optimistic to assume we can get there from here - where "here" is LLM technology, mostly. But five years ago, I thought the idea of LLMs themselves working as well as they do at speaking conversational English was essentially fiction - so really, anything is possible, or at least worth considering.

"May you live in interesting times" is a curse for a reason.

replies(8): >>43577330 #>>43577995 #>>43578252 #>>43578804 #>>43578889 #>>43580010 #>>43580150 #>>43583543 #
abraxas ◴[] No.43577330[source]
I think, LLM or no LLM, the emergence of intelligence appears to be closely related to the number of synapses in a network, whether biological or digital. If my hypothesis is roughly true, it means we are several orders of magnitude away from AGI, at least the kind of AGI that can be embodied in a fully functional robot with sensory apparatus that rivals the human body. Building circuits of that density is likely to take decades; most probably a transistor-based, silicon substrate can't be pushed that far.
replies(5): >>43577402 #>>43577908 #>>43578032 #>>43578329 #>>43579445 #
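For scale, here is a back-of-envelope version of the gap the parent describes, treating one model parameter as loosely analogous to one synapse (a contested analogy) and using commonly quoted round numbers rather than figures from the thread:

```python
import math

# Order-of-magnitude estimates only; both numbers are debatable.
HUMAN_BRAIN_SYNAPSES = 1e14   # commonly quoted as 1e14 to 1e15
FRONTIER_MODEL_PARAMS = 1e12  # rough scale of today's largest models

gap = math.log10(HUMAN_BRAIN_SYNAPSES / FRONTIER_MODEL_PARAMS)
print(f"gap: ~{gap:.0f} orders of magnitude")
```

With these inputs the gap is ~2 orders of magnitude; with 1e15 synapses it would be ~3, which is where "several orders of magnitude" lands depending on the estimate you pick.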
baq ◴[] No.43579445[source]
Exponential growth means the first order of magnitude comes slowly and the last one runs past you unexpectedly.
replies(1): >>43579687 #
Palmik ◴[] No.43579687[source]
Exponential growth generally means that the time between each order of magnitude is roughly the same.
replies(1): >>43581269 #
brookst ◴[] No.43581269[source]
At the risk of pedantry, is that true? Something that doubles annually sure seems like exponential growth to me, but the orders of magnitude are not at all the same rate. Orders of magnitude are a base-10 construct but IMO exponents don’t have to be 10.

EDIT: holy crap I just discovered a commonly known thing about exponents and log. Leaving comment here but it is wrong, or at least naive.
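The point Palmik makes, and the edit above concedes, falls straight out of the algebra: if a quantity doubles every T years, each factor of 10 takes the same log2(10)·T ≈ 3.32·T years, no matter which order of magnitude you are on. A quick illustrative sketch (not from the thread):

```python
import math

def time_per_decade(doubling_time: float) -> float:
    """Time for a 10x increase under growth x(t) = x0 * 2**(t / doubling_time).

    Solving 10 = 2**(dt / doubling_time) gives dt = doubling_time * log2(10),
    a constant: every order of magnitude takes the same wall-clock time.
    """
    return doubling_time * math.log2(10)

# Doubling annually: each 10x takes ~3.32 years, whether it is the
# first order of magnitude or the fifth.
print(round(time_per_decade(1.0), 2))  # 3.32
```

The base of the exponent only changes the constant, not the constancy, which is why orders of magnitude arrive at a steady rhythm under any exponential.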