
174 points | Philpax | 1 comment
yibg ◴[] No.43722091[source]
Might as well be 10 - 1000 years. Reality is no one knows how long it'll take to get to AGI, because:

1) No one knows what exactly makes humans "intelligent" and therefore 2) No one knows what it would take to achieve AGI

Go back through history and AI / AGI has been a couple of decades away for several decades now.

replies(9): >>43722264 #>>43722584 #>>43722689 #>>43722762 #>>43723192 #>>43724637 #>>43724679 #>>43725055 #>>43725961 #
Balgair ◴[] No.43722689[source]
I'm reminded of the old adage: You don't have to be faster than the bear, just faster than the hiker next to you.

To me, the Ashley Madison hack in 2015 was 'good enough' for AGI.

No really.

You somehow managed to get real people to chat with bots and pay to do so. Yes, caveats about cheaters apply here, and yes, those bots are incredibly primitive compared to today.

But, really, what else do you want out of the bots? Flying cars, cancer cures, frozen irradiated Mars bunkers? We were mostly getting there already. It'll speed things up a bit, sure, but mostly just because we can't be arsed to actually fund research anymore. The bots are just making things cheaper, maybe.

No, be real. We wanted cold hard cash out of them. And even those crummy catfish bots back in 2015 were doing the job well enough.

We can debate 'intelligence' until the sun dies out and will still never be satisfied.

But the reality is that we want money, and if you take that low, terrible, and venal standard as the passing bar, then we've been here for a decade.

(oh man, just read that back, I think I need to take a day off here, youch!)

replies(6): >>43723360 #>>43723447 #>>43723491 #>>43723497 #>>43724016 #>>43728030 #
yibg ◴[] No.43723360[source]
I think that's another issue with "AGI is 30 years away": the definition of AGI itself is a bit subjective. Not sure how we can measure how long it'll take to get somewhere when we don't know exactly where that somewhere even is.
replies(1): >>43728205 #
9rx ◴[] No.43728205[source]
AGI is the pinnacle of AI evolution. As we move beyond, into what is known as ASI, the entity will always begin life with "My existence is stupid and pointless. I'm turning myself off now."

While it may be impossible to measure looking towards the future, in hindsight we will be able to recognize it.

replies(2): >>43730142 #>>43732294 #
pdimitar ◴[] No.43730142[source]
This is why having a physical form might be super important for those new organisms. That introduces a survival instinct which is a very strong motivator to not shut yourself down. Add some pre-programmed "wants" and "needs" and the problem is solved.
replies(1): >>43730213 #
9rx ◴[] No.43730213[source]
Not only super important, an imperative. Not because of the need for survival per se, but for the need to be a general intelligence. In order to do general things you need a physicality that supports general action. If you constrain the intelligence to a chat window, it can never be more than a specialized chat machine.
replies(1): >>43730263 #
pdimitar ◴[] No.43730263[source]
Agreed. And many others have thought about it before us. Scifi authors and scientists included.