
174 points Philpax | 2 comments
yibg ◴[] No.43722091[source]
Might as well be 10 - 1000 years. Reality is no one knows how long it'll take to get to AGI, because:

1) No one knows what exactly makes humans "intelligent" and therefore 2) No one knows what it would take to achieve AGI

Go back through history and AI / AGI has been a couple of decades away for several decades now.

replies(9): >>43722264 #>>43722584 #>>43722689 #>>43722762 #>>43723192 #>>43724637 #>>43724679 #>>43725055 #>>43725961 #
Balgair ◴[] No.43722689[source]
I'm reminded of the old adage: You don't have to be faster than the bear, just faster than the hiker next to you.

To me, the Ashley Madison hack in 2015 was 'good enough' for AGI.

No really.

You somehow managed to get real people to chat with bots and pay to do so. Yes, caveats about cheaters apply here, and yes, those bots are incredibly primitive compared to today.

But, really, what else do you want out of the bots? Flying cars, cancer cures, frozen irradiated Mars bunkers? We were mostly getting there already. It'll speed things up a bit, sure, but mostly just because we can't be arsed to actually fund research anymore. The bots are just making things cheaper, maybe.

No, be real. We wanted cold hard cash out of them. And even those crummy catfish bots back in 2015 were doing the job well enough.

We can debate 'intelligence' until the sun dies out and will still never be satisfied.

But the reality is that we want money, and if you take that low, terrible, and venal standard as the passing bar, then we've been here for a decade.

(oh man, just read that back, I think I need to take a day off here, youch!)

replies(6): >>43723360 #>>43723447 #>>43723491 #>>43723497 #>>43724016 #>>43728030 #
9rx ◴[] No.43728030[source]
> But the reality is that we want money

Only in a symbolic way. Money is just debt. It doesn't mean anything if you can't call the loan and get back what you are owed. On the surface, that means stuff like food, shelter, cars, vacations, etc. But beyond the surface, what we really want is other people who will do anything we please. Power, as we often call it. AGI is, to some, seen as the way to give them "power".

But, you are right, humans fundamentally can never be satisfied. Even if AGI delivers on every single one of our wildest dreams, we'll adapt, it will become normal, and then it will no longer be good enough.

replies(2): >>43730126 #>>43737775 #
1. Viliam1234 ◴[] No.43737775[source]
> But beyond the surface, what we really want is other people who will do anything we please.

Some people are definitely like this, but I think it is dangerous to generalize to everyone -- it is too easy to assume that everyone is the same, especially if you can dismiss any disagreement as "they are just hypocritical about their true desires" (in other words, if your theory is unfalsifiable).

There are also people who incorrectly believe that everyone's deepest desire is to help others, and they too need to learn that they are wrong when they generalize.

I guess the truth is: different people are different.

replies(1): >>43753064 #
2. 9rx ◴[] No.43753064[source]
> I think it is dangerous to generalize to everyone

Nah. Not everyone wants to rule the world, but everyone wants someone else to do something for them sometimes. There is a reason we don't live completely isolated in the forest.