1) No one knows what exactly makes humans "intelligent" and therefore 2) No one knows what it would take to achieve AGI
Go back through history and AI / AGI has been a couple of decades away for several decades now.
Aside from that, the measure really, to me, has to be power efficiency. If you're boiling oceans to make all this work then you've not achieved anything worth having.
From my calculations, the human brain runs on about 400 calories (kcal) a day. That's an absurdly small amount of energy. It hints at the direction these technologies must move in to be truly competitive with humans.
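Back-of-the-envelope (a rough sketch, taking the ~400 kcal/day figure above at face value), that works out to roughly 20 watts of continuous power:

    # Convert ~400 kcal/day of food energy into an average power draw (watts).
    KCAL_TO_JOULES = 4184            # 1 kcal = 4184 J
    SECONDS_PER_DAY = 24 * 60 * 60   # 86,400 s

    joules_per_day = 400 * KCAL_TO_JOULES          # ~1.67 MJ per day
    watts = joules_per_day / SECONDS_PER_DAY       # ~19.4 W
    print(f"brain power budget: ~{watts:.1f} W")   # -> ~19.4 W

For comparison, a single modern datacenter GPU can draw hundreds of watts on its own.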
To me, the Ashley Madison hack in 2015 was 'good enough' for AGI.
No really.
You somehow managed to get real people to chat with bots and pay to do so. Yes, caveats about cheaters apply here, and yes, those bots are incredibly primitive compared to today.
But, really, what else do you want out of the bots? Flying cars, cancer cures, frozen irradiated Mars bunkers? We were mostly getting there already. It'll speed things up a bit, sure, but mostly just because we can't be arsed to actually fund research anymore. The bots are just making things cheaper, maybe.
No, be real. We wanted cold hard cash out of them. And even those crummy catfish bots back in 2015 were doing the job well enough.
We can debate 'intelligence' until the sun dies out and will still never be satisfied.
But the reality is that we want money, and if you take that low, terrible, and venal standard as the passing bar, then we've been here for a decade.
(oh man, just read that back, I think I need to take a day off here, youch!)
We don't need very powerful AI to do very powerful things.
On the other hand, there is a clear incentive for people introducing some different way of doing something to overstate its progress and potential importance. It creates FOMO, so it is simply good marketing which interests potential customers, fans, employees, investors, pundits, and even critics (which is more buzz). And growth companies are immense debt vehicles, so creating a sense of FOMO for an increasing pyramid of investors is also valuable for each successive earlier layer. Wish in one hand..
So, about a tenth or less of a single server packed to the top with GPUs.
I think this just displays an exceptionally low estimation of human beings. People tend to resist extremities. Violently.
> experience socially momentous change
The technology is owned and costs money to use. It has extremely limited availability to most of the world. It will be as "socially momentous" as every other first world exclusive invention has been over the past several decades. 3D movies were, for a time, "socially momentous."
> on the verge of self driving cars spreading to more cities.
Lidar can't read traffic lights, and vision systems have all sorts of problems. You might be able to code an agent that can drive a car, but you've got some other problems that stand in the way of this. AGI is like 1/8th of the battle. I referenced just the brain above; your eyes and ears are actually insanely powerful instruments in their own right. "Real world agency" is more complicated than people like to admit.
> We don't need very powerful AI to do very powerful things.
You've lost sight of the forest for the trees.
And more practically -- these cars are running in half a dozen cities already. Yes, there's room to go, but pretending there are 'fundamental gaps' to them achieving wider deployment is burying your head in the sand.
It's all very easy to see how that can happen in principle. But it turns out actually doing it is a lot harder, and we hit some real hard physical limits. So here we are, still stuck on good ol' Earth. Maybe that will change at some point once someone invents an Epstein drive or warp drive or whatever, but you can't really predict when inventions happen, if ever, so ... who knows.
Similarly, it's not my impression that AGI is simply a matter of "the current tech, but a bit better". But who knows what will happen or what new thing someone may or may not invent.
With AGI, as far as I know, no one has a good conceptual model of what a functional AGI even looks like. LLMs are all the rage now, but we don't even know if they're a stepping stone to AGI.
He's_Outta_Line_But_He's_Right.gif
Seriously, AGI to the HN crowd is not the same as AGI to the average human. To my parents, these bots must look like fucking magic. They can converse with them, "learn" new things, talk to a computer like they'd talk to a person and get a response back. Then again, these are also people who rely on me for basic technology troubleshooting stuff, so I know that most of this stuff is magic to their eyes.
That's the problem, as you point out. We're debating a nebulous concept ("intelligence") that's been co-opted by marketers to pump and dump the latest fad tech, that's yet to really display significant ROI to anyone except the hypesters and boosters, and that isn't rooted in medical, psychological, or societal understanding of the term anymore. A plurality of people are ascribing "intelligence" to spicy autocorrect, worshiping stochastic parrots vomiting Markov chains but now with larger context windows and GPUs to crunch larger matrices, powered by fossil fuels and cooled by dwindling freshwater supplies, and trained on the sum total output of humanity without compensation to anyone who actually made the shit in the first place.
So yeah. You're dead-on. It's just about bilking folks out of more money they already don't have.
And Ashley Madison could already do that for pennies on the dollar compared to LLMs. They just couldn't "write code" well enough to "replace" software devs.
They are pretty good at muscle memory style intelligence though.
Only in a symbolic way. Money is just debt. It doesn't mean anything if you can't call the loan and get back what you are owed. On the surface, that means stuff like food, shelter, cars, vacations, etc. But beyond the surface, what we really want is other people who will do anything we please. Power, as we often call it. AGI is, to some, seen as the way to give them "power".
But, you are right, the human fundamentally can never be satisfied. Even if AGI delivers on every single one of our wildest dreams, we'll adapt, it will become normal, and then it will no longer be good enough.
While it may be impossible to measure looking towards the future, in hindsight we will be able to recognize it.
So does a drone show to an uncontacted tribe. So does a card trick to a chimpanzee (there are videos of them freaking out when a card disappears).
That's not an argument for or against anything.
I propose this:
"AGI is a self-optimizing artificial organism that can solve 99% of all the humanity's problems."
See, it's not a bad definition IMO. Find me one NS-5 from the "I, Robot" movie that also has access to all of science, the whole internet, and all of history, and can network with the others and fix our cities, nature, manufacturing, social issues, and a few other things, all in just a decade or two. Then we'll have AGI.
Comparing to what was there 10 years ago and patting ourselves on the back about how far we have gotten is being complacent.
Let's never be complacent.
Yes, and? A good litmus test for which humans are, shall we say, not welcome in this new society.
There are plenty of us out there who have fixed our upper limits of wealth and don't want more, and we have proven it throughout our lives.
For example: people get 5x more pay but it comes with 20x more responsibility, they burn out, go back to a job that's good enough, isn't stressful, and pays for everything they need from life, settle there, and never change it.
Let's not judge humanity at large by a handful of psychopaths that would overdose and die at 22 years old if given the chance. Please.
And no, before you say it: no, I'll never get to the point where "it's never enough" and no, I am not deluding myself. Nope.
FYI, the reactions in those videos are most likely not to a cool magic trick, but rather a response to a perceived threat. Could be the person filming/performing smiling (showing teeth), or someone behind the camera purposely startling the chimp at the "right" moment.
Imagine an AI that is millions of times smarter than humans in physics, math, chemistry, and biology, that can invent new materials and ways to produce energy, and that makes super decisions. It would be amazing and it would transform life on Earth. This is ASI, even if on some obscure test (the strawberry test) it can't reach human level and therefore can't be called proper AGI.
Airplanes are way beyond birds (by tens to thousands of times) in speed, distance, and carrying capacity. They are superior to birds despite not being able to fully replicate birds' bone structure, feathers, biology, and ability to poop.
Some people are definitely like this, but I think it is dangerous to generalize to everyone -- it is too easy to assume that everyone is the same, especially if you can dismiss any disagreement as "they are just hypocritical about their true desires" (in other words, if your theory is unfalsifiable).
There are also people who incorrectly believe that everyone's deepest desire is to help others, and they too need to learn that they are wrong when they generalize.
I guess the truth is: different people are different.
> But, you are right, the human fundamentally can never be satisfied
That's usually associated with "they want more and more". If you feel that's wrong then just correct me and move the argument forward. Telegraphic replies are not an interesting discussion format.
I was commenting on what I'm observing in most people I've met. But yeah, I'll agree I'm venturing into the clouds now and the discussion will become strictly theoretical and thus fruitless. Fair enough.
Thanks for indulging. :) Was interesting to hear takes so very different than mine.
Plus it's a massive prediction machine trained on a corpus of the bulk of human knowledge.
Feels weird to see it minimized in that way.