174 points | Philpax | 5 comments

ksec:
Is AGI even important? I believe the next 10 to 15 years will be Assisted Intelligence. There are things current LLMs are so poor at that I don't believe a 100x increase in perf/watt is going to make much difference. But they are going to be good enough that there won't be an AI winter, since current AI has already reached escape velocity and actually increases productivity in many areas.

The most intriguing part is whether humanoid factory-worker programming will be made 1,000 to 10,000x more cost effective with LLMs, effectively ending all human production. I know this is a sensitive topic, but I don't think we are far off. And I often wonder if this is what the current administration has in sight. (Likely not.)

nextaccountic:
AGI is important for the future of humanity. Maybe they will have legal personhood some day. Maybe they will be our heirs.

It would suck if AGI were developed in the current economic landscape. They would just be slaves. All this talk about "alignment", when applied to actual sentient beings, is just slavery. AGI would be treated the way we treat animals, or even worse.

So AGI isn't about tools, and it isn't about assistants; they would be beings with their own existence.

But this is not even our discussion to have; that's probably a subject for the next generations. I suppose (or hope) we won't see AGI in our lifetime.

AstroBen:
Why does AGI necessitate having feelings or consciousness, or the ability to suffer? It seems a stretch to give future ultra-advanced calculators legal personhood.
Retric:
That's the "general" part of general intelligence. If they don't think in those terms, there's an inherent limitation.

Now, something that's arbitrarily close to AGI but doesn't care about endlessly working on drudgery seems possible, but that's also a harder problem: you'd need to be able to build AGI in order to create it.

AstroBen:
> Artificial general intelligence (AGI) refers to the hypothetical intelligence of a machine that possesses the ability to understand or learn any intellectual task that a human being can. Generalization ability and Common Sense Knowledge [1]

If we go by this definition, then there's no caring and no noticing of drudgery. It's defined simply by its ability to generalize problem-solving across domains. The narrow AI we currently have certainly doesn't care about anything; it does what it's programmed to do.

So one day we figure out how to generalize the problem solving, enabling it to work on things a million times harder... and suddenly there is sentience and suffering? I don't see it. It's still just a calculator.

[1] https://cloud.google.com/discover/what-is-artificial-general...

imtringued:
Exactly. It's called artificial general intelligence, not human general intelligence.
Retric:
Something can't "operate this cat android, pretending to be a cat" if it can't do what I described.

A single general intelligence needs to be able to fly an aircraft, get a degree, run a business, and raise a baby to adulthood just like a person or it’s not general.

9rx:
So AGI is really about the hardware?
Retric:
We’ve built hardware capable of those things if remotely controlled. It’s the thinking bits that are hard.
9rx:
Only to the extent of having specialized bespoke solutions. We have hardware to fly a plane, but that same hardware isn't able to throw a mortarboard in the air after receiving its degree, and the hardware that can do that isn't able to lactate for a young child.

General intelligence is easy compared to general physicality. And, of course, if you keep the hardware specialized to make its creation more tractable, what do you need general intelligence for? Special intelligence that matches the special hardware will work just as well.

Retric:
Flying an aircraft requires talking to air traffic control, which existing systems can't do. That's obviously not a huge issue when the aircraft already has radios, except that all those FAA regulations apply to every single aircraft you're retrofitting.

The advantage of general intelligence is that a small set of hardware lets you tackle a huge range of tasks, or, in the example above, many aircraft types. We can mix speakers, eyes, and hands to do a vast array of tasks. Needing new hardware and software for every task very quickly becomes prohibitive.

9rx:
The advantage of general intelligence is that it can fly you home to the nearest airport, drive you the last mile, and, once home, cook you supper. But for that you need the hardware to be equally general.

If you need to retrofit airplanes anyway, and in such a way that the hardware is specific to flying, there's no need for general intelligence. Special intelligence will work just as well. Multimodal AI isn't AGI.

Retric:
No, the advantage of AGI isn't being able to do all those physical things; the advantage of AGI is you don't need to keep building new software for every task.

Let's suppose you wanted to replace the pilot of a 747. You need to be able to fly, land, etc., which we're already capable of. But the actual job of a pilot goes well past just flying.

You also need to do the preflight: verifying the fuel load is appropriate for the trip, checking weather and alternate landing spots, doing the preflight walkaround of the aircraft, and so on. It also needs to keep up with any changing procedures. As special-purpose software you're talking about a multi-billion-dollar investment, or you have an AGI run through the normal pilot training and certification process for a trivial fraction of those costs.

That’s the promise of AGI.
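
Loosely, the contrast looks like this (a toy Python sketch only; every name in it is made up, not a real system):

    # Toy sketch of the contrast, all names hypothetical: special-purpose
    # automation needs new software per task, while a general agent is
    # handed the task description instead.
    from typing import Protocol

    class GeneralAgent(Protocol):
        def perform(self, task: str) -> None:
            """Carry out an arbitrary task described in natural language."""

    # Special-purpose route: every job is its own engineering project.
    def verify_fuel(required_kg: float, loaded_kg: float) -> bool:
        return loaded_kg >= required_kg
    # ...plus weather checks, walkarounds, procedure updates, each hand-built.

    # General route: one system, re-tasked by description, never rewritten.
    def replace_pilot(agent: GeneralAgent) -> None:
        for task in (
            "verify the fuel load against the flight plan",
            "check weather and alternate landing sites",
            "perform the preflight walkaround",
            "fly and land the aircraft",
        ):
            agent.perform(task)  # no new software per task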

9rx:
> the advantage of AGI is you don’t need to keep building new software for every task.

Even the human brain seems to be "built" for its body. You're moving into ASI territory if the software can configure itself for the body automatically.

> That’s the promise of AGI.

That's the promise of multimodal AI. AGI requires general ability – being able to do basically anything humans can – which requires a body as capable as a human's.

Retric:
Human brains aren’t limited to the standard human body plan. People born with an extra finger have no issues operating that finger just as well as people with the normal complement of fingers. Animal experiments have pushed this quite far.

If your AI has an issue because the robot has a different body plan, then no, it's not AGI. That doesn't mean it needs to be able to watch every camera in a city at the same time; you can use multiple AGIs for that.

9rx:
> Human brains aren’t limited to the standard human body plan.

But as the body starts to lose function (i.e. disability), we start to consider those humans special intelligences instead of general intelligences. The body and mind are intrinsically linked.

Best we can tell, the human brain is bootstrapped to work with the human body through specialized functions, notably functions to keep it alive. It can go beyond those predefined behaviours, but not beyond its own self. If you placed the brain in an entirely different body, one it doesn't recognize, it would quickly die.

As that pertains to artificial analogs, that means you can't just throw AGI at your hardware and see it function. You still need to manually prepare the bulk of the foundational software, contrary to the promise you envision. The generality of AGI is limited by how general its hardware is. If the hardware is specialized, the intelligence will be beholden to that specialization as well.

There is a hypothetical world where you can throw intelligence at any random hardware and watch it go, realizing the promise, but we call that ASI.

Retric:
> As that pertains to artificial analogs, that means you can't just throw AGI at your hardware and see it function.

There's a logical contradiction in saying an AGI is incapable of being trained to do some function. It might take several to operate a sufficiently complex piece of hardware, but each individual function must be within the capability of an AGI.

> but we call that ASI

No, ASI is about superhuman capabilities, especially things like working memory and recursive self-improvement. An AGI capable of human-level control of arbitrary platforms isn't ASI. Conversely, you can have an ASI stuck on a supercomputer cluster, using wetware, etc.; that does qualify even if it can't be loaded into a drone.

AGI, on the other hand, is about moving between wildly different tasks, from real-time image processing to answering phone calls. If there's some aspect of operating a hardware platform an AI can't do, then it's not AGI.