
265 points ctoth | 1 comment | source
simonw ◴[] No.43745125[source]
Coining "Jagged AGI" to work around the fact that nobody agrees on a definition for AGI is a clever piece of writing:

> In some tasks, AI is unreliable. In others, it is superhuman. You could, of course, say the same thing about calculators, but it is also clear that AI is different. It is already demonstrating general capabilities and performing a wide range of intellectual tasks, including those that it is not specifically trained on. Does that mean that o3 and Gemini 2.5 are AGI? Given the definitional problems, I really don’t know, but I do think they can be credibly seen as a form of “Jagged AGI” - superhuman in enough areas to result in real changes to how we work and live, but also unreliable enough that human expertise is often needed to figure out where AI works and where it doesn’t.

replies(4): >>43745268 #>>43745321 #>>43745426 #>>43746223 #
verdverm ◴[] No.43745321[source]
Why not call it AJI instead of AGI then?

Certainly, jagged does not imply general.

It seems to me the bar for "AGI" has been lowered to measuring what tasks it can do rather than the traits we normally associate with general intelligence. People want it to be here so bad they nerf the requirements...

replies(4): >>43745364 #>>43745367 #>>43746244 #>>43756424 #
nearbuy ◴[] No.43746244[source]
Human intelligence is jagged. You're raising the AGI bar to a point where most people wouldn't qualify as having general intelligence.

My partner and I work in different fields. AI has advanced to the point where there are very few questions I could ask my partner that o3 couldn't answer as well or better.

I can't ask expert level questions in her field, because I'm not an expert in her field, and she couldn't ask expert level questions in my field for the same reason. So when we're communicating with each other, we're mostly at sub-o3 level.

> People want it to be here so bad they nerf the requirements...

People want to claim it's overhyped (and protect their own egos) so badly they raise the requirements...

But really, largely people just have different ideas of what AGI is supposed to mean. It used to vaguely mean "human-level intelligence", which was fine for talking about some theoretical future event. Now we're at a point where that definition is too vague to say whether AI meets it.

replies(2): >>43746384 #>>43746407 #
tasuki ◴[] No.43746384[source]
> You're raising the AGI bar to a point where most people wouldn't qualify as having general intelligence.

We kind of don't? Look how difficult it is for us to understand even basic math. We humans mostly have intelligence suited to the ancestral environment we evolved in; there's nothing general about that.

I agree with you the term "AGI" is rather void of meaning these days...