
265 points ctoth | 1 comment
logicchains No.43745171
I'd argue that it's not productive to use any definition of AGI coined after 2020, to avoid the fallacy of shifting the goalposts.
replies(2): >>43745346, >>43746649
1. TheAceOfHearts No.43746649
I really dislike this framing. Historically we've been very confused about what AGI means because we don't actually understand it. We're still confused, so most working definitions have been iterated on as models acquire new capabilities. It's akin to searching for something in the fog of war: you set a course toward a destination because you think that's the approximate direction where the thing will be found, but when you get there you realize you were wrong, so you keep exploring.

Most people have a rough idea of what AGI means, but we still haven't figured out an exact definition that lines up with reality. As we continue exploring the idea space, we'll keep figuring out which capabilities and criteria actually set the boundaries of what AGI means.

There's no reason to just accept an ancient definition from someone who was confused and didn't know any better when they coined it. Older definitions were shots in the dark that pointed in a general direction, but there's no guarantee they would hit the exact destination.