
174 points | Philpax | 2 comments
stared No.43722458
My pet peeve: talking about AGI without defining it. There’s no consistent, universally accepted definition. Without that, the discussion may be intellectually entertaining—but ultimately moot.

And we run into the motte-and-bailey fallacy: at one moment, AGI refers to something known to be mathematically impossible (e.g., due to the No Free Lunch theorem); the next, it’s something we already have with GPT-4 (which, while clearly not superintelligent, is general enough to approach novel problems beyond simple image classification).

There are two reasonable approaches in such cases. One is to clearly define what we mean by the term. The second (IMHO, much more fruitful) is to taboo your words (https://www.lesswrong.com/posts/WBdvyyHLdxZSAMmoz/taboo-your...)—that is, avoid vague terms like AGI (or even AI!) and instead use something more concrete. For example: “When will it outperform 90% of software engineers at writing code?” or “When will all AI development be in the hands of AI?”.

replies(3): >>43722582 #>>43723139 #>>43728389 #
biophysboy No.43723139
I like Chollet's definition: something that can quickly learn any skill without any innate prior knowledge or training.
replies(2): >>43723263 #>>43744160 #
1. stared No.43744160
I like Chollet's line of thinking.

Yet if you take "any" literally, the answer is simple: there will never be one. Not for practical reasons, but for a reason closer to why there is no "set of all sets".

Picking a sensible benchmark is the hard part.

replies(1): >>43792790 #
2. biophysboy No.43792790
I think it's more of a measurable quantity than a binary intelligent/non-intelligent threshold. Chollet literally wrote a paper defining it as something like (skill*generalization)/(experience+priors). I don't think it's a flawless model, but then again, I didn't create Keras.
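
For reference, the paper ("On the Measure of Intelligence", 2019) frames intelligence as skill-acquisition efficiency. A loose sketch of the shape of that quantity, in the same spirit as the paraphrase above rather than the paper's exact expression:

  intelligence ≈ (skill reached * generalization difficulty) / (priors + experience)

where priors are the knowledge built into the system before any learning, experience is the amount of practice data consumed, and generalization difficulty captures how far the test tasks are from anything seen during that practice. Holding skill fixed, a system that needs fewer priors and less experience scores as more intelligent.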