Then calculators are AGI. A program that does fizz-buzz is AGI. Way to make AGI a meaningless term. What LLMs do now is so far from AGI that I don't know how people make any connection between it and AGI.
This is what AGI means (or should mean): generalized understanding of the world. Basically, with AGI the context window would be something like the entire knowledge and understanding of the world that an (adult?) person has (e.g., physics intuition), coupled with the ability to actually reason and act on it, update it, reflect on it, etc.
A small slice of this (e.g., less knowledge than a typical adult) would still be AGI, but current AIs:
- Cannot continually learn and incorporate that learning into their model.
- Cannot reason on any deep level. And before anyone claims that the pattern matching they do is all we do: no, this is not the case. Even strong pattern-matching AI systems like chess engines have weak spots that betray the fact that they do not actually reason the way humans do.
- Cannot engage in unprompted reflection in the background.
Current AIs are like a hologram: we are mistaking 1- or 2-dimensional responses to queries for the deep, higher-dimensional understanding humans have. The incredible thing about human consciousness is the deep (infinite?) interiority of it. I can reason about reasoning. I can reason about reasoning about my reasoning, and so on. I can reflect on my consciousness. I can reflect on reflecting on my consciousness, etc.
Machines are nowhere close to this, and likely never will be.