
174 points by Philpax | 1 comment
EliRivers ◴[] No.43719892[source]
Would we even recognise it if it arrived? We'd recognise human-level intelligence, probably, but that's specialised. What would general intelligence even look like?
replies(8): >>43719970 #>>43719984 #>>43720087 #>>43720130 #>>43720153 #>>43720195 #>>43720300 #>>43725034 #
logicchains ◴[] No.43720087[source]
AGI isn't ASI; it's not supposed to be smarter than humans. The people who say AGI is far away are unscientific woo-mongers, because they never give a concrete, empirically measurable definition of AGI. The closest we have is Humanity's Last Exam, which LLMs are already well on the path to acing.
replies(2): >>43720162 #>>43720319 #
1. EliRivers ◴[] No.43720162[source]
I'd expect it to be generalised, where we (and everything else we've ever met) are specialised. Our intelligence is shaped by our biology and our environment; the limitations on our thinking are themselves concepts the best of us can barely glimpse. Some kind of intelligence that inherently transcends its substrate.

What that would look like, how it would think, what kinds of mental considerations it would have, I do not know. I do suspect that declaring something that thinks like us to have "general intelligence" would itself be a symptom of our limited thinking.