
174 points Philpax | 2 comments
EliRivers No.43719892
Would we even recognise it if it arrived? We'd recognise human-level intelligence, probably, but that's specialised. What would general intelligence even look like?
shmatt No.43719970
We are sort of able to recognize Nobel-worthy breakthroughs.

One of the many definitions I have for AGI is being able to create, today, the proofs for the 2030, 2050, 2100, etc. Nobel Prizes.

A sillier one I like is that AGI would output a correct proof that P ≠ NP on day 1.

tough No.43719982
Isn't AGI just "general" intelligence, as in a regular-human, Turing-test kind of deal?

Aren't you thinking of ASI/superintelligence, which is capable of outdoing humans?

kadushka No.43720028
Yes, the general consensus is that AGI should be able to perform any task an average human can perform. Definitely nothing at Nobel Prize level.
EliRivers No.43720059
A bit poorly named; not really very general. AHI would be a better name.
kadushka No.43720136
Another general consensus is that humans possess general intelligence.
EliRivers No.43720211
Yes, we do seem to have a very high opinion of ourselves.