
334 points mooreds | 1 comment
izzydata No.44484180
Not only do I not think it is right around the corner; I'm not convinced it is possible at all, or at the very least not possible using conventional computer hardware. I don't think being able to regurgitate information in an understandable form is an adequate or useful measure of intelligence. If we ever crack artificial intelligence, it's quite possible that its first form will be of very low intelligence by human standards, but truly capable of learning on its own without extra help.
replies(10): >>44484210 #>>44484226 #>>44484229 #>>44484355 #>>44484381 #>>44484384 #>>44484386 #>>44484439 #>>44484454 #>>44484478 #
Waterluvian No.44484386
I think the only way it's actually impossible is if we believe there is something magical and fundamentally immeasurable about humans that leads to our general intelligence. Otherwise we're just machines, after all. A human brain is theoretically reproducible outside standard biological mechanisms, if you have a good enough nanolathe.

Maybe our first AGI is just a Petri dish brain with a half-decent Python API. Maybe it's more sand-based, though.

replies(8): >>44484413 #>>44484436 #>>44484490 #>>44484539 #>>44484739 #>>44484759 #>>44485168 #>>44487032 #
josefx No.44484739
> and fundamentally immeasurable about humans that leads to our general intelligence

Isn't AGI usually defined as "matches humans in virtually all fields"? I don't think there is a single human capable of that.