
334 points mooreds | 2 comments | source
izzydata ◴[] No.44484180[source]
Not only do I not think it is right around the corner. I'm not even convinced it is even possible or at the very least I don't think it is possible using conventional computer hardware. I don't think being able to regurgitate information in an understandable form is even an adequate or useful measurement of intelligence. If we ever crack artificial intelligence it's highly possible that in its first form it is of very low intelligence by humans standards, but is truly capable of learning on its own without extra help.
replies(10): >>44484210 #>>44484226 #>>44484229 #>>44484355 #>>44484381 #>>44484384 #>>44484386 #>>44484439 #>>44484454 #>>44484478 #
baxtr ◴[] No.44484384[source]
I think the same.

What do you call people like us? AI doomers? AI boomers?!

replies(3): >>44484414 #>>44484467 #>>44484497 #
1. npteljes ◴[] No.44484467[source]
"AI skeptics", like here: https://www.techopedia.com/the-skeptics-who-believe-ai-is-a-...
replies(1): >>44485290 #
2. izzydata ◴[] No.44485290[source]
This article is about being skeptical that what people currently call AI (which is actually LLMs) is going to be a transformative technology.

I, and many others, are skeptical that LLMs are even AI.

LLMs / "AI" may very well be a transformative technology that changes the world forever. But that is a different matter.