
174 points Philpax | 2 comments
dcchambers No.43720006
And in 30 years it will be another 30 years away.

LLMs are so incredibly useful and powerful, but they will NEVER be AGI. I actually wonder if the success of (and subsequent obsession with) LLMs is putting true AGI further out of reach. All these AI companies see is the $$$. When the biggest "AI research labs" like OpenAI shifted to productizing their LLM offerings, I think the writing was on the wall that they don't actually care about finding AGI.

replies(3): >>43720042 #>>43720073 #>>43721975 #
thomasahle No.43720042
People will keep improving LLMs, and by the time they are AGI (less than 30 years), you will say, "Well, these are no longer LLMs."
replies(6): >>43720091 #>>43720108 #>>43720115 #>>43720202 #>>43720341 #>>43721154 #
__MatrixMan__ No.43720115
What the hell is general intelligence anyway? People seem to think it means human-like intelligence, but I can't imagine we have any good reason to believe that our kinds of intelligence constitute all possible kinds of intelligence, which, going by the words, must be what "general" intelligence means.

It seems like even if it's possible to achieve GI, artificial or otherwise, you'd never be able to know for sure that that's what you've done. It's not exactly "useful benchmark" material.

replies(3): >>43720234 #>>43720327 #>>43721422 #
1. lupusreal No.43721422
The way some people confidently assert that we will never create AGI, I am convinced the term essentially means "machine with a soul" to them. It reeks of religiosity.

I guess if we exclude those, then it just means the computer is really good at doing the kinds of things humans do by thinking. Or maybe it's when the computer is better at them than humans, and merely being as good as the average human isn't enough (implying that average humans don't have natural general intelligence? Seems weird).

replies(1): >>43723570 #
2. cmsj No.43723570
When we say "the kinds of things humans do by thinking", we should really consider that across the long arc of history. We've bootstrapped ourselves from figuring out that flint is sharp when it breaks to being able to do everything we do today. There was no external help, no pre-existing dataset trained into our brains; we just observed, experimented, learned, and communicated.

That's general intelligence - the ability to explore a system you know nothing about (in our case, physics, chemistry and biology) and then interrogate and exploit it for your own purposes.

LLMs are an incredible human invention, but they aren't anything like what we are. They are born as the most knowledgeable things ever, yet they die no smarter.