
265 points ctoth | 1 comments
mellosouls:
The capabilities of AI post-GPT-3 have become extraordinary and, in many cases, clearly superhuman.

However (as the article admits) there is still no general agreement on what AGI is, or how (or even whether) we can get there from here.

What we do have is a growing and often naïve excitement that anticipates AGI as coming into view, accompanied, unfortunately, by hype-merchants desperate to be the first to "call it".

This article seems reasonable in some ways but unfortunately falls into the latter category with its title and sloganeering.

"AGI" in the title of any article should be seen as a cautionary flag. On HN - if anywhere - we need to be on the alert for this.

jjeaff:
I suspect AGI will be one of those things you can't describe exactly, but you'll know it when you see it.
ninetyninenine:
I suspect everyone will call it a stochastic parrot because it got this one thing wrong. And this will continue far into the future; even when it becomes sentient, we will completely miss it.
AstralStorm:
It's more than that but less than intelligence.

Its generalization capabilities are a bit on the low side, and its memory is relatively bad. But it is much more than just a parrot now: it can handle some basic logic, though it cannot correctly follow given patterns for novel problems.

I'd liken it to a bird: extremely good at specialized tasks but failing at many common ones unless repeatedly shown the solution. It's not at the level of a corvid or a parrot yet; it fails rather badly at detour tests.

It might be sentient already, though. Someone needs to run a test of whether it can distinguish itself from another instance of itself in its own work.

Jensson:
> It might be sentient already, though. Someone needs to run a test of whether it can distinguish itself from another instance of itself in its own work.

It doesn't have any memory; how could it tell itself apart from a clone of itself?
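The statelessness point can be made concrete with a minimal sketch. The `model` function below is a hypothetical stand-in, not any vendor's real API: a transformer LLM's output is a pure function of the tokens it is handed in one call, and anything it appears to "remember" across turns is history the caller resends.

```python
def model(messages):
    # Stand-in for a stateless LLM call: the reply depends only on the
    # `messages` passed in this call, never on any earlier call.
    return f"I have seen {len(messages)} message(s) this call."

print(model(["hello"]))        # first call: sees 1 message
print(model(["hi again"]))     # second call: still sees only 1 message

# Cross-turn "memory" must be reconstructed by the caller, by threading
# the prior turns back through the next call:
history = ["hello", model(["hello"]), "hi again"]
print(model(history))          # this call sees 3 messages
```

So to compare itself against "another instance of itself", the model would need the relevant work placed into its context window by the caller; there is no persistent internal state for a self-recognition test to probe.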

butlike:
It doesn't have any memory _you're aware of_. A semiconductor can hold state, so it has memory.