
322 points | atomroflbomber | 1 comment
lelag No.36983601
If 2023 ends up giving us AGI, room-temperature superconductors, Starships and a cure for cancer, I think we will be able to call it a good year...
azinman2 No.36986942
We’re not getting AGI anytime soon…
AbrahamParangi No.36987177
What exactly is your definition of an AGI? Because we’re already passing the Turing test, so I have to wonder if this isn’t just moving the goalposts.
emmanueloga_ No.36988222
Self-consciousness. Human-level reasoning. Feelings, etc.

We are NOT close to AGI.

* Fancy Markov chain (LLM) is not AGI.

* Stable diffusion style of image generation is NOT AGI.

* Fancy computer vision is NOT AGI.

Honestly, I don't think we are any closer to AGI. What we are seeing is the peak of "fancy tricks" for computer generated artifacts.
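(To make the "fancy Markov chain" jab concrete: a literal Markov chain text generator is just a lookup table from recent words to observed next words, sampled at random. A minimal sketch, assuming a word-level chain of order 1; the function names `build_chain` and `generate` are illustrative, not from any library:)

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    # Map each tuple of `order` consecutive words to the words seen after it.
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=10, seed=0):
    # Walk the chain: repeatedly sample a next word given the last `order` words.
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    while len(out) < length:
        candidates = chain.get(tuple(out[-len(key):]))
        if not candidates:
            break
        out.append(rng.choice(candidates))
    return " ".join(out)
```

An LLM differs in that its "table" is a learned neural function conditioning on thousands of prior tokens rather than a literal lookup on the last few words, which is exactly what the "fancy" qualifier is glossing over.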

khazhoux No.36989557
I would say that self-consciousness and feelings are not requirements for AGI. But reasoning certainly is.