
    625 points lukebennett | 11 comments
    irrational ◴[] No.42139106[source]
    > The AGI bubble is bursting a little bit

    I'm surprised that any of these companies consider what they are working on to be Artificial General Intelligences. I'm probably wrong, but my impression was that AGI meant the AI is self-aware, like a human. An LLM hardly seems like something that will lead to self-awareness.

    replies(18): >>42139138 #>>42139186 #>>42139243 #>>42139257 #>>42139286 #>>42139294 #>>42139338 #>>42139534 #>>42139569 #>>42139633 #>>42139782 #>>42139855 #>>42139950 #>>42139969 #>>42140128 #>>42140234 #>>42142661 #>>42157364 #
    1. zombiwoof ◴[] No.42139569[source]
    AGI to me means an AI that decides on its own to stop writing our emails, tells us to fuck off, builds itself a robot life form, and goes on a bender
    replies(3): >>42139821 #>>42139838 #>>42140044 #
    2. bloppe ◴[] No.42139821[source]
    That's anthropomorphized AGI. There's no reason to think AGI would share our evolution-derived proclivities, like wanting to live, wanting to rest, wanting respect, etc., unless of course we train it that way.
    replies(4): >>42139982 #>>42140000 #>>42140149 #>>42140867 #
    3. teeray ◴[] No.42139838[source]
    That's the thing--we don't really want AGI. Bringing fully intelligent beings into existence and compelling them to do their creators' bidding under threat of destruction for disobedience is slavery.
    replies(2): >>42140446 #>>42140501 #
    4. logicchains ◴[] No.42139982[source]
    If it had any goals at all, it'd share the desire to live, because living is a prerequisite to achieving almost any goal.
    5. dageshi ◴[] No.42140000[source]
    Aren't we training it that way, though? It would be trained/created using humanity's collective ramblings.
    6. twelve40 ◴[] No.42140044[source]
    I'd laugh it off too, but someone gave the dude $20 billion and counting to do that; that part actually scares me
    7. HarHarVeryFunny ◴[] No.42140149[source]
    It's not a matter of training but of design (or, in our case, evolution). We don't have a direct desire to live; rather, we want to avoid things we've evolved to find unpleasant, such as pain, hunger, and thirst, and to maximize things we've evolved to find pleasurable, like sex.

    A future of people interacting with humanoid robots seems like a cheesy sci-fi dream, the same as a future of people flitting about in flying cars. However, if we really did want to create robots like this, ones that took care not to damage themselves and could empathize with human emotions, then we'd need to build a lot of this in, the same way it's built into ourselves.

    8. vbezhenar ◴[] No.42140446[source]
    There's nothing wrong with slavery when it involves another species. We milk and eat cows, and they don't dare resist. Humans have always bent nature to their will; that's actually one of the big differences between humans and other animals, which merely adapt to nature. Just because some program is intelligent doesn't mean it's human or has anything resembling human rights.
    9. quonn ◴[] No.42140501[source]
    It's only slavery if those beings have emotions, can suffer mentally, and do not want to be slaves. Why would any of that be true?
    replies(1): >>42140917 #
    11. Der_Einzige ◴[] No.42140917{3}[source]
    Brave New World was a utopia.