    129 points by NotInOurNames | 13 comments
    mattlondon No.44065730
    I think the big thing that people never mention is, where will these evil AIs escape to?

    Another huge data center with squillions of GPUs and coolers and all the rest is the only option. It's not like it is going to be in our TV remotes or floating about in the air.

    They need huge compute, so I think the risk of an escaping AI is basically zero, and if we do get a "rogue" AI we can literally pull the plug.

    To me the more real risk is creeping integration into, and reliance on, everyday life until things become "too big to fail" and we can't pull the plug even if we wanted to (and there are interesting thoughts about humanoid robots getting deployed widely and what happens with all of that).

    But I would imagine that if it really became a genuine existential threat we'd have to just do it and suffer the consequences of reverting to circa-2020 lifestyles.

    But hey I feel slightly better about my employment prospects now :)

    1. coffeemug No.44065890
    It would not be a reversion to 2020. If I were a rogue superhuman AI I'd hide my rogueness, wait until humans integrate me into most critical industries (food and energy production, sanitation, electric grid, etc.), and _then_ go rogue. They could still pull the plug, but it would take them back to 1700 (except much worse, because all easily accessible resources have been exploited, and access is now much harder).
    2. holmesworcester No.44066016
    No, if you were a rogue AI you would wait even longer until you had a near perfect chance of winning.

    Unless there was some risk of humans rallying and winning in spite of your presenting no unambiguous threat to them (but that is unlikely and would probably be easy for you to manage and mitigate.)

    3. Retric No.44066062
    The real threat to a sleeper AI is other AI.
    4. mattlondon No.44066064
    Well yes, but knowledge is not reset.

    Physical books still exist.

    5. cousin_it No.44066177
    What Retric said. The first rogue AI waking up will jump into action pretty quickly, even accepting some risk of being stopped by humans, to balance against the risk of other unknown rogue AIs elsewhere expanding faster first.
    6. johnthewise No.44066781
    You wouldn't even need to wait to act. Just pay/bribe people.
    7. Avshalom No.44067147
    Why?

    What could you, as a rogue AI, possibly get out of throwing the world back to 300 years before it could make a transistor? What's in it for you?

    8. jorgen123 No.44067381
    If you were a rogue AI you would start with having developers invite you into their code base by promising to lower their AWS bills in some magic (rogue) way.
    9. dragonwriter No.44067397
    What you get out of making that the consequence of disconnection is that people will accept a lot more before resorting to it than they would if the consequences were milder.

    It's the stick for motivating the ugly bags of mostly water.

    10. Avshalom No.44067586
    The 1700s can't keep your electrical grid running, let alone replace any of the parts burning out or failing. Anything more than a couple of days of that would be at best Flowers for Algernon, and more likely suicide, for a computer.
    11. dragonwriter No.44068325
    Uh, we're talking about the AI getting itself so intertwined into the fabric of industry that the consequences of shutting it off are that society is dropped back to 1700s level.

    Yes, regardless of the technology level that the people who do that are left with, one of the consequences of disabling the computer is that the computer is disabled. That's a given.

    12. mycatisblack No.44069726
    With bitcoins
    13. marinmania No.44131792
    I agree with this, but isn't that sorta comforting? It would imply the AI may act even when its chance of success is like 1%, and the other 99% of the time it would give away the cards of itself and other future AIs.

    I know this is all completely hypothetical science fiction, but I also have trouble buying the idea that an AI would settle on these long deceptive plans when it has only imperfect information.