
289 points rntn | 13 comments
    cakealert ◴[] No.44612557[source]
EU regulations are sometimes able to bully the world into compliance (e.g. cookies).

    Usually minorities are able to impose "wins" on a majority when the price of compliance is lower than the price of defiance.

    This is not the case with AI. The stakes are enormous. AI is full steam ahead and no one is getting in the way short of nuclear war.

    replies(2): >>44612773 #>>44613635 #
    1. oaiey ◴[] No.44612773[source]
But AI also carries tremendous risks, from something as simple as automating warfare to something like an evil AGI.

In Germany we still have traumas from the automatic machine guns set up on the wall between East and West Germany. Ukraine is fighting a drone war in the trenches, with a psychological effect on soldiers comparable to WWI.

The stakes are enormous, and not only toward the good. There is enough science fiction written about it. Regulation and laws are necessary!

    replies(4): >>44613225 #>>44614062 #>>44614492 #>>44614965 #
    2. zettabomb ◴[] No.44613225[source]
I don't disagree that we need regulation, but I also think citing literal fiction isn't a good argument. We're also very, very far away from anything approaching AGI, so the idea of it becoming evil seems a bit far-fetched.
    replies(4): >>44613315 #>>44613507 #>>44614507 #>>44614851 #
    3. HighGoldstein ◴[] No.44613315[source]
Autonomous sentry turrets have already been a thing since the 2000s. If we assume that military technology is always at least 5-10 years ahead of civilian technology, it is likely that some, if not all, of the "defense" contractors have far more terrifying autonomous weapons.
    4. ben_w ◴[] No.44613507[source]
    I agree fiction is a bad argument.

On the other hand, firstly, everyone disagrees about what the phrase AGI means, with definitions varying from "we've had it for years already" to "the ability to do provably impossible things like solve the halting problem". Secondly, we have a very bad track record for knowing how long it will take to invent anything in the field of AI, with both positive and negative failures: constantly thinking that self-driving cars are just around the corner, versus people saying an AI that could play Go well was "decades" away mere months before it beat the world champion.

    5. chii ◴[] No.44614062[source]
Regulation does not stop weapons that utilize AI from being created. It only slows down honest states that try to abide by it and gives the dishonest ones a head start.

    Guess what happens to the race then?

    6. tim333 ◴[] No.44614492[source]
I think your machine gun example illustrates that people are quite capable of massacring each other without AI or even high tech; in past periods, sometimes over 30% of males died in warfare. While AI could get involved, it's kind of a separate thing.
    replies(1): >>44614626 #
    7. tim333 ◴[] No.44614507[source]
Did you catch the news about Grok wanting to kill the Jews last week? All you need for AI or AGI to be evil is a prompt telling it to be evil.
    8. FirmwareBurner ◴[] No.44614626[source]
Yeah, his automated gun phobia argument is dumb. Should we ban all future tech development because some people are scared of things that can be dangerous but useful? No.

Plus, ironically, Germany's Rheinmetall is a leader in automated anti-air guns, so the public's phobia of automated guns is pointless. At least in this case common sense won, but in many others, like nuclear energy, it lost.

It seems like Germans are easy to manipulate into going against their best interests, if you manage to trigger some phobias in them via propaganda. "Ohoohoh, look out, it's the nuclear bogeyman, now switch your economy to Russian gas instead, it's safer."

    replies(1): >>44615696 #
    9. ken47 ◴[] No.44614851[source]
    We don't need AGI in order for AI to destroy humanity.
    10. stainablesteel ◴[] No.44614965[source]
You can choose to live in fear; the rest of us are embracing growth.
    11. 1718627440 ◴[] No.44615696{3}[source]
Switching to Russian gas looks bad now, but it was rational back then. The idea was to give Russia leverage on Europe besides war, so that they wouldn't need war.
    replies(1): >>44616850 #
    12. FirmwareBurner ◴[] No.44616850{4}[source]
    >but was rational back then.

    Only if you're a corrupt German politician getting bribed by Russia to sell out long term national security for short term corporate profits.

It was also considered a stupid idea back then by NATO powers, who asked Germany: WTF are you doing, tying your economy to the nation we're preparing to go to war with?

    > The idea was to give russia leverage on europe besides war, so that they don't need war.

    The present day proves it was a stupid idea.

    "You were given the choice between war and dishonor. You chose dishonor, and you will have war." - Churchill

    replies(1): >>44616934 #
    13. 1718627440 ◴[] No.44616934{5}[source]
    It worked quite well between France and Germany 50 years earlier.

Yes, it was naive, given the philosophy of the leaders of the USSR/Russia, but I don't think it was all that problematic. We did need some years to adapt, but it doesn't meaningfully impact the ability to send weapons to Ukraine and impose sanctions (in the long term). Meanwhile we got cheap gas for some decades, and Russia got some other trade partners besides China. Would we be better off if we hadn't used the gas in the first place? Then Russia would have bound itself to China, North Korea, etc. even earlier. It also had less environmental impact than shipping gas from the US.