
324 points by rntn | 1 comment
cakealert ◴[] No.44612557[source]
EU regulations are sometimes able to bully the world into compliance (e.g. cookie consent).

Usually minorities are able to impose "wins" on a majority when the price of compliance is lower than the price of defiance.

This is not the case with AI. The stakes are enormous. AI is full steam ahead and no one is getting in the way short of nuclear war.

replies(2): >>44612773 #>>44613635 #
oaiey ◴[] No.44612773[source]
But AI also carries tremendous risks, from something as simple as automated warfare to something like an evil AGI.

In Germany we still carry trauma from the automatic machine guns set up on the border between East and West Germany. Ukraine is fighting a drone war in the trenches, with a psychological effect on soldiers comparable to WWI.

The stakes are enormous, and not only for the good. There is enough science fiction written about this. Regulation and laws are necessary!

replies(4): >>44613225 #>>44614062 #>>44614492 #>>44614965 #
zettabomb ◴[] No.44613225[source]
I don't disagree that we need regulation, but I also think citing literal fiction isn't a good argument. We're also very, very far away from anything approaching AGI, so the idea of it becoming evil seems a bit far-fetched.
replies(4): >>44613315 #>>44613507 #>>44614507 #>>44614851 #
HighGoldstein ◴[] No.44613315[source]
Autonomous sentry turrets have been a thing since the 2000s. If we assume that military technology is always at least 5-10 years ahead of civilian technology, it is likely that some if not all of the "defense" contractors have far more terrifying autonomous weapons.