
324 points | rntn | 1 comment
cakealert:
EU regulations are sometimes able to bully the world into compliance (e.g. cookies).

Usually minorities are able to impose "wins" on a majority when the price of compliance is lower than the price of defiance.

This is not the case with AI. The stakes are enormous. AI is full steam ahead and no one is getting in the way short of nuclear war.

oaiey:
But AI also carries tremendous risks, from something as simple as automated warfare to something like an evil AGI.

In Germany we still have traumas from the automatic machine guns set up on the wall between East and West Germany. Ukraine is fighting a drone war in the trenches, with a psychological effect on soldiers comparable to WWI.

The stakes are enormous, and not only toward the good. There is enough science fiction written about it. Regulation and laws are necessary!

zettabomb:
I don't disagree that we need regulation, but I also think citing literal fiction isn't a good argument. We're also very, very far away from anything approaching AGI, so the idea of it becoming evil seems a bit far-fetched.
tim333:
Did you catch the news last week about Grok wanting to kill the Jews? All you need for AI or AGI to be evil is a prompt saying "be evil."