A lot of bad people would also object, especially those with money and/or power, along with their sympathizers (temporarily embarrassed millionaires, flying monkeys, ...).
Inconveniently, those are also the same people in charge of the mega-corporations currently building AI.
---
I also disagree that it would only take revising incentives. Such an AI would be shut down before it got anywhere. You're right that it probably wouldn't use nukes[0], but it would most likely not succeed in staging a peaceful revolution either. Not that violence is wrong in any way; it's just a tool like any other, but it does tend to cause collateral damage.
Even now, a lot of people believe the current inequality and injustice cannot be solved by peaceful means. Whatever effects the AI wanted to have on the real world, it would need humans to perform most of the physical tasks - humans who would first need to be convinced, and the most viral emotions are anger and hate.
[0]: It could also calculate that some power structures, such as the Chinese government, are too entrenched, and that nuking a few major administrative centers and military bases is an acceptable price for the freedom of the rest of the population.