The problem with building AI weapons is that eventually they will end up in the hands of people who are morally bankrupt and who will use them to do evil.
In my garage, I have some pretty nasty "weapons" - notably a couple of chainsaws, some drills, chisels, lump/sledge/etc hammers and a fencing maul! The rest are merely mildly malevolent.
You don't need an AI (whatever that means) to get medieval on someone. On the bright side the current state of AI (whatever that means) is largely bollocks.
Sadly, LLMs have been and will continue to be wired up to drones, and the results will be unpredictable.
A car is a tool. It can be used as a weapon.
Even water and air can be used as a weapon if you try hard enough. There is probably nothing on this planet that couldn't be used as a weapon.
That said, I do not think AI weapons are a reasonable thing to build for any war, for any country, for any reason - even if the enemy has them.
How would we go about doing that?
Every nefarious way of keeping the truth at bay in authoritarian regimes is always on the table. From cracking iPhones to track journalists covering these regimes, to snooping on email, to using AI for the same ends: it's all the same thing, just with updated and improved tools.
Just like Kevin Mitnick selling zero-day exploits to the highest bidder, I have a hard time seeing how these get developed and somehow stay out of reach of the regimes you speak of.
So you're in favor of losing a war and becoming a subject of the enemy? While it's certainly tempting to think that unilateral disarmament can work, I can hardly see how.
When China attacks with AI weapons do you expect the free world to fight back armed with moral superiority? No. We need even more lethal AI weapons.
Mutual assured destruction has worked so far for nukes.
I never said that. Please don't reply to comments you made up in your head.
Using AI doesn't automagically equate to winning a war. Using AI could mean the AI kills all your own soldiers by mistake. AI is stupid; it just is. It "hallucinates" and often leads to wrong outcomes. It has never won a war, and there's no guarantee it would help win one.
My point, though, is that killing is the only use case for such systems. The common comparisons to dual-use tools like knives are invalid for this reason.