A special carve-out for anonymous apps, available only to people with government connections, doesn't help, because it fingerprints the operative.
Tor was originally a deniable communications tool.
And why is a phone different from a computer? Nobody bats an eye at downloading a program on a computer, or at visiting a website that runs arbitrary code.
The example was recent and very clearly put the developer at personal risk. But there are many gray zones.
An app to decode car diagnostic codes isn't unlawful, but being personally identified could still get you into a lot of trouble with car companies.
And what about making an independent news app in Russia? That's more clearly fine by our morals and laws, but very dangerous for the developer.
It's also really stupid to drive a car in a flood, but we don't have cars check the weather forecast before starting up (maybe I shouldn't post this; it might give someone ideas).
Heck, even one of Google's own apps tracks speed camera and police locations: https://play.google.com/store/apps/details?id=com.waze&hl=en
That's fair, and a better strategy. My comment was about the messaging and strategy the author chose. "People should be able to decide which apps they run" is a more agreeable message than "think about developers who want to avoid the consequences of breaking the law." The latter is an antisocial message that I don't think is as persuasive.
>We all agree that some amount of criminal activity must be tolerated
I don't agree with that, especially with the upcoming rise of AI law enforcement. People can be given much more freedom than a cell while still maintaining effective law enforcement.