If you genuinely believe this, why on earth would you work at OpenAI etc., even in safety/alignment?
The only consistent response, in my view, is to ban the technology outright (as in Dune) or to engage in Unabomber-style acts of terror.
replies(2):
Not far off from the conclusions reached by others who share the same wild assumptions. Yudkowsky, for instance, has argued that a ban on large training runs should be enforced by airstrikes on rogue datacenters, even at the risk of nuclear exchange.