It’s a brilliant illusion that works… until you hit aliasing, memory models, or concurrency, and suddenly the beautiful DAG collapses into a pile of phi nodes and load/store hell.
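To make the phi-node and load/store point concrete, here's a minimal C sketch (the function and variable names are hypothetical, not from anything above): a loop-carried variable needs a phi node as soon as it's mutated, and a possibly-aliasing pointer forces real stores and reloads instead of clean dataflow edges.

```c
/* Hypothetical sketch: `acc` is mutated across iterations, so its SSA form
 * needs a phi node at the loop header; the store through `out` may alias
 * a[i], so the compiler must emit an actual store and re-load the array
 * element instead of keeping everything as pure edges in the value graph. */
int accumulate(int *a, int *out, int n) {
    int acc = 0;              /* acc0 = 0                                    */
    for (int i = 0; i < n; i++) {
        /* loop header in SSA: acc1 = phi(acc0, acc2), i1 = phi(0, i2)       */
        *out = acc;           /* store acc1; *out may alias a[i]             */
        acc = acc + a[i];     /* acc2 = acc1 + load a[i1] -- the load can't
                                 be assumed untouched by the store above     */
    }
    return acc;               /* the value leaving the loop is the header phi */
}
```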
As some evidence for the second point: Haskell is a language that does enforce immutability, but its compiler, GHC, does not use SSA for its main IR -- it uses a "Spineless Tagless G-machine" graph representation that does, in fact, rely on that immutability. SSA only shows up later, once the program is lowered to a mutating form. If your variables aren't mutated, then you don't even need to transform them into SSA!
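A small illustration of that last sentence, again as a hedged sketch in C rather than anything GHC-specific: a variable that gets reassigned has to be renamed (and, across branches, merged with phi nodes) to reach SSA, while code that binds every name exactly once is already in SSA form.

```c
/* Mutating style: x is assigned twice, so SSA construction must rename it. */
int doubled_sum(int a, int b) {
    int x = a + b;
    x = x * 2;              /* becomes x1 = x0 * 2 after SSA renaming */
    return x;
}

/* Single-assignment style: each name is bound exactly once, so this is
 * already SSA -- there is nothing left for the transformation to do.  */
int doubled_sum_ssa(int a, int b) {
    int x0 = a + b;
    int x1 = x0 * 2;
    return x1;
}
```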
Of course, you're welcome to try something else -- people certainly have. Take a look at how V8's move to Sea-of-Nodes has gone for them.
The author of the Sea-of-Nodes approach is quite critical of V8's decision, as one would expect.