It’s a brilliant illusion that works… until you hit aliasing, memory models, or concurrency, and suddenly the beautiful DAG collapses into a pile of phi nodes and load/store hell.
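As a minimal sketch of the aliasing half of that (Haskell's IORef standing in for mutable memory; the name writeTwice is just for illustration): once two references may denote the same cell, a read is no longer a pure dataflow edge, and the compiler has to keep explicit loads and stores around.

```haskell
import Data.IORef

-- If p and q may alias, the final read cannot be folded away:
-- the observed value depends on whether the second write
-- clobbered the first one's cell.
writeTwice :: IORef Int -> IORef Int -> IO Int
writeTwice p q = do
  writeIORef p 1
  writeIORef q 2   -- may or may not overwrite p's cell
  readIORef p      -- 1 if p and q are distinct, 2 if they alias

main :: IO ()
main = do
  r <- newIORef 0
  s <- newIORef 0
  print =<< writeTwice r s   -- distinct refs: prints 1
  print =<< writeTwice r r   -- aliased refs: prints 2
```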
As some evidence for the second point: Haskell is a language that does enforce immutability, but its compiler, GHC, does not use SSA as its main IR -- it uses a "Spineless Tagless G-machine" graph representation that does, in fact, rely on that immutability. SSA only happens later, once the program is lowered to a mutating form. If your variables are never mutated, you don't even need to transform them into SSA!
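To make that last sentence concrete, a tiny Haskell sketch (the names step and x0..x2 are just for illustration): every binding is defined exactly once, so the single-static-assignment property holds by construction.

```haskell
-- Each binding is defined exactly once; to "update" a value you
-- bind a fresh name, which is exactly the renaming that SSA
-- construction would otherwise have to perform (x -> x0, x1, x2).
step :: Int -> Int
step x0 =
  let x1 = x0 + 1
      x2 = x1 * 2
  in  x2
```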
Of course, you're welcome to try something else; people certainly have -- take a look at how V8's move to Sea-of-Nodes has gone for them.
Of course, LuaJIT is cheating: compared to most compilers, it has redefined the problem down to exactly two control-flow graph shapes (a straight line, and a straight line followed by a loop), so most of the usual awkward parts of SSA simply do not apply. But isn't creatively redefining the problem the software engineer's main tool?