In plainer language, I'd say the observation/motivation is that not only do compiling and linking benefit from incrementality/caching/parallelism, but so does the build system itself. That is, the parsing of the build config, and the transformation of the high-level target graph into the low-level action graph.
So you can implement the build system itself on top of an incremental computation engine.
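To make that concrete, here's a toy Haskell sketch (my own illustration, not how Bazel or Buck actually work) where parsing the config and expanding a target into actions are keys in the same memoizing engine as the compile steps, so repeated requests skip the work. The Key constructors and rule bodies are made up; a real engine would also track changes between runs and persist the cache instead of just memoizing within one run.

    module Main where

    import Data.IORef
    import qualified Data.Map.Strict as Map

    -- Keys cover both the build system's own work (parsing the config,
    -- expanding a target into actions) and the low-level actions.
    data Key
      = ParseBuildFile FilePath   -- read/parse a BUILD-style config
      | ActionGraph    String     -- expand a high-level target into actions
      | Compile        FilePath   -- a low-level action
      deriving (Eq, Ord, Show)

    type Value = String

    -- One cache for everything: config parsing, graph expansion, compilation.
    type Cache = IORef (Map.Map Key Value)

    -- Memoizing fetch: each key is computed at most once per run.
    fetch :: Cache -> Key -> IO Value
    fetch cache key = do
      seen <- readIORef cache
      case Map.lookup key seen of
        Just v  -> pure v
        Nothing -> do
          v <- compute (fetch cache) key
          modifyIORef' cache (Map.insert key v)
          putStrLn ("computed " ++ show key)
          pure v

    -- The rules, including the build system's own phases (all toy logic).
    compute :: (Key -> IO Value) -> Key -> IO Value
    compute fetch' key = case key of
      ParseBuildFile path -> pure ("targets declared in " ++ path)
      ActionGraph target  -> do
        _config <- fetch' (ParseBuildFile "BUILD")   -- depends on config parsing
        pure ("compile main.c; link " ++ target)
      Compile src         -> do
        _plan <- fetch' (ActionGraph "app")          -- depends on graph expansion
        pure ("object file for " ++ src)

    main :: IO ()
    main = do
      cache <- newIORef Map.empty
      _ <- fetch cache (Compile "main.c")
      _ <- fetch cache (Compile "main.c")            -- second request is a cache hit
      pure ()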
Also, the way I think about the additional dependencies that monadic build systems allow is basically #include scanning. It's common to complain that Bazel forces you to duplicate dependency info in BUILD files, when that info is already present (in some possibly sloppy form) in the header files.
So maybe they can allow execution of the preprocessor to feed back into the shape of the target graph or action graph, although I wonder what effect that has on performance.
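Concretely, the monadic part is just that a rule can fetch a file, look at its contents, and only then decide what else it needs, which is #include scanning. Here's a rough Haskell sketch of that pattern; the Task type is in the spirit of the build-systems papers, while compileRule, scanIncludes, and the in-memory fileSystem are made up for illustration.

    {-# LANGUAGE RankNTypes #-}
    module Main where

    import Data.List (isPrefixOf)
    import Data.Maybe (mapMaybe)

    -- A build task: given a way to fetch any key's value, produce a value.
    -- Requiring only Applicative would force the dependency list to be fixed
    -- up front; Monad lets fetched contents determine the next fetches.
    newtype Task k v = Task { run :: forall m. Monad m => (k -> m v) -> m v }

    -- Rule for an object file: fetch the source, scan it for #include lines,
    -- then fetch each discovered header before "compiling".
    compileRule :: FilePath -> Task FilePath String
    compileRule src = Task (\fetch -> do
      source  <- fetch src
      headers <- mapM fetch (scanIncludes source)
      pure ("object code from " ++ src ++ " plus "
            ++ show (length headers) ++ " headers"))

    -- Very crude #include "..." scanner, just for illustration.
    scanIncludes :: String -> [FilePath]
    scanIncludes = mapMaybe include . lines
      where
        include l
          | "#include \"" `isPrefixOf` l = Just (takeWhile (/= '"') (drop 10 l))
          | otherwise                    = Nothing

    -- Toy in-memory "file system" so the example runs without touching disk.
    fileSystem :: FilePath -> IO String
    fileSystem "main.c" = pure "#include \"util.h\"\nint main(void) { return 0; }\n"
    fileSystem path     = pure ("// contents of " ++ path ++ "\n")

    main :: IO ()
    main = run (compileRule "main.c") fileSystem >>= putStrLn

With an applicative interface the header list would have to be written down before running anything, which is exactly the duplication people complain about in BUILD files.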
---
The point about Java vs. Rust is interesting too -- Java doesn't have async/await or coroutines.
I would have thought you'd give up some control over when things run with async/await, but maybe not... I'd like to see how they schedule the tasks.
Implementing Applicative Build Systems Monadically
https://ndmitchell.com/downloads/paper-implementing_applicat...