Probably a big reason he's seeing slowdowns in incremental builds with MSVC is link-time code generation. What looks like link time is actually code generation time, deferred to the link step so that interprocedural (cross-module) optimizations can be run. That kills off most of the benefit of incremental building - you're basically only saving parsing and type analysis - and you redo a lot of the code generation work for every modification.
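Roughly the two build setups I'm contrasting, as a sketch (flags from memory, file names made up, so check against your toolchain): with /GL the .obj files don't contain final code, and the /LTCG link step is where code generation actually happens; without it, each .obj already holds generated code, so an incremental relink has far less work to redo.

    rem LTCG build: codegen is deferred to the /LTCG link step
    cl /c /O2 /GL a.cpp b.cpp
    link /LTCG /OUT:app.exe a.obj b.obj

    rem non-LTCG build: each .obj already contains final code,
    rem so only the changed translation units cost anything
    cl /c /O2 a.cpp b.cpp
    link /INCREMENTAL /OUT:app.exe a.obj b.obj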
NTFS also fragments very badly when free space is fragmented. If you don't liberally use SetFilePointer / SetEndOfFile to preallocate space, it's very common for large files written incrementally to end up with thousands, or tens of thousands, of fragments. Lookup (rather than listing) on massive directories can be fairly good though - B-trees are used behind the scenes - presuming the backing storage isn't fragmented, which again is not a trivial assumption unless you continuously run a semi-decent defragmenter, like Diskeeper.
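Something like this is what I mean by using them liberally - a minimal Win32 sketch (file name and size are made up, and I'm using the SetFilePointerEx variant for the 64-bit offset): set the end of file to roughly the final size up front so NTFS can allocate contiguous extents, then seek back and write incrementally as usual.

    /* Preallocate a file's space before writing it incrementally,
       so NTFS doesn't grow it in many small, scattered extents. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* assume we know roughly how big the file will end up */
        const LONGLONG expectedSize = 256LL * 1024 * 1024;

        HANDLE h = CreateFileA("output.dat", GENERIC_READ | GENERIC_WRITE,
                               0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
        if (h == INVALID_HANDLE_VALUE) {
            fprintf(stderr, "CreateFile failed: %lu\n", GetLastError());
            return 1;
        }

        /* move the file pointer to the expected final size and
           set the end of file there to reserve the space */
        LARGE_INTEGER size;
        size.QuadPart = expectedSize;
        if (!SetFilePointerEx(h, size, NULL, FILE_BEGIN) || !SetEndOfFile(h)) {
            fprintf(stderr, "preallocation failed: %lu\n", GetLastError());
            CloseHandle(h);
            return 1;
        }

        /* seek back to the start; incremental writes go here, and if the
           estimate was too big, trim with SetEndOfFile at the real size
           before closing */
        LARGE_INTEGER zero = {0};
        SetFilePointerEx(h, zero, NULL, FILE_BEGIN);

        CloseHandle(h);
        return 0;
    }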