That said, I think the final takeaway is that the goal is a system that lets you pin versions, vendor all those dependencies, and resolve/reproduce the same file tree regardless of whose machine it's on (let's assume matching architectures for simplicity here).
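To make that concrete, here's a minimal sketch of what "pin + vendor + reproduce" reduces to, assuming a hypothetical `vendor/` directory and `deps.lock` file (neither is from the article): record a content hash for every vendored file at pin time, then verify the checked-out tree matches on any machine.

```python
import hashlib
import json
import sys
from pathlib import Path

VENDOR_DIR = Path("vendor")    # hypothetical: where dependencies are vendored
LOCK_FILE = Path("deps.lock")  # hypothetical: pinned content hashes

def tree_hashes(root: Path) -> dict[str, str]:
    """Map each vendored file (relative path) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def pin() -> None:
    """Record the current vendored tree as the pinned state."""
    LOCK_FILE.write_text(json.dumps(tree_hashes(VENDOR_DIR), indent=2))

def verify() -> None:
    """Fail if the vendored tree differs from the pinned state."""
    pinned = json.loads(LOCK_FILE.read_text())
    if pinned != tree_hashes(VENDOR_DIR):
        sys.exit("vendor/ does not match deps.lock")

if __name__ == "__main__":
    pin() if sys.argv[1:] == ["pin"] else verify()
```

Real tools layer proper version resolution on top of this, but the reproducibility guarantee bottoms out in the same idea: a committed lockfile plus a content-addressed check of the vendored tree.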
Note that if we remove 'manually' here, this still works:
> Copying and vendoring each package {manually}, and fixing the specific versions down is the most practical approach to keeping a code-base stable, reliable, and maintainable.
The article's emphasis on the manual aspect of dependency management is a bit of a loss, as I don't believe it _has to be manual_ in the sense of hand-copying files from their origin into your file tree; that is certainly a real-world option, but few (myself included) would take that monk-like path again. I left this exact situation behind in C land and would not consider going back unless I were adopting something like ninja.
What the OP is actually describing is a "good" package manager feature set, and many (sadly not most, let alone all) package managers do support exactly this today.
PS: I did chuckle when they defined evil as anything that gets you to dependency hell faster. Still, we shouldn't be advocating for repeating the sins of our fathers.