I understand the sentiment, but I think it misses a few things.
We have far more isolation between software components, and we have cryptography that would have been impractical to compute decades ago, now used both at rest and on the wire. All of that comes at a real cost. On modern hardware it might only be a few percent of performance, and therefore easy to justify, but it would have been a much larger fraction a few decades ago.
Another thing that’s rarely considered is the scale of the data. Yes, software is slower, but it’s processing far more data. A video file today might be 4K, where decades ago it might have been 240p; it’s also far more heavily compressed, so that file sizes haven’t grown linearly with pixel count. The simple act of playing back a video takes far more processing than it used to.
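To put a rough number on that scale difference, here’s some back-of-envelope pixel math (frame dimensions only; I’m assuming 426×240 as a 16:9 “240p”, and ignoring bit depth, frame rate, and codec work, which only push the gap further):

```python
# Per-frame pixel counts: 4K UHD vs 240p (assumed 16:9 at 426x240).
w_4k, h_4k = 3840, 2160
w_240, h_240 = 426, 240

pixels_4k = w_4k * h_4k    # 8,294,400 pixels per frame
pixels_240 = w_240 * h_240 # 102,240 pixels per frame

# Roughly 81x more pixels to decode and display per frame.
print(pixels_4k / pixels_240)
```

And that’s before accounting for the heavier compression: more pixels per frame *and* more decode work per pixel.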
Lastly, the focus on dynamic languages is often either misinformed or purposely misleading. LLM training is mostly orchestrated from Python, and it’s some of the most performance-sensitive work being done right now. That works, of course, because the actual training isn’t executing in a Python VM. The same is true of so much “dynamic language” code: the heavy lifting is done elsewhere, and the actual gain from rewriting the Python layer in C++ or similar would often be minimal. This varies, of course, but it’s not something I see acknowledged in these oversimplified arguments.
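You can see the “heavy lifting is done elsewhere” effect without leaving the standard library. This is just an illustrative micro-benchmark, not a rigorous one: a hand-written Python loop versus the built-in `sum`, which is implemented in C inside CPython. The same pattern is why NumPy, PyTorch, etc. are fast despite being driven from Python.

```python
import timeit

data = list(range(1_000_000))

def pure_python_sum(xs):
    # The hot loop runs entirely in the Python VM.
    total = 0
    for x in xs:
        total += x
    return total

# Same result either way; very different execution paths.
t_python = timeit.timeit(lambda: pure_python_sum(data), number=5)
t_builtin = timeit.timeit(lambda: sum(data), number=5)  # loop runs in C

print(f"Python loop: {t_python:.3f}s, builtin sum: {t_builtin:.3f}s")
```

The Python-level code here is just dispatch; rewriting that thin layer in C++ buys you almost nothing, because the expensive part already left the VM.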
Requirements have changed and software has to do far more; we’re kidding ourselves if we think the comparison is apples to apples. That’s not to say we shouldn’t reduce waste, we should! But dismissing modern software engineering because of dynamic languages and the like is naive.