
125 points by todsacerdoti | 1 comment
1. avidiax No.45040821
> we can estimate whether an optimization is beneficial before implementing it. Coz simulates it by slowing down all other program parts. The part we think might be optimizable stays at the same speed it was before, but is now virtually sped up, which allows us to see whether it gives enough of a benefit to justify a perhaps lengthy optimization project.

Really interesting idea. I suppose the underlying insight is that speeding up a function can expose a new performance floor, much higher than the local speedup would suggest. Spending weeks halving a function's runtime only to discover that overall processing time barely improved, because another function is now the bottleneck, would indeed be quite bad.
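The "new floor" effect is just Amdahl's law. A quick back-of-the-envelope check (the 30% / 2x numbers are made up for illustration):

```python
def overall_speedup(part_fraction, part_speedup):
    """Amdahl's law: overall speedup when a part accounting for
    `part_fraction` of total runtime is sped up by `part_speedup`x."""
    return 1.0 / ((1.0 - part_fraction) + part_fraction / part_speedup)

# Halving (2x) a function that accounts for 30% of runtime only
# speeds up the whole program by 1 / (0.7 + 0.15) ~= 1.18x.
print(overall_speedup(0.3, 2.0))
```

So even a heroic 2x optimization of that function buys less than 20% overall, which is exactly the kind of outcome you'd want to predict before starting the work.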

Not sure how this handles I/O, or whether simply delaying the results of I/O calls is enough to simulate decreased bandwidth or increased latency.
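For what it's worth, the quoted trick of "slow everything else down instead of speeding the target up" can be sketched with a toy two-part cost model. This is not how Coz is actually implemented (it inserts pauses into running threads at runtime); it just shows why the rescaling math works. All numbers and names here are invented:

```python
def real_optimization(t_target, t_other, speedup):
    """Runtime if we actually made the target `speedup` fraction faster."""
    return t_target * (1.0 - speedup) + t_other

def virtual_speedup(t_target, t_other, speedup):
    """Coz-style prediction: leave the target's cost unchanged, inflate
    everything else by 1/(1-speedup), then rescale the total back down.
    On the inflated timescale the target looks (1-speedup)x cheaper,
    so the rescaled total predicts the real optimization's runtime."""
    virtual_total = t_target + t_other / (1.0 - speedup)
    return virtual_total * (1.0 - speedup)

# Target takes 3s, the rest takes 7s; predict a 50% target speedup.
# Both paths agree: 3*0.5 + 7 == (3 + 7/0.5) * 0.5 == 8.5
print(real_optimization(3.0, 7.0, 0.5))
print(virtual_speedup(3.0, 7.0, 0.5))
```

The point of the indirection is that you can apply the slowdown without knowing how to optimize the target, which is what makes the prediction cheap.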