
317 points thunderbong | 5 comments
kibwen No.42203962
Hyrum's Law is one of those observations that's certainly useful, but be careful not to fixate on it and draw the wrong conclusions. Consider that even the total runtime of a function is an observable property, which means that optimizing a function to make it faster is a breaking change (what if suddenly one of your queues clears too fast and triggers a deadlock??), despite the fact that 99.99999999% of your users would probably appreciate having code that runs faster for no effort on their part.

Therefore it's unavoidable that what constitutes a "breaking change" is a social contract, not a technical contract, because the alternative is that literally nothing is ever allowed to change. So as a library author, document what parts of your API are guaranteed not to change, be reasonable, and have empathy for your users. And as a library consumer, understand that making undocumented interfaces into load-bearing constructs is done at your own risk, and have empathy for the library authors.

1. ljm No.42206063
I feel like this is approaching absurdity, if only because something like the total runtime of a function is not under the control of the author of the function. The operating environment will have an impact on that, for example, as will the load the system is currently experiencing. A GC pass can affect it.

In short, I wouldn't consider the emergent behaviours of a machine part of an intentional interface or any kind of contract, and therefore I wouldn't see a change to them as a breaking change, just as fixing a subtle bug in a function wouldn't be seen as a breaking change even if someone depended on the unintentional behaviour.

I think it's more of a testament to Go's hardcore commitment to backwards compatibility, in this case, than anything else.

2. skybrian No.42206363
Yes, it’s an absurd example to make a point. We don’t normally consider performance in scope for what’s considered a breaking API change and there are good reasons for that, including being non-portable. Performance guarantees are what hard real-time systems do and they’re hardware-specific. (There is also the “gas fee” system that Ethereum has, to make a performance limit consistent across architectures.)

But there are still informal limits. If the performance impact is bad enough (say, 5x slower, or changing a linear algorithm to a quadratic one), it's probably going to be reverted anyway. We just don't have any practical way of formalizing rough performance guarantees at an API boundary.

3. AlotOfReading No.42206407
It's quite common in cryptography for the runtime to be important. For example, password verification time shouldn't depend on the value of the key or the password. Systems have been broken because someone wrote a string compare that returned early.
4. kibwen No.42206959
> If the performance impact is bad enough

Even worse, it's possible to select a new algorithm that improves the best-case and average-case runtimes while degrading the worst-case runtime, so no matter what you do it will punish some users and reward others.

5. ljm No.42207537
And, since most languages short-circuit on basic string comparisons, you'd have some form of `secure_compare` function that compares two strings in constant time, with that behaviour made part of the contract by the function's name.

Nobody rewrites `==` to compare strings in constant time, not because it would break some kind of API contract, but because it would waste a massive amount of CPU time. The point, though, is that they could; they would just be sacrificing performance everywhere to solve this one problem.

Crypto is obviously a case of its own when it comes to optimisations, and as much as I called out the parent for approaching the absurd, we can pull out many similar special cases of our own.