93 points endorphine | 1 comment
ajross ◴[] No.43537017[source]
This headline is badly misunderstanding things. C/C++ date from an era where "correctness" in the sense the author means wasn't a feasible feature. There weren't enough cycles at build time to do all the checking we demand from modern environments (e.g. building medium-scale Rust apps on a Sparcstation would be literally *weeks* of build time).

And more: the problem faced by the ANSI committee wasn't that they were tempted to "cheat" by leaving behavior undefined. It's that there was live C code in the world that did this stuff, for real and valid reasons, and they knew that if they published a language that wasn't compatible with it, no one would use it. But there were also variant platforms and toolchains that didn't do things the same way. So instead of trying to enumerate them all individually (which probably wasn't possible anyway), they identified the areas where they knew they could define firm semantics and allowed the stuff outside that boundary to be "undefined", so existing environments could continue to implement them compatibly.

Is that a good idea for a new language? No. But ANSI wasn't writing a new language. They were adding features to the language in which Unix was already written.

replies(5): >>43537270 #>>43537327 #>>43537466 #>>43537560 #>>43537849 #
rocqua ◴[] No.43537466[source]
> So instead of trying to enumerate them all individually (which probably wasn't possible anyway), they identified the areas where they knew they could define firm semantics and allowed the stuff outside that boundary to be "undefined", so existing environments could continue to implement them compatibly.

These things didn't become undefined behavior. They became implementation-defined behavior. The distinction is that for implementation-defined behavior, a compiler has to make a choice and apply it consistently.

The classic example of implementation-defined behavior is one's-complement vs. two's-complement representation of signed integers. I believe right-shifting a negative signed int is also implementation-defined.

For implementation-defined behavior, the "assume it never happens" optimization isn't allowed by the standard.
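
To make that distinction concrete, here's a minimal C sketch (the function names are mine, not from the thread; the folding behavior described is typical of GCC and Clang at -O2 but not guaranteed):

    /* Signed overflow is undefined behavior, so a compiler may assume it
       never happens: optimizers typically fold this whole function to
       "return 0", because x + 1 < x can only be true if the signed
       addition overflowed. */
    int wraps(int x) {
        return x + 1 < x;
    }

    /* Right-shifting a negative signed value is implementation-defined
       (C99 6.5.7p5): the implementation must pick a result and document
       it (in practice, usually an arithmetic shift), and it cannot
       assume this case never occurs. */
    int halve(int x) {
        return x >> 1;
    }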

replies(1): >>43537696 #
bluGill ◴[] No.43537696[source]
They did have implementation-defined behavior, but a large part of undefined behavior was exactly that: never defined anywhere, and it could always have been promoted to implementation-defined if they had thought to mention it.
replies(1): >>43538472 #
moefh ◴[] No.43538472[source]
I don't doubt that what you're saying is true; I have heard similar things many, many times over the years. The problem is that it's always stated somewhat vaguely, never with concrete examples, and it doesn't match my (perhaps naive) reading of any of the standards.

For example, I just checked C99[1]: it says in many places "If <X>, the behavior is undefined". It also says in even more places "<X> is implementation-defined" (although from my cursory inspection, most -- but not all -- of these seem to be about the behavior of library functions, not the compiler per se).

So it seems to me that the standards writers were actually very particular about the difference between implementation-defined behavior and undefined behavior.

[1] https://port70.net/~nsz/c/c99/n1256.html
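
For instance, here's a small sketch pairing one clause of each kind from C99 (the function names are mine; the quoted clause wording is paraphrased):

    /* C99 6.5.7p3: if the shift amount is negative or greater than or
       equal to the width of the promoted left operand, "the behavior is
       undefined". */
    unsigned int shift_too_far(unsigned int x, int n) {
        return x << n;   /* undefined if n >= width of unsigned int */
    }

    /* C99 6.3.1.3p3: converting an out-of-range value to a signed
       integer type: "either the result is implementation-defined or an
       implementation-defined signal is raised". */
    int narrow(unsigned long v) {
        return (int)v;   /* implementation-defined when v > INT_MAX */
    }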

replies(2): >>43539071 #>>43539082 #
1. jcranmer ◴[] No.43539082[source]
I think bluGill might be referring to cases of undefined behavior that are undefined because the specification literally never mentions the behavior, as opposed to cases where it explicitly says the behavior is undefined.

My canonical example of such a case is what happens if you call qsort where the comparison function is "int compare(const void*, const void*) { return 1; }".
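
A minimal, self-contained sketch of that call (the array contents are arbitrary; what actually happens varies by implementation):

    #include <stdio.h>
    #include <stdlib.h>

    /* An inconsistent comparator: it claims a > b for every pair, even
       when the arguments are swapped. The standard describes qsort's
       behavior for a consistent ordering but never says what happens
       here, so it's undefined by omission: real implementations may
       return a scrambled array, loop forever, or read out of bounds. */
    static int compare(const void *a, const void *b) {
        (void)a;
        (void)b;
        return 1;
    }

    int main(void) {
        int values[] = {3, 1, 4, 1, 5, 9, 2, 6};
        size_t n = sizeof values / sizeof values[0];

        qsort(values, n, sizeof values[0], compare);

        for (size_t i = 0; i < n; i++)
            printf("%d ", values[i]);
        printf("\n");
        return 0;
    }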