Zero initializing often hides real and serious bugs, however. Say you have a function with an internal variable LEN that ought to get set to some dynamic length that internal operations will run over. A change to the code introduces a path which skips setting LEN. Current compilers will (very likely) warn you about the potentially uninitialized use, valgrind will warn you (assuming the path gets triggered), and failing all that the program may well crash when some large garbage value ends up in LEN -- alerting you to the issue.
Compare with default zero init: The compiler won't warn you, valgrind won't warn you, and the program won't crash. It will just be silently wrong in many cases (particularly for length/count variables).
Generally, attention to exploit safety can sometimes push us in directions that are bad for program correctness. There are many places where exploit safety is important, but also many cases where it's irrelevant. For security it's generally 'safe' if a program erroneously shuts down or does less than it should, but that is far from true for software generally.
I prefer this behavior: use of an uninitialized variable is an error which the compiler will warn about; however, in code where the compiler cannot prove the variable is never used uninitialized, the compiler's behavior is implementation defined and can include trapping on use, initializing to zero, or initializing to ~0 (the complement of zero) or some other likely-to-crash pattern. The developer may annotate with _noinit, which makes any use UB and avoids the cost of inserting a trap or ~0 initialization. ~0 init will usually fail, but seldom in a silent way, so hopefully at least any user reports will be reproducible.
Similar to RESTRICT, _noinit is a potential footgun, but its usage would presumably be quite rare and confined to carefully maintained performance-critical code. Code using _noinit, like code using RESTRICT, is at least still more maintainable than assembly.
This approach preserves the compiler's ability to detect programmer error, and lets the implementation pick the preferred way to handle the remaining error. In some contexts it's preferable to trap cleanly or crash reliably (init to ~0 or an explicit trap); in others it's better to be silently wrong (init to 0).
Since C99 lets you declare variables wherever, it is often easy to just declare a variable where it is first set -- and that's probably best, of course... when you can.