Of course, the article doesn't mention lambdas.
Why go through all the trouble of making a better array, and then require the user to call a special .at() function to get range checking rather than the other way around? I promptly went into my standard library and reversed that decision, because if I'm going to the trouble of using a C++ array class, it better damn well give me a tiny bit of additional protection. The .at() call should have been the version that reverted to C array behavior without the bounds checking.
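For anyone who hasn't run into the distinction being argued about, here's a minimal sketch of how the two accessors behave (the variable names and indices are just illustrative):

    #include <array>
    #include <iostream>
    #include <stdexcept>

    int main() {
        std::array<int, 3> a{10, 20, 30};

        // operator[]: no bounds check; an out-of-range index is undefined behavior.
        std::cout << a[1] << '\n';      // fine
        // std::cout << a[7] << '\n';   // compiles, but UB at runtime

        // .at(): bounds-checked; an out-of-range index throws std::out_of_range.
        try {
            std::cout << a.at(7) << '\n';
        } catch (const std::out_of_range& e) {
            std::cout << "caught: " << e.what() << '\n';
        }
    }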
And it's these kinds of decisions, repeated over and over. I get that it's a committee, and some of the decisions won't be the best. But by 2011 everyone had already been complaining about memory safety issues for 15+ years, and there wasn't enough political will on the committee to recognize that a big reason for using C++ over C was the ability of the language to protect against some of the sharper edges of C?
[1] Not me making this up - I started getting into guns and this is what people say.
Let no one accuse the committee of being unresponsive.
Because the point was not to make an array type that's safe by default, but rather to make an array type that behaves like an object: one that can be returned, copied, compared, and so on. I mean, I agree with you, I think operator[]() should range-check by default, but you're simply misunderstanding the rationale for the class.
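A rough illustration of the "behaves like an object" point, for anyone who hasn't compared the two; the helper names here are made up for the example:

    #include <array>
    #include <cstddef>
    #include <iostream>

    // A C array can't be returned by value, and it decays to a pointer
    // when passed to a function, losing its size in the process.
    std::size_t decayed_size(int* p) { return sizeof(p); }  // size of a pointer, not the array

    // std::array is a regular object: it can be returned, copied, and compared.
    std::array<int, 3> make_triple() { return {1, 2, 3}; }

    int main() {
        int c_arr[3] = {1, 2, 3};
        std::cout << decayed_size(c_arr) << '\n';       // pointer size, e.g. 8

        std::array<int, 3> a = make_triple();           // returned by value
        std::array<int, 3> b = a;                       // copied
        std::cout << (a == b) << ' ' << a.size() << '\n';  // 1 3
    }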
On a complete tangent, I think that "smart guns" that only let you shoot bullseye targets, animals, and designated un-persons are not far off.
And that was my point in providing a concrete example: a decision was made to prioritize unsafe behavior in a known problematic area, when they could just as well have made any of a half dozen other decisions that would have solved a long-standing problem, rather than just perpetuating it with some new syntactic sugar.
That said, making std::array::operator[]() range-checking would have been worse, because it would have been the only container operator[] in the standard library that did so. Could they have, in the same version, made all of the containers' operator[] range-checking? Maybe, I don't know.
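For context on that consistency argument: the existing containers already followed the same split, with operator[] unchecked and .at() checked, as in this small sketch:

    #include <iostream>
    #include <stdexcept>
    #include <vector>

    int main() {
        std::vector<int> v{1, 2, 3};

        // Same convention as std::array: operator[] is unchecked,
        // while .at() throws std::out_of_range on a bad index.
        try {
            v.at(10);
        } catch (const std::out_of_range&) {
            std::cout << "vector::at is checked\n";
        }
        // v[10] would compile but be undefined behavior.
    }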