611 points by LorenDB | 9 comments

dvratil:
The one thing that sold me on Rust (coming from C++) was that there is a single way errors are propagated: the Result type. No need to bother with exceptions, functions returning bool, functions returning 0 on success, functions returning 0 on error, functions returning -1 on error, functions returning negative errno on error, functions taking an optional pointer to bool to indicate an error (optionally), functions taking a reference to std::error_code to set an error (and having an overload with the same name that throws an exception on error if you forget to pass the std::error_code)... I understand there's 30 years of history, but it is still annoying that even the standard library is not consistent (or striving for consistency).

Then you top it off with the `?` shortcut and the functional interface of Result, and suddenly error handling becomes fun and easy to deal with, rather than just "return false" with a "TODO: figure out error handling".
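
To make that concrete, here is a minimal sketch of the pattern (the function, file name, and error enum are all invented for illustration):

```rust
use std::fs;
use std::num::ParseIntError;

// A hand-rolled error enum; crates like thiserror trim this
// boilerplate further, but the plain version shows the mechanics.
#[derive(Debug)]
enum ConfigError {
    Io(std::io::Error),
    Parse(ParseIntError),
}

impl From<std::io::Error> for ConfigError {
    fn from(e: std::io::Error) -> Self {
        ConfigError::Io(e)
    }
}

impl From<ParseIntError> for ConfigError {
    fn from(e: ParseIntError) -> Self {
        ConfigError::Parse(e)
    }
}

// Every fallible step returns a Result, and `?` propagates the
// error upward (converting via the From impls above) instead of
// a sentinel value or an out-parameter.
fn read_port(path: &str) -> Result<u16, ConfigError> {
    let text = fs::read_to_string(path)?; // io::Error -> ConfigError
    let port = text.trim().parse::<u16>()?; // ParseIntError -> ConfigError
    Ok(port)
}

fn main() {
    match read_port("port.conf") {
        Ok(p) => println!("port {p}"),
        Err(e) => eprintln!("error: {e:?}"),
    }
}
```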

jeroenhd:
The Result type does make for some great API design, but SerenityOS shows that the same paradigm also works fine in C++. That includes something similar to the ? operator, though it's closer to a raw function call.

SerenityOS is the first functional OS (as in "boots on actual hardware and has a GUI") I've seen that dares to question the 1970s int main(), using modern C++ constructs instead, and the API is simply a lot better.
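
For comparison, Rust itself made the same break with the 1970s entry point: main can return a Result, so `?` works even at the top level. A minimal sketch (the config file name is made up):

```rust
use std::error::Error;
use std::fs;

// On Err, the runtime prints the error and exits with a nonzero
// status; no manual exit-code plumbing required.
fn main() -> Result<(), Box<dyn Error>> {
    let config = fs::read_to_string("app.conf")?; // made-up input file
    println!("read {} bytes of config", config.len());
    Ok(())
}
```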

I can imagine someone writing a better standard library for C++ that works a whole lot like Rust's standard library does. Away with the archaic integer types; make use of the power your language offers!

If we're comparing C++ and Rust, I think the ease of use of enum classes/structs is probably a bigger difference. You can get pretty close, but Rust avoids a lot of boilerplate, which makes its enums quite usable, especially when combined with the match keyword.
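
A rough sketch of what that looks like on the Rust side (the type and variant names are invented for the example); the equivalent with C++ std::variant and std::visit needs noticeably more ceremony:

```rust
// A sum type carrying data per variant: no tag field, no union,
// no visitor boilerplate.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

fn area(s: &Shape) -> f64 {
    // match is exhaustive: forgetting a variant is a compile error.
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

fn main() {
    let shapes = [Shape::Circle { radius: 1.0 }, Shape::Rect { w: 2.0, h: 3.0 }];
    for s in &shapes {
        println!("area = {:.2}", area(s));
    }
}
```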

I think C++, the language, is ready for the modern world. However, C++, the community, seems to be stuck at least 20 years in the past.

jll29:
> I think C++, the language, is ready for the modern world. However, C++, the community, seems to be stuck at least 20 years in the past.

Good point. A language that keeps adding features is DIVERGING from a community that mostly consists of people who still use a lot of the C baggage in C++, with only a few folks at the other end of the spectrum who use heavy template abstraction.

Since larger systems will want to reuse a lot of code via open-source libraries, one is inevitably stuck not just in one past but in several versions of older C++, depending on when the code being reused was written, which C++ standard was stable enough at the time, and which parts of it the author adopted.

Not to speak of the paradigm choice to be made (object-oriented versus functional versus generic programming with templates).

It's easier to have, as Rust offers, a single way of doing things properly. (But what I miss in Rust is a single streamlined standard library - an organized class library like Java has had from its early days - instead it feels like "a pile of crates".)

pjmlp:
Just give Rust 36 years of field use, to see how it goes.
timschmidt:
36 years is counting from the first CFront release. Counting the same way for Rust, it's been around since 2006. It's got almost 20 years under its belt already.

edit: what's with people downvoting a straight fact?

d_tr:
Rust 0.1, the first public release, came out in January 2012. CFront 1.0, the first commercial release, came out in 1985.

Rust has existed publicly for 13 years, during which computing has not changed that much, to be honest. Now compare that to the prehistory that was 1985, when CFront came out, already built for backwards compatibility with C.

timschmidt:
I grew up with all the classic 8-bit micros, and to be honest, it doesn't feel like computing has changed at all since 1985. My workstation, while a billion times faster, is still code compatible with a Datapoint 2200 from 1970.

The memory model, interrupt model, packetized networking, digital storage, all function more or less identically.

In embedded, I still see Z80s and M68ks like nothing's changed.

I'd love to see more concrete implementations of adiabatic circuits, weird architectures like the Mill, integrated FPGAs, etc. HP's The Machine effort was a rare exciting new thing until they walked back all the exciting parts. CXL seems like about the most interesting new thing in a while.

qznc:
Today a byte is 8 bits. Back then, for example, that was not always the case.
timschmidt:
> I grew up with all the classic 8-bit micros

Meaning that all the machines I've ever cared about have had 8-bit bytes: the TI-99/4A, TRS-80, Commodore 64 and 128, Tandy 1000 8088, Apple ][, Macintosh Classic, etc.

Many were launched in the late 70s. By 1985 we were well into the era of PC compatibles.

mazurnification:
Does the GPU thingy count as something that has changed in computing?
timschmidt:
Yeah, I almost called that out. Probably should have. GPU/NPU feels new (at least for us folks who could never afford a Cray). Probably the biggest change in the last 20 years, especially if you classify it with other multi-core development.
bluGill:
In 1985 PC compatibles were talked about, but systems like the VAX and mainframes were still very common and considered the real computers, while PCs were toys for executives. PCs had already shown enough value (via word processors and spreadsheets) that everyone knew they were not going away, but PCs lacked things like multitasking, which even then "real" computers had had for decades.
timschmidt:
> In 1985 PC compatibles were talked about

My Tandy 1000 (https://en.wikipedia.org/wiki/Tandy_1000) came out in 1984, and it was a relatively late entry to the market: it was near-peak 8088, with what was considered high-end graphics and sound for the day, far better than the IBM PC, which debuted in 1981 and only lasted until 1987.

adolph:
It may go on to be as important as the FPU [0]. Amazingly enough, you can still get one for a Classic II [1].

0. https://en.wikipedia.org/wiki/Floating-point_unit

1. https://www.tindie.com/products/jurassicomp/68882-fpu-card-f...