
364 points Klasiaster | 2 comments
akira2501 ◴[] No.41851912[source]
I personally dislike rust, but I love kernels, and so I'll always check these projects out.

This is one of the nicer ones.

It looks pretty conservative in its use of Rust's advanced features. The code looks pretty easy to read and follow. There's actually a decent amount of comments (for Rust code).

Not bad!

replies(2): >>41852794 #>>41855386 #
wg0 ◴[] No.41855386[source]
Otherwise it's a decent language, but what makes it difficult are the borrow semantics and lifetimes. Lifetimes in particular are hard to get your head around.

But then there's this Arc, Ref, Pinning and what not - how deep is that rabbit hole?

replies(5): >>41855987 #>>41855995 #>>41856204 #>>41856306 #>>41856588 #
oersted ◴[] No.41856306[source]
I don’t entirely agree; you can get used to the borrow checker relatively quickly, and then you mostly stop thinking about it.

What tends to make Rust complex is advanced use of traits, generics, iterators, closures, wrapper types, async, error types… You start getting these massive semi-autogenerated nested types, the syntax sugar starts generating complex logic for you in the background that you cannot see but have to keep in mind.

It’s tempting to use the advanced type system to encode and enforce complex API semantics, using Rust almost like a formal verifier / theorem prover. But things can easily become overwhelming down that rabbit hole.
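
A minimal sketch of the "massive semi-autogenerated nested types" meant here (the pipeline and values are my own illustration, not from the project under discussion):

```rust
// Each iterator adapter wraps the previous one in a new type, so the
// value below has a type roughly like
//   Map<Filter<std::slice::Iter<'_, i32>, {closure#0}>, {closure#1}>
// You rarely write this type out, but it surfaces in compiler errors
// and trait bounds, which is where the hidden complexity shows up.
fn main() {
    let data = vec![1, 2, 3, 4, 5];
    let pipeline = data
        .iter()
        .filter(|&&x| x % 2 == 1) // Filter<Iter<'_, i32>, {closure#0}>
        .map(|&x| x * 10);        // Map<Filter<...>, {closure#1}>
    // To return such a pipeline from a function you need either
    // `impl Iterator<Item = i32>` or `Box<dyn Iterator<Item = i32>>`,
    // precisely because the concrete type is unwieldy.
    let result: Vec<i32> = pipeline.collect();
    assert_eq!(result, vec![10, 30, 50]);
    println!("{:?}", result);
}
```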

replies(1): >>41859789 #
jonathanstrange ◴[] No.41859789[source]
It's just overengineered. Many Rust folks don't realize it because they come from C++ and suffer from Stockholm Syndrome.
replies(1): >>41869044 #
junon ◴[] No.41869044[source]
How is it overengineered?
replies(1): >>41877577 #
jonathanstrange ◴[] No.41877577[source]
That's my personal opinion after I've learned it and read Klabnik's book. I'm aware that other people's mileage differs. I'm listing a few reasons below.

- Overall too complex

- Wrong philosophy: demanding the user to solve problems instead of solving problems for the user

- Trying to provide infinite backwards compatibility with crates, which leads to hidden bitrot

- Slow compilation times

- Claims to be "safe" but allows arbitrary unsafe code, and it's everywhere.

- Adding features to fix misfeatures (e.g. all that lifetime cruft; arc pointers) instead of fixing the underlying problem

- Hiding implementations with leaky abstractions (traits)

- Going at great length to avoid existing solutions so users re-invent it (e.g. OOP with inheritance; GC), or worse, invent more complex paradigms to work around the lack (e.g. some Rust GUI efforts; all those smart pointer types to work around the lack of GC)

- A horrendous, convoluted syntax that encourages bad programming style: lots of unwrap, and_then, etc., which makes programs hard to read and audit.

- Rust's safe code is not safe: "Rust’s safety guarantees do not include a guarantee that destructors will always run. [...] Thus, allowing mem::forget from safe code does not fundamentally change Rust’s safety guarantees."

It already has complexity and cognitive demands similar to C++'s, and it's going to get worse. IMHO, that's also why it's popular. Programmers love shitty languages that allow them to show off. Boring is good.

replies(2): >>41883465 #>>41904646 #
j-krieger ◴[] No.41904646[source]
> Claims to be "safe" but allows arbitrary unsafe code, and it's everywhere.

Sigh. This is not true. Not the first part, and especially not the last part. `unsafe` doesn't allow arbitrary unsafe code. It drops the compiler's checks to the level where most manually managed languages sit all the time. You still have to uphold all the guarantees the compiler normally provides, just manually. That's why Miri exists.
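
A minimal sketch of that point (values are my own illustration): `unsafe` only unlocks a few extra operations, such as dereferencing raw pointers, and the programmer documents why the invariants still hold.

```rust
fn main() {
    let v = vec![10, 20, 30];
    let ptr = v.as_ptr();
    // SAFETY: index 1 is in bounds (len == 3) and `v` outlives the
    // read, so this dereference is valid. Tools like Miri can check
    // that this reasoning actually holds at runtime.
    let second = unsafe { *ptr.add(1) };
    assert_eq!(second, 20);
    // Everything else - borrow checking, type checking, bounds checks
    // on normal indexing - still applies inside the unsafe block.
}
```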

replies(2): >>41908829 #>>41908855 #
jonathanstrange ◴[] No.41908855[source]
Either it's safe or it's unsafe. If you use the keyword "unsafe" it should definitely not mean "safe" (and it doesn't, but you seem to suggest it).
replies(3): >>41915086 #>>41915710 #>>42008676 #
junon ◴[] No.41915086[source]
I think you're intentionally misreading everything people are saying to you.
replies(1): >>41916017 #
jonathanstrange ◴[] No.41916017[source]
It's really just you and another Rust fan, there's no need to further discuss this among the three of us. I think I've made it extensively clear - based on the above reasons - that I believe it's a horrible programming language and people using it now will regret it in 10 years or so.
replies(3): >>41917931 #>>41921899 #>>42008790 #
junon ◴[] No.41917931[source]
You're welcome to read the Rustonomicon to learn about the topic you're discussing. Having written C and C++ for almost 15 years, including extensive embedded work, I'm very secure in my decision to use Rust. But I'm capable of doing the research to learn about it, and of being somewhat involved in its development, mostly as an observer, to see both the direction it's moving in and the overall process and meticulousness with which it's developed, so I can make an informed decision.

It doesn't seem you're making an informed statement at all anywhere in this thread, choosing instead to be hung up on semantics rather than the facts plainly laid out for you.

If that makes me an "enthusiast" then so be it.

replies(1): >>42008455 #
jonathanstrange ◴[] No.42008455[source]
Well, if you come from C++ then Rust might look nice to you. But I come from languages like Common Lisp and Ada, and so Rust just looks like a horrible abomination to me, because that's what it is. It's also not surprising. A good programming language simply cannot be designed that fast.
replies(1): >>42008583 #
ArtixFox ◴[] No.42008583[source]
Common Lisp is an amalgamation of every Lisp they could find; they slammed it all in. Calling it well designed is funny, because every single CL developer openly accepts that it's a fucking weird language with a hell of a lot of warts that cannot be polished away.

Ada is fine, just verbose, kinda fun; no complaints about it except that it's kinda sad how weak its formal verification is. I prefer Frama-C over it. You can compare Ada and Rust, but Ada is horrible, sincerely horrible, at working with ownership. Frama-C can run laps around it, as you can verify EVEN arbitrary pointer arithmetic.

Calling Rust a horrible abomination is weird. As someone who dabbled in CL for a year, I love the fact that it has proc macros, and even though they're harder to use, I can make my own DSLs and buildtime compilers!!

That opens up a world of possibilities. We can actually have safer and stricter math libraries! Maybe put us back in the era of non-Electron applications?

The horrible part might be the syntax, but eh, it's a stupid thing to care about.

replies(3): >>42009058 #>>42009697 #>>42019160 #
1. kazinator ◴[] No.42019160[source]
CL standardization was before my Lisp time, but from the historical accounts I've read, many people in the Lisp world were unhappy because Common Lisp didn't have this or that from whatever they were working on, and because CL was the standard they would have to use it.

CL is fairly carefully designed with regard to compiling. This is why math functions are not generic, for instance. Redefining standard functions is undefined behavior, as is self-modifying code. It omits features that don't integrate well with conventional run-time and machine models, like continuations. It doesn't even require implementations to optimize tail calls.

I have no idea why ANSI CL has such a large page count. In my mind it's such a small language. I think it could have benefited from an editorial pass to get it down to 600-something pages. But that would have delayed it even longer.

Once the horse escapes the barn it's risky. When you rewrite technical text you can very easily change the meaning of something, or take a particular interpretation where multiple are possible and such.

replies(1): >>42021922 #
2. lispm ◴[] No.42021922[source]
> many people in the Lisp world were unhappy because Common Lisp didn't have this or that from whatever they were working on, and because CL was standard they would have to use it.

There were many unhappy people, but from very different camps. Some (for example, people in the small Standard Lisp camp) were unhappy because Common Lisp was not dynamic enough (it has features of static compilation, no fexprs, ...). Others were unhappy because it was too dynamic and difficult to compile to efficient code on small machines with stock CPUs. Some complained that it was too large and hard to fit onto some of the tiny machines of that time. Others complained that it was too small and lacked critical features from larger Lisp implementations (like stack groups, threads, a fully integrated object system; the first version had no useful error handling, GUI functionality, extensible streams, ...).

Many more users/implementors from other Lisp dialects were unhappy, because it was clear that their favorite Lisp dialect would slowly fade away - funding was going away, new users would avoid it, existing users would port their code away, ...

> This is why math functions are not generic for instance

The math functions are generic in the sense that they work for several types, but no machinery behind that was specified. They were not generic in the sense of CLOS generic functions (or similar), because in CLtL1 no such machinery existed in the language; there were only (non-extensible) generic numeric functions. CLOS later added machinery for generic functions, but there was no experience in generating optimized & fast code for it. The CLtL1 way to get fast numeric functions was to declare types and let the compiler generate type-specific (non-generic) code. ANSI CL left the language in that state: the generic numeric functions were not implemented via CLOS. Similarly, much of the language specification avoids further integration of CLOS and leaves it to implementations to decide how to implement I/O, condition handling, ...

> I have no idea why ANSI CL has such a large page count.

It was supposed to be a language specification for industrial users with detailed infos. There were standard templates how to specify a function, macro, ...

The Scheme reports, OTOH, were made to have the smallest page count possible, with two columns of text, leaving out much of the detail of a real language spec. Why? Because it was material for a teaching language and thus supposed to be readable by students learning the language in a semester course at university. Thus R5RS specified a teaching language, just barely, not a full application programming language (for example, it has zero error handling, and even basic things were only barely specified in their behavior and implementation).