This is one of the nicer ones.
It looks pretty conservative in its use of Rust's advanced features. The code looks pretty easy to read and follow. There's actually a decent amount of comments (for Rust code).
Not bad!
But then there's this Arc, Ref, Pinning and what not - how deep is that rabbit hole?
What tends to make Rust complex is advanced use of traits, generics, iterators, closures, wrapper types, async, error types… You start getting these massive semi-autogenerated nested types, the syntax sugar starts generating complex logic for you in the background that you cannot see but have to keep in mind.
It’s tempting to use the advanced type system to encode and enforce complex API semantics, using Rust almost like a formal verifier / theorem prover. But things can easily become overwhelming down that rabbit hole.
- Overall too complex
- Wrong philosophy: demanding the user to solve problems instead of solving problems for the user
- Trying to provide infinite backwards compatibility with crates, which leads to hidden bitrot
- Slow compilation times
- Claims to be "safe" but allows arbitrary unsafe code, and it's everywhere.
- Adding features to fix misfeatures (e.g. all that lifetime cruft; arc pointers) instead of fixing the underlying problem
- Hiding implementations with leaky abstractions (traits)
- Going to great lengths to avoid existing solutions so users re-invent them (e.g. OOP with inheritance; GC), or worse, invent more complex paradigms to work around the lack (e.g. some Rust GUI efforts; all those smart pointer types to work around the lack of GC)
- A horrendous, convoluted syntax that encourages bad programming style: lots of unwrap, and_then, etc. that make programs hard to read and audit.
- Rust's safe code is not safe: "Rust’s safety guarantees do not include a guarantee that destructors will always run. [...] Thus, allowing mem::forget from safe code does not fundamentally change Rust’s safety guarantees."
It already has complexity and cognitive demands similar to C++'s, and it's going to get worse. IMHO, that's also why it's popular. Programmers love shitty languages that allow them to show off. Boring is good.
Completely subjective. I've learned all there is to learn about Rust's syntax and most of its standard libraries, I think, and it's really not all that, in my personal opinion. There are certainly much more complex languages out there, even dynamic languages. I'd argue Typescript is more complex than Rust as a language.
> Wrong philosophy: demanding the user to solve problems instead of solving problems for the user
I have no idea what you mean by this. Do you mean you want more magic?
> Trying to provide infinite backwards compatibility with crates, which leads to hidden bitrot
Backwards compatibility reduces bitrot. Bitrot is when the ecosystem has moved on to a point of not supporting features used by stale code, thus making the code partially or completely unusable in newer environments as time progresses and the code doesn't update.
The Rust editions explicitly and definitively solve the bitrot problem, so I'm not sure what you're on about here.
> Slow compilation times
Sure, of course. That's really the biggest complaint most people have, though I've had C++ programs take just as long. Really depends on how the code is structured.
> Claims to be "safe" but allows arbitrary unsafe code, and it's everywhere.
Unsafe isn't a license to kill. It also doesn't allow "arbitrary" code. I suggest reading the Rustonomicon, the book about Rust undefined behavior. All `unsafe` code must adhere to the postcondition that no undefined behavior is present. It also doesn't remove borrow checking and the like. Without `unsafe` you couldn't really do anything that a systems language would need to do in certain cases - e.g. writing a kernel requires doing inherently unsafe things (e.g. switching out CR3) where no compiler on earth currently written will understand those semantics.
People seem to parrot this same "unsafe nullifies Rust's safety" without really understanding it. I suppose they could have renamed the `unsafe` keyword `code_that_does_stuff_unverifiable_by_the_compiler_so_must_still_adhere_to_well_formed_postrequisites_at_risk_of_invoking_undefined_behavior` but alas I think it'd be pretty annoying to write that so often.
It's pretty typical to abstract away `unsafe` code into a safe API, as most crates do.
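A minimal sketch of what that pattern looks like (the function name `middle` is hypothetical, just for illustration): the `unsafe` block lives behind a safe signature whose checks make the unchecked access sound, so callers can never trigger UB.

```rust
/// Returns the middle element without a bounds check. The safe public
/// signature makes the internal `unsafe` sound: the emptiness check above
/// the unsafe block guarantees the index is in range.
fn middle(xs: &[i32]) -> Option<i32> {
    if xs.is_empty() {
        return None;
    }
    // SAFETY: len > 0, so len / 2 is always a valid index.
    Some(unsafe { *xs.get_unchecked(xs.len() / 2) })
}

fn main() {
    assert_eq!(middle(&[1, 2, 3]), Some(2));
    assert_eq!(middle(&[]), None);
    println!("ok");
}
```

Callers of `middle` never see or write `unsafe`; the soundness argument is contained in one auditable place, which is the whole point of the pattern.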
> Adding features to fix misfeatures (e.g. all that lifetime cruft; arc pointers) instead of fixing the underlying problem
Lifetimes aren't "cruft", not sure what you mean. They can also be elided in a ton of cases.
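To make the elision point concrete, here's a minimal sketch (the function names are made up for illustration): the two signatures below are equivalent, and the elided one is what you actually write day to day.

```rust
// Fully explicit form: the lifetime 'a ties the returned slice to the input.
fn first_word_explicit<'a>(s: &'a str) -> &'a str {
    s.split_whitespace().next().unwrap_or("")
}

// Elided form: the compiler's elision rules fill in 'a automatically,
// so no lifetime annotations appear in the source at all.
fn first_word(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}

fn main() {
    assert_eq!(first_word("hello world"), "hello");
    assert_eq!(first_word_explicit("hello world"), "hello");
    println!("ok");
}
```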
An "arc pointer" isn't a thing; there's Arc, atomic reference counting (which has counterparts in C++, Objective-C, Swift, etc.). I'm not sure what the "underlying problem" is you're referring to. Rust takes the position that the standard library shouldn't automatically make e.g. Mutexes an atomically reference counted abstraction, but instead allow the user to determine if reference counting is even necessary (Rc<Mutex>) and if it should be atomic so as to be shareable across cores (Arc<Mutex>). This type composition is exactly why Rust's type system is so easy to work with, refactor and optimize.
> Hiding implementations with leaky abstractions (traits)
Sorry for being blunt but this is a word salad. Traits aren't leaky abstractions. In my personal experience they compose so, so much better and have better optimization strategies than more rigid OOP class hierarchies. So I'm not sure what you mean here.
> Going at great length to avoid existing solutions so users re-invent it (e.g. OOP with inheritance; GC), or worse, invent more complex paradigms to work around the lack (e.g. some Rust GUI efforts; all those smart pointer types to work around the lack of GC)
Trait theory has been around for ages. GC is not a silver bullet and I wish people would stop pretending it was. There are endless drawbacks to GC. "All those smart pointer types" -- which ones? You just seem to want GC. I'm not sure why you want GC. GC solves few problems and creates many more. It can't be used in a ton of environments, either.
> A horrendous, convoluted syntax that encourages bad programming style: lots of unwrap, and_then, etc. that make programs hard to read and audit.
This is completely subjective. And no, there's not a lot of `and_then`, I don't think you've read much Rust. Sorry if I'm sounding rude, but it's clear to me by this point in my response that you've played with the language only at a very surface level and have come to some pretty strong (and wrong) conclusions about it.
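For what it's worth, idiomatic Rust mostly reaches for the `?` operator rather than chains of `unwrap`/`and_then`. A minimal sketch (the function name `add_parsed` is made up for illustration):

```rust
use std::num::ParseIntError;

// Parse two numbers and add them; `?` propagates a parse error to the
// caller instead of panicking the way `unwrap` would.
fn add_parsed(a: &str, b: &str) -> Result<i64, ParseIntError> {
    let x: i64 = a.trim().parse()?; // early-returns the Err on failure
    let y: i64 = b.trim().parse()?;
    Ok(x + y)
}

fn main() {
    assert_eq!(add_parsed("2", "40"), Ok(42));
    assert!(add_parsed("2", "forty").is_err());
    println!("ok");
}
```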
If you don't like it, fine, but don't try to assert it as being a bad language and imply something about the people that use it or work on it.
> Rust's safe code is not safe: "Rust’s safety guarantees do not include a guarantee that destructors will always run. [...] Thus, allowing mem::forget from safe code does not fundamentally change Rust’s safety guarantees."
You misunderstand what it's saying there, but I'm honestly tired of rehashing things that are very easily researched, which you seem unwilling to do.
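For readers following along, what the quoted passage means is that leaking is not a memory-safety violation. A minimal sketch (the function name `leak_it` is made up):

```rust
use std::mem;

// mem::forget skips the destructor, so the Vec's allocation is leaked.
// But the handle is moved into forget(), so a use-after-free cannot even
// compile: the result is a resource leak, not undefined behavior.
fn leak_it() -> bool {
    let v = vec![1, 2, 3];
    mem::forget(v); // destructor never runs; memory is never freed
    // `v` is gone here; any attempt to touch it is a compile error.
    true
}

fn main() {
    assert!(leak_it());
    println!("still safe");
}
```

That's why `mem::forget` can be a safe function: Rust's safety guarantees are about preventing UB (use-after-free, data races, etc.), not about guaranteeing destructors run.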
As long as the Rust fans stick to their favorite language, everybody can be happy.
Sigh. This is not true. Not the first part, and especially not the last part. `Unsafe` doesn't allow arbitrary, unsafe code. It resets the compiler to a level where most manually managed languages are all the time. You still have to uphold all guarantees the compiler provides, just manually. That's why Miri exists.
"Unchecked" or "Unconfirmed" would've perhaps been better choices, but Rust considers all other manual memory and reference management unsafe, so the word stuck.
It doesn't seem you're making an informed statement at all anywhere in this thread, choosing instead to be hung up on semantics rather than the facts plainly laid out for you.
If that makes me an "enthusiast" then so be it.
The only times I've used unsafe code is for FFI and very rarely on bare metal machines.
A typical Rust programmer will never use unsafe. They will use safe abstractions provided by the standard library. There is no need for direct use of unsafe in application code, and only very rarely in library code.
In fact, [1] reports that most unsafe calls in libraries are FFI calls into existing C/C++ code or system calls.
[1]: https://foundation.rust-lang.org/news/unsafe-rust-in-the-wil...
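A sketch of the typical FFI shape that report describes: declaring a foreign C function is trusted rather than checked, so calling it requires `unsafe` (the wrapper name `safe_abs` is hypothetical, just for illustration):

```rust
// The declaration tells rustc the signature; the compiler cannot verify
// anything about the C side, which is exactly why the call site is unsafe.
extern "C" {
    fn abs(input: i32) -> i32; // provided by the C standard library
}

// Safe wrapper: documents and enforces the one precondition.
fn safe_abs(x: i32) -> i32 {
    // SAFETY: C's abs is well-defined for every i32 except i32::MIN,
    // where the result would overflow.
    assert!(x != i32::MIN);
    unsafe { abs(x) }
}

fn main() {
    assert_eq!(safe_abs(-3), 3);
    assert_eq!(safe_abs(7), 7);
    println!("ok");
}
```

This is the pattern the thread keeps circling: the `unsafe` block is confined to one audited call, and everything downstream uses the safe wrapper.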
I love C and I have used it for more than a decade, but I wouldn‘t choose it again. The most important thing I save with Rust is time and also my sanity. The very fact that I can trust my code if it compiles and that I don’t have to spend hours in GDB anymore makes it worth my while.
That's a lot of unsafe code for an allegedly safe language. Of course, most of it calls into system libraries. I never claimed or insinuated anything to the contrary (except perhaps in your imagination). But if you compare that to typical Ada code, the latter is much safer. Ada programmers try to do more things in Ada, probably because many of them need to write high integrity software.
Anyway, Rust offers nothing of value for me. It's overengineered and the languages I use are already entirely memory safe. Languages are mere tools, if it suits you well, continue using your Rust. No problem for me. By the way, I welcome when people re-write C++ code in Rust. Rust is certainly better than that, but that's a low-hanging fruit!
Ada is fine, just verbose, kinda fun; no complaints about it except that it's kind of sad how weak its formal verification is. I prefer Frama-C over it. You can compare Ada and Rust, but Ada is horrible, sincerely horrible, at working with ownership. Frama-C can run laps around it, as you can verify even arbitrary pointer arithmetic.
Calling Rust a horrible abomination is weird. As someone who dabbled in CL for a year, I love the fact that it has proc macros, and even though they're harder to use, I can make my own DSLs and build-time compilers!!
That opens up a world of possibilities. We can actually have safer and stricter math libraries! Maybe put us back in the era of non-Electron applications?
The horrible part might be the syntax, but eh, it's a stupid thing to care about.
It could not verify dynamic allocations; that's why it has such a huge toolset for working with static allocations.
Frama-C allows you to program in a safe subset of the unsafe language called C.
And these languages are the backbone of everything where lives are at risk. You can have a language that allows both unsafe and safe.
Safety is not binary, and our trains run on C/C++ [BOTH UNSAFE LANGUAGES].
If Ada were used in the domains where Rust is used, like desktop applications, servers, and high-perf stuff, it would also do unsafe things you could never verify using SPARK.
But instead it is used on microcontrollers with runtimes provided by AdaCore and other vendors. Can you fully know whether those pieces of code are 100% verified and safe? The free ones are not; at least the free x86 one isn't.
How ridiculous. The language you use is not memory safe, btw. Unchecked_Deallocation can be easily used without any pragmas, IIRC. You need to enable SPARK_Mode, which will restrict you to an even smaller subset! You cannot even safely write a doubly linked list in it! [you can, with great pain, in Rust] [with less pain in Frama-C] [never tried ATS]
Not really. It's mostly a modernized version of Zetalisp. In many cases simpler as that, with some added new stuff (like type declarations).
Well, since Rust is explicitly a system programming language, you would expect it to call into underlying systems more often, hence the use of unsafe.
The difference is this: like other systems languages, Rust lives close to the metal. The "unsafe" keyword is merely a marker that a system call might happen here, which might be inherently unsafe (think of C's localization functions, which are not thread-safe).
That's it. You can call Ada safer, but it still has to adhere to the underlying complexity of the system it runs on, and upon interaction with that system via FFI calls it will be just as unsafe, just without a marker.
The low-hanging fruit is exactly what Rust is made for. It's explicitly overengineered for that one use case, where GC languages can not be used for whatever reasons. It lives in the twilight zone between a GC and calling alloc/free yourself.
I disagree with people rewriting everything in Rust that could be simpler and better done with Python/Csharp/Go/etc. But if you need to work with manual memory management or concurrency with shared references, Rust is certainly your best bet.
CL is fairly carefully designed with regards to compiling. This is why math functions are not generic, for instance. Redefining standard functions is undefined behavior, as is self-modifying code. It omits features that don't integrate well with conventional run time and machine models, like continuations. It doesn't even require implementations to optimize tail calls.
I have no idea why ANSI CL has such a large page count. In my mind it's such a small language. I think it could have benefited from an editorial pass to get it down to 600-something pages. But that would have delayed it even longer.
Once the horse escapes the barn it's risky. When you rewrite technical text you can very easily change the meaning of something, or take a particular interpretation where multiple are possible and such.
There were many unhappy, but from very different camps. Some were unhappy (for example people in the small Standard Lisp camp) because Common Lisp was not dynamic enough (it has features of static compilation, no fexprs, ...). Others were unhappy because it was too dynamic and difficult to compile to efficient code on small machines with stock CPUs. Some complained that it was too large and hard to fit onto some of the tiny machines of that time. Others complained that it was too small and lacked critical features from larger Lisp implementations (like stack groups, threads, a fully integrated object system, the first version had no useful error handling, gui functionality, extensible streams, ...).
Many more users/implementors from other Lisp dialects were unhappy, because it was clear that their favorite Lisp dialect would slowly fade away - funding was going away, new users would avoid it, existing users would port their code away, ...
> This is why math functions are not generic for instance
The math functions are generic (they work for several types), but no machinery behind that was specified. They were not generic in the sense of CLOS generic functions (or similar), because in CLtL1 there was no such machinery in the language; still, there are (non-extensible) generic numeric functions. CLOS later added machinery for generic functions, but there was no experience in generating optimized & fast code for it. The CLtL1 way to get fast numeric code was to declare types and let the compiler generate type-specific (non-generic) code. ANSI CL left the language in that state: the generic numeric functions were not implemented via CLOS. Similarly, much of the language specification avoids further integration of CLOS and leaves it to implementations to decide how to implement things: I/O, condition handling, ...
> I have no idea why ANSI CL has such a large page count.
It was supposed to be a language specification for industrial users with detailed infos. There were standard templates how to specify a function, macro, ...
The Scheme reports, OTOH, were made to have the smallest page count possible, with two columns of text, leaving out much of the detail of a real language spec. Why? Because it was material for a teaching language and thus was supposed to be read by students learning the language in a semester course at a university. Thus R5RS specified a teaching language, just barely, not a full application programming language (for example, it has zero error handling, and even basic behavior was often only barely specified).