
517 points bkolobara | 2 comments
BinaryIgor No.45042483
Don't most of the benefits just come down to using a statically typed and thus compiled language, be it Java, Go, or C++? TypeScript is trickier because it compiles to JavaScript and inherits some of its issues, but it's still fine.

I know that Rust provides some additional compile-time checks because of its stricter type system, but that doesn't come for free: it's harder to learn and arguably harder to read.

replies(17): >>45042692 #>>45043045 #>>45043105 #>>45043148 #>>45043241 #>>45043589 #>>45044559 #>>45045202 #>>45045331 #>>45046496 #>>45047159 #>>45047203 #>>45047415 #>>45048640 #>>45048825 #>>45049254 #>>45050991 #
arwhatever No.45043148
I suspect that if you are lumping all statically typed languages into a single bucket without making particular distinctions among them, you might not have fully internalized the implications of union types (aka Rust enums, aka sum types) combined with exhaustive pattern matching.

I like to call it getting "union-pilled", and it's really hard to accept otherwise statically typed languages once you become familiar with it.
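
A minimal sketch of what that buys you, in Rust since that's the topic (the Shape example is mine, purely for illustration):

    // A sum type: a value is exactly one of these variants, each carrying its own data.
    enum Shape {
        Circle { radius: f64 },
        Rect { w: f64, h: f64 },
        Triangle { a: f64, b: f64, c: f64 },
    }

    // `match` must cover every variant; add a new variant later and this
    // function stops compiling until the new case is handled.
    fn area(s: &Shape) -> f64 {
        match s {
            Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
            Shape::Rect { w, h } => w * h,
            Shape::Triangle { a, b, c } => {
                let p = (a + b + c) / 2.0; // Heron's formula
                (p * (p - a) * (p - b) * (p - c)).sqrt()
            }
        }
    }

    fn main() {
        println!("{}", area(&Shape::Rect { w: 3.0, h: 4.0 }));
    }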

replies(3): >>45043455 #>>45043677 #>>45044134 #
ModernMech No.45043677
enums + match expressions + tagged unions are the secret sauce of Rust.
replies(2): >>45047622 #>>45050213 #
pjmlp No.45050213
Like this code snippet?

    (* Expressions *)

    type Exp = 
          UnMinus of Exp
        | Plus of Exp * Exp
        | Minus of Exp * Exp
        | Times of Exp * Exp
        | Divides of Exp * Exp
        | Power of Exp * Exp
        | Real of float 
        | Var of string
        | FunCall of string * Exp
        | Fix of string * Exp
        ;;


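    (* Tokenizer *)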
    let rec tokenizer s =
        let (ch, chs) = split s in
        match ch with
              ' ' ->    tokenizer chs
            | '(' ->    LParTk:: (tokenizer chs)
            | ')' ->    RParTk:: (tokenizer chs)
            | '+' ->    PlusTk::(tokenizer chs)
            | '-' ->    MinusTk::(tokenizer chs)
            | '*' ->    TimesTk::(tokenizer chs)
            | '^' ->    PowerTk::(tokenizer chs)
            | '/' ->    DividesTk::(tokenizer chs)
            | '=' ->    AssignTk::(tokenizer chs)
            | ch when (ch >= 'A' && ch <= 'Z') ||
                      (ch >= 'a' && ch <= 'z') ->
                        let (id_str, chs) = get_id_str s
                        in (Keyword_or_Id id_str)::(tokenizer chs) 
            | ch when (ch >= '0' && ch <= '9') ->
                        let (fl_str, chs) = get_float_str s
                        in (RealTk (float_of_string fl_str))::(tokenizer chs)
            | '$' ->    if chs = "" then [] else raise (SyntaxError (""))
            | _ ->      raise (SyntaxError (SyntErr ()))
        ;;
Hint: this isn't Rust.
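
A rough Rust rendering of the same Exp type (my sketch, just to show the mapping) looks almost identical:

    // Rough Rust equivalent of the Exp type above (illustrative sketch).
    // Recursive variants need indirection (Box), since Rust types must be sized.
    enum Exp {
        UnMinus(Box<Exp>),
        Plus(Box<Exp>, Box<Exp>),
        Minus(Box<Exp>, Box<Exp>),
        Times(Box<Exp>, Box<Exp>),
        Divides(Box<Exp>, Box<Exp>),
        Power(Box<Exp>, Box<Exp>),
        Real(f64),
        Var(String),
        FunCall(String, Box<Exp>),
        Fix(String, Box<Exp>),
    }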
replies(1): >>45051601 #
lmm No.45051601
Yes, the secret of Rust is that it offers both a) some important but slightly subtle language features from the late '70s that were sadly not present in Algol 60 and are therefore missing from popular lineages, and b) a couple of party tricks, in particular the ability to outperform C on silly microbenchmarks. b) is what leads people to adopt it, and a) is what makes it non-awful to program in. Yes, it's a damning indictment of programming culture that people did not adopt pre-Rust ML-family languages, but it could be worse: they could be not adopting Rust either.
replies(4): >>45051692 #>>45052023 #>>45052039 #>>45053227 #
pjmlp No.45052023
C only got to its current performance level when optimizing compilers that take advantage of UB became a common thing; during the 8- and 16-bit home computer days, C compilers were hardly any better than writing Assembly by hand, which is why books like Zen of Assembly Language left such a mark.

So if we are speaking of optimizing compilers, there is MLton, which delivers that while ensuring the application doesn't blow up in strange ways.

The problem is not people getting to learn these features from Rust (I'm glad that they do); the issue is that they think Rust invented them.

replies(1): >>45052073 #
ModernMech No.45052073
> the issue is that they think Rust invented them

Sorry, my post wasn't meant to imply that Rust invented those things. My point was that Rust's success as a language is due to those features.

Of course there's more to it, but what Rust really does right is blend functional and imperative styles. The "match" statement is a great way to bring functional concepts to imperative programmers, because it "feels" like a familiar switch statement, but with superpowers. So it's a good "gateway drug", if you will, because the benefit is quickly realized ("Oh, it caught that edge case for me before it became a problem at runtime; that would have been a headache...").
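
A tiny sketch of that benefit (my own example): add a variant to an enum and every match that doesn't handle it stops compiling:

    // Suppose a value used to be either a number or some text...
    enum Value {
        Number(f64),
        Text(String),
        // ...and later a new case gets added:
        Missing,
    }

    fn render(v: &Value) -> String {
        // Without the `Missing` arm, this match fails to compile with a
        // "non-exhaustive patterns" error (E0004) instead of failing at runtime.
        match v {
            Value::Number(n) => n.to_string(),
            Value::Text(s) => s.clone(),
            Value::Missing => "<missing>".to_string(),
        }
    }

    fn main() {
        println!("{}", render(&Value::Missing));
    }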

From there, you can learn how to use match as an expression, and then you start to wonder why "if" isn't an expression in every language. After that, you're hooked.
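
For instance (a small sketch of my own):

    fn main() {
        let n: i32 = 7;

        // `if` is an expression: both branches produce a value of the same type.
        let parity = if n % 2 == 0 { "even" } else { "odd" };

        // `match` is an expression too, so the result binds directly to a variable.
        let size = match n {
            0 => "zero",
            1..=9 => "small",
            _ => "large",
        };

        println!("{n} is {parity} and {size}");
    }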