517 points | bkolobara | 1 comment
BinaryIgor No.45042483
Don't most of the benefits just come down to using a statically typed, compiled language, be it Java, Go, or C++? TypeScript is trickier because it compiles to JavaScript and inherits some of its issues, but it's still fine.

I know that Rust provides additional compile-time checks because of its stricter type system, but that doesn't come for free: it's harder to learn and arguably harder to read.

replies(17): >>45042692 #>>45043045 #>>45043105 #>>45043148 #>>45043241 #>>45043589 #>>45044559 #>>45045202 #>>45045331 #>>45046496 #>>45047159 #>>45047203 #>>45047415 #>>45048640 #>>45048825 #>>45049254 #>>45050991 #
arwhatever No.45043148
I suspect that if you are lumping all statically typed languages into a single bucket without distinguishing among them, you may not have fully internalized the implications of union types (aka Rust enums, aka sum types) combined with exhaustive pattern matching.

I like to call it getting "union-pilled", and once you're familiar with it, it's really hard to accept statically typed languages that lack it.
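To make the point concrete, here is a minimal Rust sketch of a sum type with exhaustive matching. The `Shape` type and `area` function are hypothetical, invented for illustration; the payoff is in the comment on the `match`.

```rust
// A hypothetical sum type: each variant carries its own payload.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
    Point,
}

fn area(s: &Shape) -> f64 {
    // The `match` must cover every variant. Adding a `Triangle`
    // variant later turns every non-exhaustive `match` in the
    // codebase into a compile error, not a runtime surprise.
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
        Shape::Point => 0.0,
    }
}

fn main() {
    let shapes = [Shape::Rect { w: 3.0, h: 4.0 }, Shape::Point];
    let total: f64 = shapes.iter().map(area).sum();
    println!("{total}"); // prints 12
}
```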

replies(3): >>45043455 #>>45043677 #>>45044134 #
ModernMech No.45043677
Enums + match expressions + tagged unions are the secret sauce of Rust.
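The standard library itself runs on this combination: `Option<T>` and `Result<T, E>` are ordinary enums, and the `?` operator is sugar over matching on them. A small sketch (the `parse_pair` function is invented for illustration):

```rust
// Option and Result are plain library enums, roughly:
//   enum Option<T> { None, Some(T) }
//   enum Result<T, E> { Ok(T), Err(E) }
fn parse_pair(s: &str) -> Result<(i32, i32), std::num::ParseIntError> {
    let mut it = s.splitn(2, ',');
    // `?` propagates the Err variant; the Ok payload flows through.
    let a: i32 = it.next().unwrap_or("").trim().parse()?;
    let b: i32 = it.next().unwrap_or("").trim().parse()?;
    Ok((a, b))
}

fn main() {
    // The caller is forced to acknowledge both outcomes.
    match parse_pair("3, 4") {
        Ok((a, b)) => println!("sum = {}", a + b),
        Err(e) => println!("bad input: {e}"),
    }
}
```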
replies(2): >>45047622 #>>45050213 #
pjmlp No.45050213
Like this code snippet?

    (* Expressions *)

    type Exp = 
          UnMinus of Exp
        | Plus of Exp * Exp
        | Minus of Exp * Exp
        | Times of Exp * Exp
        | Divides of Exp * Exp
        | Power of Exp * Exp
        | Real of float 
        | Var of string
        | FunCall of string * Exp
        | Fix of string * Exp
        ;;

    (* Tokens and the error exception, inferred from their use below.
       The helpers split, get_id_str, get_float_str and SyntErr are
       assumed to be defined elsewhere. *)
    type Token =
          LParTk | RParTk
        | PlusTk | MinusTk | TimesTk
        | DividesTk | PowerTk | AssignTk
        | Keyword_or_Id of string
        | RealTk of float
        ;;

    exception SyntaxError of string;;

    let rec tokenizer s =
        let (ch, chs) = split s in
        match ch with
              ' ' ->    tokenizer chs
            | '(' ->    LParTk:: (tokenizer chs)
            | ')' ->    RParTk:: (tokenizer chs)
            | '+' ->    PlusTk::(tokenizer chs)
            | '-' ->    MinusTk::(tokenizer chs)
            | '*' ->    TimesTk::(tokenizer chs)
            | '^' ->    PowerTk::(tokenizer chs)
            | '/' ->    DividesTk::(tokenizer chs)
            | '=' ->    AssignTk::(tokenizer chs)
            | ch when (ch >= 'A' && ch <= 'Z') ||
                      (ch >= 'a' && ch <= 'z') ->
                        let (id_str, chs) = get_id_str s
                        in (Keyword_or_Id id_str)::(tokenizer chs) 
            | ch when (ch >= '0' && ch <= '9') ->
                        let (fl_str, chs) = get_float_str s
                        in (RealTk (float_of_string fl_str))::(tokenizer chs)
            | '$' ->    if chs = "" then [] else raise (SyntaxError (""))
            | _ ->      raise (SyntaxError (SyntErr ()))
        ;;
Hint: this isn't Rust.
replies(1): >>45051601 #
lmm No.45051601
Yes, the secret of Rust is that it offers both a) some important but slightly subtle language features from the late '70s that were sadly absent from the Algol lineage and are therefore missing from the popular languages descended from it, and b) a couple of party tricks, in particular the ability to outperform C on silly microbenchmarks. b) is what leads people to adopt it, and a) is what makes it non-awful to program in. Yes, it's a damning indictment of programming culture that people did not adopt pre-Rust ML-family languages, but it could be worse: they could be not adopting Rust either.
replies(4): >>45051692 #>>45052023 #>>45052039 #>>45053227 #
gf000 No.45053227
I mean, "fearless concurrency" is a hyped-up phrase and definitely exaggerated, but compared to the C world, where you are already blowing off your leg in single-threaded code, let alone with multiple threads, Rust is an insanely huge win. And it shows in e.g. the small Unix tool rewrites.

Sure, rewrites most often come out better simply by virtue of being rewrites, but the kind of parallel processing these tools do may not be feasible in C.
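A minimal sketch of the kind of parallelism meant here, using `std::thread::scope` (stable since Rust 1.63); the `parallel_sum` function is invented for illustration:

```rust
use std::thread;

// Split a slice across two threads. The borrow checker proves the
// halves don't alias, and the scope guarantees both threads join
// before `data` is used again -- exactly the class of data race
// that C would compile without complaint.
fn parallel_sum(data: &[u64]) -> u64 {
    let (lo, hi) = data.split_at(data.len() / 2);
    thread::scope(|s| {
        let a = s.spawn(|| lo.iter().sum::<u64>());
        let b = s.spawn(|| hi.iter().sum::<u64>());
        a.join().unwrap() + b.join().unwrap()
    })
}

fn main() {
    let v: Vec<u64> = (1..=100).collect();
    println!("{}", parallel_sum(&v)); // prints 5050
}
```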