
517 points bkolobara | 1 comment
BinaryIgor ◴[] No.45042483[source]
Don't most of the benefits just come down to using a statically typed, and thus compiled, language? Be it Java, Go, or C++. TypeScript is trickier, because it compiles to JavaScript and inherits some of its issues, but it's still fine.

I know that Rust provides some additional compile-time checks because of its stricter type system, but that doesn't come for free: it's harder to learn and, arguably, harder to read.

replies(17): >>45042692 #>>45043045 #>>45043105 #>>45043148 #>>45043241 #>>45043589 #>>45044559 #>>45045202 #>>45045331 #>>45046496 #>>45047159 #>>45047203 #>>45047415 #>>45048640 #>>45048825 #>>45049254 #>>45050991 #
arwhatever ◴[] No.45043148[source]
I suspect that if you are lumping all statically-typed languages into a single bucket without making any particular distinction among them, you may not have fully internalized the implications of union-typed (aka Rust enum, aka sum-typed) data structures combined with exhaustive pattern matching.

I like to call it getting "union-pilled", and it's really hard to accept otherwise statically-typed languages once you become familiar.

replies(3): >>45043455 #>>45043677 #>>45044134 #
ModernMech ◴[] No.45043677[source]
Enums (tagged unions) + match expressions are the secret sauce of Rust.
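For readers unfamiliar with the combination, a minimal sketch of what it buys you (the `Shape` type and `area` function here are illustrative, not from the thread):

```rust
// A tagged union: each variant can carry differently-shaped data.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
    Point,
}

// `match` must cover every variant; add a new variant to `Shape`
// and every non-exhaustive match becomes a compile error.
fn area(s: &Shape) -> f64 {
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
        Shape::Point => 0.0,
    }
}

fn main() {
    println!("{}", area(&Shape::Rect { w: 2.0, h: 3.0 })); // prints 6
}
```

The exhaustiveness check is the point: the compiler, not a reviewer, finds every place that forgot to handle a new case.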
replies(2): >>45047622 #>>45050213 #
pjmlp ◴[] No.45050213[source]
Like this code snippet?

    (* Expressions *)

    type Exp = 
          UnMinus of Exp
        | Plus of Exp * Exp
        | Minus of Exp * Exp
        | Times of Exp * Exp
        | Divides of Exp * Exp
        | Power of Exp * Exp
        | Real of float 
        | Var of string
        | FunCall of string * Exp
        | Fix of string * Exp
        ;;


    let rec tokenizer s =
        let (ch, chs) = split s in
        match ch with
              ' ' ->    tokenizer chs
            | '(' ->    LParTk:: (tokenizer chs)
            | ')' ->    RParTk:: (tokenizer chs)
            | '+' ->    PlusTk::(tokenizer chs)
            | '-' ->    MinusTk::(tokenizer chs)
            | '*' ->    TimesTk::(tokenizer chs)
            | '^' ->    PowerTk::(tokenizer chs)
            | '/' ->    DividesTk::(tokenizer chs)
            | '=' ->    AssignTk::(tokenizer chs)
            | ch when (ch >= 'A' && ch <= 'Z') ||
                      (ch >= 'a' && ch <= 'z') ->
                        let (id_str, chs) = get_id_str s
                        in (Keyword_or_Id id_str)::(tokenizer chs) 
            | ch when (ch >= '0' && ch <= '9') ->
                        let (fl_str, chs) = get_float_str s
                        in (RealTk (float (fl_str)))::(tokenizer chs)
            | '$' ->    if chs = "" then [] else raise (SyntaxError (""))
            | _ ->      raise (SyntaxError (SyntErr ()))
        ;;
Hint: this isn't Rust.
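For comparison, the `Exp` type above transliterates almost mechanically into Rust; the variant names are taken from the snippet, while the small `eval` function is a hypothetical addition to show the matching side (`Box` is needed for the recursive positions):

```rust
// The ML `Exp` sum type above, written as a Rust enum.
enum Exp {
    UnMinus(Box<Exp>),
    Plus(Box<Exp>, Box<Exp>),
    Minus(Box<Exp>, Box<Exp>),
    Times(Box<Exp>, Box<Exp>),
    Divides(Box<Exp>, Box<Exp>),
    Power(Box<Exp>, Box<Exp>),
    Real(f64),
    Var(String),
    FunCall(String, Box<Exp>),
    Fix(String, Box<Exp>),
}

// An illustrative evaluator over the numeric variants, matching exhaustively.
fn eval(e: &Exp) -> Option<f64> {
    match e {
        Exp::UnMinus(a) => Some(-eval(a)?),
        Exp::Plus(a, b) => Some(eval(a)? + eval(b)?),
        Exp::Minus(a, b) => Some(eval(a)? - eval(b)?),
        Exp::Times(a, b) => Some(eval(a)? * eval(b)?),
        Exp::Divides(a, b) => Some(eval(a)? / eval(b)?),
        Exp::Power(a, b) => Some(eval(a)?.powf(eval(b)?)),
        Exp::Real(x) => Some(*x),
        // Variables and functions would need an environment; elided here.
        Exp::Var(_) | Exp::FunCall(_, _) | Exp::Fix(_, _) => None,
    }
}

fn main() {
    let e = Exp::Plus(Box::new(Exp::Real(1.0)), Box::new(Exp::Real(2.0)));
    assert_eq!(eval(&e), Some(3.0));
}
```

The one-to-one correspondence is the point being made: this part of Rust is the ML feature set, decades on.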
replies(1): >>45051601 #
lmm ◴[] No.45051601[source]
Yes, the secret of Rust is that it offers both (a) some important but slightly subtle language features from the late '70s that were sadly not present in Algol '52 and are therefore missing from popular lineages, and (b) a couple of party tricks, in particular the ability to outperform C on silly microbenchmarks. (b) is what leads people to adopt it, and (a) is what makes it non-awful to program in. Yes, it's a damning indictment of programming culture that people did not adopt pre-Rust ML-family languages, but it could be worse: they could be not adopting Rust either.
replies(4): >>45051692 #>>45052023 #>>45052039 #>>45053227 #
ModernMech ◴[] No.45052039{4}[source]
> Yes, it's a damning indictment of programming culture that people did not adopt pre-Rust ML-family languages, but it could be worse, they could be not adopting Rust either.

I'll say that for a long time I've been quite pleased with the general direction of the industry in terms of language design and trends around things like memory safety. For a good many years we've seen functional features being integrated into popular imperative languages, probably since map/reduce became a thing thanks to Google. So I'll give us all credit for coming around eventually.

I'm more dismayed by the recent AI trend of asking an AI to write Python code and then just going with whatever it outputs. I can't say that seems like a step forward.