177 points signa11 | 14 comments
1. mjevans ◴[] No.42160806[source]
This makes me further appreciate how golang's features tend to work entirely at compile time, which is also fast.

One of the other things that makes me worry about Rust is how similar its dependency trees look to npm projects, where there's a kitchen sink of third-party libraries (not the language's included library of code, and not the project's code) pulled in for seemingly small utilities.

replies(4): >>42160836 #>>42160837 #>>42160849 #>>42160939 #
2. danpalmer ◴[] No.42160836[source]
Go’s features do work only at compile time, but they are far more limited. Because of that, I experience more crashes in Go than in any other compiled language.
3. kstrauser ◴[] No.42160837[source]
Rust’s checks are also evaluated and enforced at compile time.
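(A minimal illustration of this point — my own sketch, not from the thread: the borrow checker rejects aliasing violations before the program ever runs, with no runtime bookkeeping.)

```rust
fn main() {
    let mut s = String::from("hello");
    let r = &s; // shared borrow, verified entirely at compile time
    // Uncommenting the next line is a compile error (E0502), not a crash:
    // let m = &mut s; // cannot borrow `s` as mutable while `r` is live
    assert_eq!(r.len(), 5);
    s.push_str(", world"); // fine: the shared borrow `r` ended above
    assert_eq!(s.len(), 12);
}
```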
4. hypeatei ◴[] No.42160849[source]
Dependencies are optional, and having a huge standard library also has its tradeoffs. If the standard library has a less than ideal API, it's stuck with that until a major version bump and you either:

1. End up with a third party package filling in the gaps, or

2. Get a second standard library API that users slowly migrate to

replies(2): >>42160871 #>>42162412 #
5. saghm ◴[] No.42160871[source]
It's also a lot easier to release a new version of a package to fix a bug than do a bugfix release for the entire language toolchain, which is what would be needed in order to update the standard library. With Rust releasing a new minor version every six weeks, I think minimizing the chances of additional releases needed in between them is probably a good thing.
6. maxbond ◴[] No.42160939[source]
I think it's the natural state of affairs for a "folk standard library" to emerge. I don't think pydantic or serde should be part of their standard libraries. But I will use them in most projects. In ten years, the "folk stdlib" will probably be a different set of packages (perhaps a superset, perhaps not). Don't push the river; if it's natural, manage it rather than fighting it.

Trying to anticipate all or even most use cases in the standard library is a fool's errand (unless we're talking about a DSL, of course). There are too many and they are too dynamic to be captured in the necessarily conservative release process of a language implementation. Languages should focus on being powerful and flexible enough to be adapted to a wide variety of use cases, and let the community of package maintainers handle the implementation. Think of this as a special case of the Unix philosophy; languages should do one thing very well, not a million things unevenly.

I bet most people here don't believe a command economy could ever work in a market for goods and services. Why should it work in a marketplace of ideas?

replies(1): >>42161950 #
7. riwsky ◴[] No.42161950[source]
And new cars tailored for each consumer’s use case will emerge in ten years, too—that doesn’t make it any less awful to live in areas that lack good public transportation.
replies(1): >>42162062 #
8. maxbond ◴[] No.42162062{3}[source]
I'm not sure I understand the metaphor? Let me know if I'm off base.

If the suggestion is that putting things in the standard library makes them better, I disagree. My experience with Python for instance is that a "batteries included" strategy results in some phenomenal packages and some borderline abandoned packages that are actively dangerous to use.

To riff on your metaphor, the federal government designs the arterial highways, but state, county, and city/town officials design the minutiae of the traffic system. If the federal government had to approve spending on replacing some street signs or plowing snow, we would have a terribly impoverished transportation system.

replies(1): >>42166259 #
9. tsimionescu ◴[] No.42162412[source]
Unfortunately, in a world with increasingly more sophisticated attackers looking at supply chain attacks, having a lot of dependencies, especially ones that update regularly, is a huge security risk. For a language like Rust, which aims to be both low level and used in secure environments, I would argue that the risks far outweigh the benefits.

We'll see how this plays out; Rust is still young and not yet used in any hugely important projects (or at least not in hugely important parts of those projects — e.g. some Linux drivers, not the core kernel; some bits of Firefox's rendering, not the JS engine). As it becomes more central, its value as an attack target will increase, and people will start infiltrating malicious code into small but widely used dependencies.

replies(2): >>42162816 #>>42166078 #
10. ◴[] No.42162816{3}[source]
11. hypeatei ◴[] No.42166078{3}[source]
The same could be said for all of the utilities used on Linux (that they're increasingly becoming huge targets), as seen with the recent XZ backdoor[0]. Limited funding and maintainer burnout in the open source model are an inherent risk to any project. Rust is not special here.

0: https://en.wikipedia.org/wiki/XZ_Utils_backdoor

replies(1): >>42173198 #
12. riwsky ◴[] No.42166259{4}[source]
The suggestion is that, while it is possible to overdo a stdlib, it is also possible to underdo it.

For two examples: plenty of languages leave auto-formatting and testing to the community, functionality which Rust is better for having standardized.
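(Both of those ship with the standard toolchain — `cargo fmt` runs rustfmt and `cargo test` runs the built-in harness. A minimal sketch of the standardized test attribute; the function names here are my own:)

```rust
// `cargo test` discovers any function annotated #[test] and runs it in a
// separate test binary; a normal build simply ignores it.
fn add(a: i32, b: i32) -> i32 {
    a + b
}

#[test]
fn add_works() {
    assert_eq!(add(2, 2), 4);
}

fn main() {
    // The same assertion macro works in ordinary code too.
    assert_eq!(add(40, 2), 42);
}
```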

replies(1): >>42167214 #
13. maxbond ◴[] No.42167214{5}[source]
I agree.
14. tsimionescu ◴[] No.42173198{4}[source]
The point was to compare a fat standard library model with a model of many dedicated third-party libraries. Sure, all big projects suffer from this, and I'm not trying to single Rust out. But the lesson for Rust, and for those other projects, is that the model is bad.

Having a large and good standard library, supplied by a single trustworthy foundation, with dedicated employees that check incoming PRs, is going to become more and more important in the following years.