
1369 points by universesquid
stathibus
As an outsider to the npm ecosystem, I find this list of packages astonishing. Why do JS people import someone else's npm module for every little trivial thing?
nine_k
Having a module for every little trivial thing allows you to bring only the modules you actually use into the JS bundle you serve to your client. If there's a problem in one trivial-thing function, other unrelated trivial things can still be used, because they are not bundled in the same package.

A comprehensive library might offer a neater DX, but you'd have to ship library code you don't use. (Yes, tree-shaking exists, but it's still tricky and not widespread.)
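
A minimal sketch of the trade-off, with illustrative package choices (any one-function npm package versus any bundled ESM library would make the same point):

    // (a) Single-purpose package: the client bundle gains only these few
    // bytes, and a bug in some other utility can't take this one down.
    import debounce from 'just-debounce-it';

    // (b) Comprehensive library: a CommonJS build drags in everything;
    // only an ESM build plus a tree-shaking bundler can strip the
    // unused exports from the bundle.
    import { debounce as lodashDebounce } from 'lodash-es';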

palmfacehn
Things like this are good illustrations of why many feel the entire JS ecosystem is broken. Even when a language includes a standard lib, you wouldn't expect a bigger binary because of it. The JS solution is often more duct tape on top of a bad design; in this case tree shaking, which may or may not work as intended.
crabmusket
I agree with you, but I'd ask: what other language needs to distribute code to an unknown runtime environment over the network?

If it's the browser's job to implement the standard library, how do you ensure that all browsers do this in a compliant and timely fashion? And if not, how do you optimise code-on-demand delivery over the internet?

I don't deny there are (or could be) solutions to this. But historically JS devs have wrestled with these issues as best they could, and that has shaped what we see today.

palmfacehn
A batteries-included standard lib shipped with the runtime is one approach. You would know upfront which version the browser implements; from there you could dynamically load a polyfill or prompt the user to upgrade.
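
A sketch of what that could look like, using dynamic import(); the API being feature-tested and the polyfill URL are stand-ins:

    // Feature-detect against the runtime's stdlib, then load a polyfill
    // only on runtimes that actually lack the API.
    if (!('structuredClone' in globalThis)) {
      await import('https://cdn.example.com/structured-clone-polyfill.js');
    }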

Alternatively, because there are now (often ridiculous) build systems and compilation steps, we might expect behavior similar to other compiled binaries. Instead we get the worst of both worlds.

Yes, JS as it stands is some kind of standard, but at a certain point we might ask, "Why not throw out the bad designs and start from scratch?" If it takes ten years to sunset the garbage and offer a compatibility shim, that's fine. All the more reason to start now.

A purely compiled WASM approach with first-class DOM access, or a clean scripting language with a versioned standard lib: either option would be better than the status quo.

crabmusket
> A purely compiled WASM approach

I would love to see if a browser could, like... "disaggregate" itself into WASM modules. E.g. why couldn't new JS standards be implemented in WASM and hot-loaded into the browser itself from a trusted distributor when necessary?

Missing CSS Level 5 selectors? Browser goes and grabs the reference implementation from the W3C.

Low-level implementations could replace these for the browsers with the most demanding performance goals, but "everyone else" could benefit from at least remaining spec compatible?

(I guess this raises the question of "what's the API that these WASM modules all have to conform to", but I dunno, I find it an interesting thought.)
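
Half of this already exists: WebAssembly.instantiateStreaming can compile a module straight off the network. The other half, the registration hook and the distribution URL below, is invented here purely to sketch the idea:

    // Fetch and compile a reference implementation shipped as WASM.
    const { instance } = await WebAssembly.instantiateStreaming(
      fetch('https://w3.example.org/ref-impl/selectors-level-5.wasm')
    );
    // Hypothetical browser hook for plugging in a selector engine.
    document.__registerSelectorEngine?.(instance.exports.matches);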

palmfacehn
Yes, that would be a compelling change. Like a language-agnostic HotJava platform. We're overdue for a more coherent approach, from the bottom up.