A comprehensive library might offer a nicer DX, but you'd have to ship library code you don't use. (Yes, tree-shaking exists, but it's still tricky and not widespread.)
On the server side, of course, you can do whatever you like; see Node / Deno / Bun. But bundle size plays a minor role there.
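For example (a minimal sketch; "utils-lib" is a made-up package name), tree-shaking only really works when a library ships proper ESM, uses named exports, and is declared side-effect free:

```ts
// Named import from a hypothetical ES-module package "utils-lib".
// A bundler can statically see that only `debounce` is used and drop the
// rest, but only if the package ships real ESM and is marked
// "sideEffects": false in its package.json; many popular libraries aren't.
import { debounce } from "utils-lib";

const onResize = debounce(() => console.log("resized"), 200);
window.addEventListener("resize", onResize);
```

With a CommonJS build, bundlers generally can't prove what's unused and end up keeping the whole thing.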
If it's the browser's job to implement the standard library, how do you ensure that all browsers do this in a compliant and timely fashion? And if it isn't, how do you optimise delivering that code on demand over the internet?
I don't deny there are/could be solutions to this. But historically JS devs have wrestled with these issues as best they could, and that has shaped what we see today.
What is this unknown runtime environment? Even during the browser wars, there was just a handful of browsers, and IE was the only major outlier. Checking for the existence of features and polyfilling is not that complicated.
And most of the time, the browser is already downloading lots of images and other resources. Arguing about bundle size is very hypocritical coming from developers who won't blink at adding 17 analytics modules.
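For what it's worth, basic feature detection plus a conditional polyfill is only a few lines (a rough sketch; the polyfill URL is made up):

```ts
// Load a polyfill only when the browser lacks the feature, so modern
// browsers pay nothing extra. The polyfill URL is hypothetical.
async function ensureIntersectionObserver(): Promise<void> {
  if ("IntersectionObserver" in window) return; // native support, nothing to do
  await import("https://example.com/polyfills/intersection-observer.js");
}

ensureIntersectionObserver().then(() => {
  const observer = new IntersectionObserver((entries) => {
    for (const entry of entries) console.log(entry.target, entry.isIntersecting);
  });
  observer.observe(document.body);
});
```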
Judging by what we see in the world, most developers don't agree with you. And neither do I. A handful of browsers, multiplied by many versions per browser in the wild (before evergreen browsers like Chrome became widespread, but even today with e.g. Safari, or enterprise users), multiplied by a sprawling API surface (dare I say it, a standard library) is not trivial. And that's not even considering browser bugs and regressions.
> very hypocritical of developers that won't blink
Not a great argument, as developers don't necessarily get to choose how to add analytics, and plenty of them try to push back against doing so.
Also, byte for byte, the cost of parsing and JIT-compiling JS code is very different from the cost of decoding an image.
From my POV, most developers just test on the most popular browser (and the latest version of it) without checking whether an API is standard or reading its changelog. Or they develop on the most powerful laptop while the rest of the world is still on 8 GB of RAM, an FHD screen, and an integrated GPU.
Alternatively, since there are now (often ridiculous) build systems and compilation steps, we might expect behavior similar to other compiled binaries. Instead we get the worst of both worlds.
Yes, JS as it stands is some kind of standard, but at a certain point we might ask, "Why not throw out the bad designs and start from scratch?" If it takes ten years to sunset the garbage and offer a compatibility shim, that's fine. All the more reason to start now.
A purely compiled WASM approach with first-class DOM access, or a clean scripting language with a versioned standard lib: either option would be better than the status quo.
I would love to see if a browser could like... "disaggregate" itself into WASM modules. E.g. why couldn't new JS standards be implemented in WASM and hot loaded into the browser itself from a trusted distributor when necessary?
Missing CSS Level 5 selectors? Browser goes and grabs the reference implementation from the W3C.
Low-level implementations could replace these for the browsers with the most demanding performance goals, but "everyone else" could benefit from at least remaining spec compatible?
(I guess this raises the question of "what's the API that these WASM modules all have to conform to?", but I dunno, I find it an interesting thought.)
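Just to make the thought experiment concrete: today's WebAssembly APIs can already stream and instantiate a module on demand; the hard part is the ABI. Everything about the module below (URL, exports) is invented for illustration:

```ts
// Fetch and instantiate a hypothetical reference implementation on demand.
async function loadReferenceModule(url: string): Promise<WebAssembly.Exports> {
  const { instance } = await WebAssembly.instantiateStreaming(fetch(url), {
    // Host functions the module is allowed to call would be declared here;
    // defining that import surface is exactly the open ABI question.
  });
  return instance.exports;
}

loadReferenceModule("https://w3.example/css-selectors-level-5.wasm").then((exports) => {
  // Pretend the module exposes a trivial `selectorLevel(): number` export.
  const selectorLevel = exports.selectorLevel as () => number;
  console.log("loaded selector engine, level", selectorLevel());
});
```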