cb321 No.44465112
Even back in the 1990s, CGI programs written in C were lightning fast. It just was (is) an error-prone environment. Any safer modern alternative (the article's Go program, Nim, whatever) that isn't making database connections will be very fast & low-latency to localhost - really similar to a CLI utility where you fork & exec. That's not free, but it's not that expensive compared to network latencies then or now.
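
To put a rough number on "not free", here is a throwaway Nim harness (the iteration count and /bin/true are arbitrary; absolute numbers vary a lot by machine) that times a fork+exec+wait cycle:

    import std/[monotimes, osproc, times]

    const N = 200
    let t0 = getMonoTime()
    for _ in 1 .. N:
      var p = startProcess("/bin/true")   # fork/exec directly, no shell in between
      discard p.waitForExit()
      p.close()
    let usPer = (getMonoTime() - t0).inMicroseconds div N
    echo "~", usPer, " us per fork+exec+wait"

Typically that lands around a millisecond or less - the same ballpark as, or below, a WAN round trip.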

People/orgs do tend to get kind of addicted to certain technologies that interact poorly with the one-shot model, though. E.g., Python interpreters with a lot of imports still have a high start-up cost, and people hooked on that ecosystem wind up needing multi-shot/persistent alternatives.

The one-shot model in early HTTP was itself a pendulum swing away from other concerns, e.g. FTP servers not having enough RAM for 100s of long-lived, often mostly-idle logins.

replies(1): >>44465971
foobiekr No.44465971
You know, CGI with pre-forking (for latency hiding) and a safer language (like Rust) would be a great system to work on. Put the TLS termination in a nice multi-threaded web server (or in a layer like CloudFront).

No lingering state, very easy to dump a core and debug, a nice mostly-linear request model (no callback chains, etc.), and trivially easy to scale. You're just reading from stdin and writing to stdout. Glorious. WebSockets add a bit of complexity, but almost none.
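
Roughly this shape - a toy pre-fork skeleton (sketched in Nim for concreteness, POSIX only; the port, pool size, and canned response are all made up, and real request parsing/error handling is omitted):

    import std/[net, posix]

    const PoolSize = 4                  # arbitrary; tune to cores/latency target

    proc handleOne(listener: Socket) =
      # Child: accept exactly one connection, answer it from a completely
      # fresh address space, then exit - no lingering state to debug.
      var client: Socket
      listener.accept(client)
      var requestLine = ""
      client.readLine(requestLine)      # e.g. "GET / HTTP/1.1"
      client.send("HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\n\r\nhello\r\n")
      client.close()
      quit(0)

    let listener = newSocket()
    listener.setSockOpt(OptReuseAddr, true)
    listener.bindAddr(Port(8080))       # TLS already terminated upstream
    listener.listen()

    # Pre-fork the pool so no request pays fork() latency: the kernel
    # spreads accept() across the parked children.
    for _ in 1 .. PoolSize:
      if fork() == 0: handleOne(listener)
    while true:
      var status: cint
      discard wait(addr status)              # reap a finished one-shot child...
      if fork() == 0: handleOne(listener)    # ...and fork its replacement early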

The big change in how we build things was the rise of Java. Java was too big, too bloated, too slow, etc., so people rapidly moved to multi-threaded application servers, all to avoid the cost of fork() and the dangers of C. We can Marie Kondo this shit and get back to things that are simple if we want to.

I don't even like Rust and this sounds like heaven to me. Maybe someone will come up with a way to make writing this kind of web-tier backend code in Rust easy, hiding enough of the tedium and/or complexity to make it appealing to Node/JS, PHP, and Python programmers.

replies(2): >>44466245, >>44477572
cb321 No.44466245
This is not to disagree, but to agree while adding some detail... :-)

Part of the Java rise was C/C++ being error-prone plus Java's syntax similarity to them, but this was surely intermingled with a full-scale marketing assault by Sun Microsystems, who at the time had big multi-socket SMP servers they wanted to sell with Solaris/etc., and part of that pitch was Solaris/Java threading. Really, for a decade or two prior the focus had been on true MMU-based, hardware-enforced isolation with OS-kernel clean-up (more like CHERI these days), not the compiler-enforced safety that Rust does.

I think with Nim (https://nim-lang.org/) you could have something more ergonomic than Perl/Python ever was and practically as fast as C/Rust. E.g., I just reproduced that guy's benchmark with Nim's stdlib std/cgi and got over 275M CGI requests/day to localhost on a 2016 CPU using only 2 requesters & 2 HTTP server threads. With some nice DSL (easily written if you don't like any current ones) you could get the "coding overhead" down to a tiny footprint. In fairness, I did zero SQLite work, but he was also using a computer over 4x bigger and probably a GHz faster, with some IPC lift as well. So, IF you had the network bandwidth (hint: you usually don't!), you could probably support billions of hits/day off a single server.
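
Just to show the shape, the whole CGI side of such a test is about this much code - a minimal std/cgi responder, not the exact code I ran:

    import std/[cgi, strtabs]

    # The CGI contract: the server hands us the request in environment
    # variables (QUERY_STRING, REQUEST_METHOD, ...) plus stdin for POST
    # bodies; we write headers, a blank line, then the body to stdout.
    stdout.write "Content-Type: text/plain\r\n\r\n"

    # readData() decodes the query string (GET) or stdin (POST) into a table.
    for key, val in readData():
      echo key, " = ", val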

To head off some lazy complaints: GC is just not an issue for a single-threaded Nim program whose lifetime is hoped/expected to be short anyway. In many cases (just as with CLI utilities!) you could probably just let the OS reap memory, but, of course, it always "all depends" on a lot of context. Nim does reference counting anyway, whereas most "fighting the GC" is actually fighting a separate GC thread (Java again, Go, D, etc.) trashing CPU caches or consuming DIMM bandwidth and so on. For this use you would probably care more about a statically linked binary, so you don't pay ld.so shared-library set-up overhead on every `exec`.
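
E.g., a two-line NimScript config dropped next to the program does it, assuming a toolchain/libc where full static linking works (musl does):

    # config.nims - picked up automatically by nim in this directory
    switch("d", "release")
    switch("passL", "-static")   # no ld.so at exec time: one self-contained binary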

replies(1): >>44466335
jbverschoor No.44466335
There were quite a lot of ISA and OS platforms when Java started. Java made a lot of sense. It was that, and a superb standard library.
replies(1): >>44466954
cb321 No.44466954
Ah. You refer to the "write once, run anywhere" marketing slogan, from a time when there was a lot of JVM-to-JVM variability and JVM JITs were not very advanced. I didn't buy that slogan at the time (look at aaaaall that open source code running everywhere with ./configure) and always hated Java's crazy boilerplate verbosity, combined with a culture of defending it as somehow "better".

I mean, it's not like people ran Internet servers on such vastly different CPUs/OSes, or that diversity was such a disadvantage. DEC Alpha was probably the most different for its 64-bitness, but I ran all those open-source Linux/C things on it by 1996-97. But we may just have to agree to disagree that Java made a lot of sense for that reason. I have disagreements with several high-profile SiValley "choices", and I know I'm a little weird.

Anyway, I don't mean to be arbitrarily disputatious. Focusing on what we do agree on: I agree 100% that the early Java stdlib being bigger than C/C++'s, plus early STL/template awfulness, was a huge effect. :-) C++-like keywords and lexical sensibilities mattered, too. PLang researchers joked upon Java's success that C's replacement sure had to "look like C". But I think programmers having all their library needs met with very little work matters even more, and network-first package managers were just getting going. A perceived-as-good stdlib absolutely helps Go even today. Human network effects are a very real driver, even among very talented engineers.

Maybe since CTAN/CPAN(?), that has come more from popularity and the ecosystem than from "what's in the stdlib". Even before those there was netlib/Fortran algorithm distribution, though. The Node/Rust/Python worlds today show this.

How to drive popularity in a market of competing ideas is hard/finicky; otherwise the biggest marketing budget would always win and people could always just "buy reputation", which empirically does not reliably happen (though it sure happens sometimes, which I guess shows ad spend is not wasted). Even so, "free advertising", "ecosystem builds going exponential", etc. - these are just tricky to induce.

The indisputable elephant in the room is path dependence. Fortran, still somewhat reflective of people "stacking punch card decks" to "link" programs in the 1950s, is used by much modern scientific research, either directly or indirectly. Back then, folks were literally just figuring out what a PLang should be and how interacting with these new things called "computers" might work.

But path dependence is everywhere, all around us: in institutions, traditions, and technology. It's all really a big Humanity-Complete discussion that spirals into a cluster of Wicked Problems. Happens so fast on so many topics. :-) If you happen to make something that catches on, let us hope you didn't make too many mistakes that get frozen in! Cheers!

replies(1): >>44470838
jbverschoor No.44470838
I mean, that slogan still holds true 30 years later.

I've never used Fortran. I started out with BASIC and wanted to do C/C++ because of gamedev. I loved the syntax, perhaps because it made me feel smart. The C-style syntax is also a big part of what made Java popular.

However, I think more and more that the popularity came from Java being the de facto standard teaching language at universities. The reason for that was that it was a very clear language and easy to compile, unlike C, which has so many behavioral quirks, and of course manual memory management.

These days Python is being taught in both CS and statistics, which creates a broader market and a broader usage spectrum.

Ruby, in my opinion, got big because of Rails, but is loved because of its stdlib.

replies(1): >>44471593
cb321 No.44471593
Besides already agreeing to disagree on the Java slogan & syntax, I agree with all you say. Familiarity/being taught in schools is for sure a huge deal. { Not really "news" - Steve Jobs used to give high schools Apple IIs for the same reason. :-) } When even Scheme-inventing MIT moved to Python, the writing was on the wall.

I've always been a little surprised Nim wasn't more popular, with its Python-like syntax, Lisp-like meta-power, C-like speed, and mostly-automatic memory management. Ah well.