
53 points heavensteeth | 5 comments
ryao ◴[] No.43652062[source]

  If you find yourself with a scripted language where processing HTTP requests might be too slow or unsafe, I can still see some utility for FastCGI. For most of the rest of us, HTTP won; just write little HTTP webservers.
Without serious development effort, I would expect that using an existing web server with FastCGI is faster than writing your own web server. It is also more secure, as the FastCGI application can run as a different user in a chroot, or in a namespace-based sandbox such as a Docker container.
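For contrast, the "little HTTP webserver" the quoted comment describes really is only a few lines of stdlib code today. A minimal sketch in Python (the handler name and response text are invented for illustration):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello from a little HTTP server"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Hello)  # port 0: OS picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/") as resp:
    body_text = resp.read().decode()
server.shutdown()
print(body_text)  # hello from a little HTTP server
```

The trade-off the parent raises still applies: this process handles TLS, parsing, and sandboxing itself (or not at all), where a FastCGI app inherits all of that from the front-end server.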
replies(4): >>43652278 #>>43652341 #>>43654110 #>>43671603 #
1. usef- ◴[] No.43652278[source]
I was thinking that.

Was FastCGI a child of a world where we had neither good library support ("import net/http") nor (much) layering in front of the server (load balancers / CDNs / Cloudflare, etc.)? In that world it made sense to assume a production-grade layer on the box itself was always needed.

I remember the vigorous discussions comparing the security of Apache vs. IIS, etc.

replies(3): >>43652393 #>>43652689 #>>43653386 #
2. phire ◴[] No.43652393[source]
Partly.

But I suspect it's more that CGI was simply the way things had always been done. A reverse proxy wasn't even considered. They asked "how do we make CGI faster?" and so ended up with FastCGI.

Other developers asked the same question and ended up making mod_php (and friends), embedding the scripting language directly into the web server.
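The road not taken here, the reverse proxy, is also a short stdlib program today: a front server that simply forwards each request over plain HTTP to a backend app server. A minimal sketch (class names and the response text are invented; real deployments would use nginx or similar rather than this):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class App(BaseHTTPRequestHandler):
    # The backend "application server", standing in for the script runtime.
    def do_GET(self):
        body = f"app handled {self.path}".encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass

class Proxy(BaseHTTPRequestHandler):
    # The front web server: forwards the request instead of speaking FastCGI.
    backend_port = None

    def do_GET(self):
        with urlopen(f"http://127.0.0.1:{self.backend_port}{self.path}") as r:
            body = r.read()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass

app = HTTPServer(("127.0.0.1", 0), App)
Proxy.backend_port = app.server_port
front = HTTPServer(("127.0.0.1", 0), Proxy)
for s in (app, front):
    threading.Thread(target=s.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{front.server_port}/guestbook") as r:
    result = r.read().decode()
print(result)  # app handled /guestbook
```

FastCGI solves the same problem with a binary framing protocol between front end and app; the proxy approach just reuses HTTP on both hops.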

3. chasd00 ◴[] No.43652689[source]
IIRC most of the content was static HTML/CSS in those days. Running code on a request was rare, so CGI was a bolt-on to a static content server. It was available but not the norm. Perl and PHP gradually made it the norm to run code on every request.
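The CGI "bolt-on" model being described: the server forks a fresh process per request, passes request metadata in environment variables, and the script writes headers, a blank line, then the body to stdout. A minimal sketch of the script side (the function and path are invented for illustration):

```python
def cgi_response(environ):
    # CGI hands each request to a fresh process: metadata arrives in
    # environment variables like PATH_INFO, and the response is
    # headers, a blank line, then the body, all written to stdout.
    path = environ.get("PATH_INFO", "/")
    body = f"you requested {path}\n"
    return f"Content-Type: text/plain\r\n\r\n{body}"

# The web server would populate the real environment; here we fake it.
print(cgi_response({"PATH_INFO": "/guestbook.cgi"}), end="")
```

The per-request fork/exec is exactly the cost that made this fine as an occasional bolt-on and painful once Perl and PHP made dynamic pages the norm.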
replies(1): >>43652966 #
4. mdpye ◴[] No.43652966[source]
I remember, in the very early days as a hobbyist, working with CGI Perl scripts for forums or guest books where the script just edited the "static" content in place.

The script would write new HTML files for new posts and do "fun" (I mean, terrifying) string manipulation on the main index to insert links to the posts, etc. Sometimes they used comments containing metadata to help "parse" the pages that would see edits.
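That marker-comment trick can be sketched in a few lines. A hypothetical version of the "terrifying" splice, using an invented <!-- POSTS --> metadata comment as the parse anchor:

```python
# A static index page with a marker comment the script "parses" against.
index_html = """<html><body>
<h1>Guestbook</h1>
<ul>
<!-- POSTS -->
</ul>
</body></html>"""

def add_post_link(page: str, href: str, title: str) -> str:
    # Splice the new link in just before the marker, keeping the marker
    # so the next run of the script can find its insertion point again.
    link = f'<li><a href="{href}">{title}</a></li>\n<!-- POSTS -->'
    return page.replace("<!-- POSTS -->", link, 1)

updated = add_post_link(index_html, "post42.html", "My new post")
```

String-replacing into live HTML like this breaks the moment anything else touches the markup, which is presumably the "terrifying" part.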

These both were, and definitely were not, "the days" :D

5. BobbyTables2 ◴[] No.43653386[source]
Sure beats maintaining a custom webserver written in C!