
418 points akagusu | 5 comments | source
Aurornis ◴[] No.45955140[source]
I have yet to read an article complaining about the XSLT deprecation from someone who can explain why they actually used it and why it’s important to them.

> I will keep using XSLT, and in fact will look for new opportunities to rely on it.

This is the closest I’ve seen, but it’s not an explanation of why it was important before the deprecation. It’s a declaration that they’re using it as an act of rebellion.

replies(10): >>45955238 #>>45955283 #>>45955351 #>>45955795 #>>45955805 #>>45955821 #>>45956141 #>>45956722 #>>45956976 #>>45958239 #
James_K ◴[] No.45955821[source]
I use XSLT because I want my website to work for users with JavaScript disabled and I want to present my Atom feed link as an HTML document on a statically hosted site without breaking standards compliance. Hope this helps.
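Concretely, the trick is an xml-stylesheet processing instruction at the top of the feed: browsers fetch the referenced stylesheet and render its output as a page, while feed readers ignore the instruction and parse the Atom document as usual. A minimal sketch (the href, title, and URLs are illustrative):

```xml
<?xml version="1.0" encoding="utf-8"?>
<?xml-stylesheet type="text/xsl" href="/feed.xsl"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example blog</title>
  <link href="https://example.com/"/>
  <updated>2024-01-01T00:00:00Z</updated>
  <id>https://example.com/</id>
</feed>
```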
replies(2): >>45955882 #>>45958444 #
matthews3 ◴[] No.45955882[source]
Could you run XSLT as part of your build process, and serve the generated HTML?
replies(4): >>45955943 #>>45955956 #>>45956760 #>>45959294 #
James_K ◴[] No.45955943{3}[source]
No because then it would not be an Atom feed. Atom is a syndication format, the successor to RSS. I must provide users with a link to a valid Atom XML document, and I want them to see a web page when this link is clicked.

This is why so many people find this objectionable. If you want to have a basic blog, you need some HTML documents and an RSS/Atom feed. The technologies required to do this are HTML for the documents and XSLT to format the feed. Google is now removing one of those technologies, which makes it essentially impossible to serve a truly static website.

replies(2): >>45955974 #>>45956484 #
ErroneousBosh ◴[] No.45955974{4}[source]
> Google is now removing one of those technologies, which makes it essentially impossible to serve a truly static website.

How so? You're just generating static pages. Generate ones that work.

replies(1): >>45956162 #
James_K ◴[] No.45956162{5}[source]
You cannot generate a valid RSS/Atom document that also renders as HTML.
replies(1): >>45956531 #
shadowgovt ◴[] No.45956531{6}[source]
So put them on separate pages because they are separate protocols (HTML for the browser and XML for a feed reader), with a link on the HTML page to be copied and pasted into a feed reader.

It really feels like the developer has over-constrained the problem here to fit browsers as they currently are.

replies(1): >>45956725 #
kuschku ◴[] No.45956725{7}[source]
> So put them on separate pages because they are separate protocols

Would you also suggest I use separate URLs for HTTP/2 and HTTP/1.1? Maybe for a gzipped response vs a raw response?

It's the same content, just supplied in a different format. It should be the same URL.

replies(3): >>45956969 #>>45958694 #>>45959471 #
shadowgovt ◴[] No.45959471{8}[source]
Then the server should supply the right format based on the `Accept` header, be it `application/rss+xml` or `application/atom+xml` or `text/xml` or `text/html`.
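On Apache-style hosting, one sketch of that negotiation uses mod_rewrite rules in an .htaccess file (the /feed path and file names are illustrative; mod_negotiation type maps are another option):

```apache
RewriteEngine On
# Browsers advertise text/html in Accept; send them the rendered page.
RewriteCond %{HTTP_ACCEPT} text/html
RewriteRule ^feed$ /feed.html [L]
# Everyone else (feed readers) gets the Atom document.
RewriteRule ^feed$ /feed.xml [T=application/atom+xml,L]
```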

Even cheaper than shipping the client an XML document and an XSLT stylesheet is just shipping them the HTML the XSLT would have produced in the first place.

replies(2): >>45960160 #>>45972299 #
James_K ◴[] No.45972299{9}[source]
So in other words, no more static sites?
replies(1): >>45974888 #
shadowgovt ◴[] No.45974888{10}[source]
A static site can inspect headers. Static sites still have a web server.
replies(1): >>45975872 #
James_K ◴[] No.45975872[source]
A static site cannot inspect headers. There is no HTML, or even JavaScript, you can put in a file to inspect the request headers before the file is sent to the client.

A static site is a collection of static files. It doesn't need a server; you could just open it locally (in browsers that don't block file:// URI schemes). If you need some special configuration of the server, it is no longer a static site: the server is dynamically selecting which content is served.

replies(1): >>45978746 #
shadowgovt ◴[] No.45978746[source]
Oh, difference in definitions. You mean "non-configurable web server." Because you could definitely use a static site generator to create multiple versions of the site data and then configure your web server to select which data is emitted.

But agreed; if your web server is just reflecting the filesystem, add this to the pile of "things that are hard with that kind of web server." Perhaps worth noting, though: Apache can select the file to emit based on the Accept header out of the box, and even Python's http.server can with a small handler subclass.
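As a sketch of that idea in Python: the stock http.server doesn't negotiate on Accept by itself, but a small SimpleHTTPRequestHandler subclass can map one URL to different files. The /feed path and file names are hypothetical, and the Accept parsing is deliberately simplified (quality values are ignored):

```python
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

# Offered variants for the feed URL; the file names are hypothetical.
VARIANTS = {
    "text/html": "feed.html",
    "application/atom+xml": "feed.xml",
}

def pick_variant(accept_header, offers=None):
    """Pick a file for the first acceptable media type.

    Deliberately simplified: quality values (;q=) are ignored and
    types are matched in the order the client listed them.
    """
    offers = offers or VARIANTS
    for part in accept_header.split(","):
        media_type = part.split(";")[0].strip()
        if media_type in offers:
            return offers[media_type]
        if media_type == "*/*":
            return offers["text/html"]
    # Nothing matched: fall back to the feed itself.
    return offers["application/atom+xml"]

class NegotiatingHandler(SimpleHTTPRequestHandler):
    """Serve /feed as HTML to browsers and as Atom XML to feed readers."""

    def do_GET(self):
        if self.path == "/feed":
            self.path = "/" + pick_variant(self.headers.get("Accept", "*/*"))
        super().do_GET()

# To actually serve the current directory:
# ThreadingHTTPServer(("", 8000), NegotiatingHandler).serve_forever()
```

Whether a server running this still counts as "static" is, of course, exactly the definitional question at issue in this thread.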

replies(1): >>45979560 #
James_K ◴[] No.45979560{3}[source]
A static site is one that you can serve through static hosting, where you have no control over the web server or its configuration. There is no such thing as a static site with dynamic content. “Static” means “doesn't change”: the document served doesn't change based on who is receiving it. You are describing a dynamic solution, one that changes based on who is making the request.

>you could definitely use a static site generator to create multiple versions of the site data and then configure your web server to select which data is emitted

And this web-server configuration would not exist within the static site. The static site generator could not output it; therefore it is not part of the static site. It is not contained within the files the generator outputs. It is additional dynamic behavior added by the web server.

It breaks the fundamental property of a static site: that it can be deployed to any service without changing the content. Just upload a zip file, and you are done.

replies(1): >>45981418 #
shadowgovt ◴[] No.45981418{4}[source]
Like I said, difference in definitions. https://www.google.com/search?q=static+site+serving+with+apa...

I get your meaning; I've just heard "static site" used to refer to a site where the content isn't dynamically computed at runtime, not a site where the server is doing a near-direct-mapping from the filesystem to the HTTP output.

> Just upload a zip file, and you are done.

This is actually how I serve my static sites via Dreamhost. The zipfile includes the content negotiation rules in the `.htaccess` file.

(Perhaps worth remembering: even the rule "HTTP responses are generated by looking up a file matching the path in the URL and echoing that file as the body of the GET response" is a per-server convention; nothing in the HTTP spec declares that the filesystem is directly mirrored to web access. It's simply a convention followed by many simple web servers, and most of them allow overrides to do something slightly more complicated. Even then it isn't quite the identity function on your filesystem: unless someone did something very naughty, the server won't serve anything for http://example.com/../uhoh-now-i-am-in-your-user-directory.)