
392 points _kush | 4 comments
badmintonbaseba ◴[] No.44394985[source]
I have worked for a company that was (and probably still is) heavily invested in XSLT for XML templating. It's not good, and they would probably migrate away from it if they could.

  1. Even though there are newer XSLT standards, XSLT 1.0 is still dominant. It is quite limited and weird compared to the newer standards.

  2. Resolving performance problems of XSLT templates is hell. XSLT is a Turing-complete functional-style language, with performance very much abstracted away. There were XSLT templates that worked fine for most documents, but then one document came in with a ~100-row table and it blew up. It turned out that the template processing the table was O(N^2) or worse, with no obvious way to optimize it (it might even have evaluated an XPath on each row that was itself O(N) or worse; see the sketch below for the kind of pattern that does this). I don't know exactly how it manifested, but as I recall that document took more than 7 minutes to process.
JS might have other problems, but not being able to resolve algorithmic complexity issues is not one of them.
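
To illustrate the kind of pattern that bites you, a minimal sketch (made-up element names, not our actual template): a per-row XPath that re-scans every preceding row turns an N-row table into O(N^2) work.

  <!-- hypothetical: emit a running total for each table row -->
  <xsl:template match="row">
    <tr>
      <!-- this select walks all earlier rows, so the whole table is O(N^2) -->
      <td><xsl:value-of select="sum(preceding-sibling::row/amount)"/></td>
      <td><xsl:value-of select="amount"/></td>
    </tr>
  </xsl:template>

Nothing in the language flags this; in XSLT 1.0 the usual fix is a recursive template that threads the running total through as a parameter, which is exactly the kind of rewrite that is hard to spot under time pressure.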
1. bambax ◴[] No.44396146[source]
> XSLT 1.0 is still dominant

How, where? In 2013 I was still working a lot with XSLT and 1.0 was completely dead everywhere one looked. Saxon was free for XSLT 2 and was excellent.

I used to do transformations of both huge documents and large numbers of small documents, with zero performance problems.

2. PantaloonFlames ◴[] No.44396503[source]
I recently had the occasion to work with a client that was heavily invested in XML processing for a set of integrations. They’re migrating / modernizing, but they’re so heavily invested in XSL that they don’t want to migrate away from it. So I conducted some perf tests, and the performance I found for XSLT in .NET (“core”) was slightly to significantly better than that of current Java with Saxon. But both were fast.

In the early days XSL was all interpreted, and it was slow. From ~2004 or so, the XSLT engines came to be JIT-compiled. XSL benchmarks used to be a thing, but they rapidly declined in value from then on because the perf differences just stopped mattering.

3. pmarreck ◴[] No.44396640[source]
Probably corps. I was working at FactSet in the early 2000s when there was a big push for it, and I imagine the same thing was happening at every Microsoft shop across corporate America, a market Microsoft was winning big share in at the time. (I bet there are still a ton of internal web apps that only work with IE... sigh)

Obviously, that means there are a lot of legacy processes likely still using it.

The easiest way to improve the situation seems to be to upgrade to a newer version of XSLT.
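
For a sense of what an upgrade buys, a minimal sketch (illustrative, not from any particular codebase): XSLT 3.0 collapses the 1.0 identity-template boilerplate into a single xsl:mode declaration, and 2.0+ adds things like regex support and upper-case() that 1.0 simply doesn't have.

  <xsl:stylesheet version="3.0"
      xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <!-- copy everything through unchanged unless a template says otherwise -->
    <xsl:mode on-no-match="shallow-copy"/>

    <!-- hypothetical element: uppercase every <name>, leave the rest alone -->
    <xsl:template match="name">
      <name><xsl:value-of select="upper-case(.)"/></name>
    </xsl:template>
  </xsl:stylesheet>

The catch is that browsers only ship XSLT 1.0 processors, so anything newer means running the transform server-side or with a library like Saxon.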

4. int_19h ◴[] No.44399764[source]
In the browsers.