
418 points akagusu | 1 comment
nwellnhof No.45955183
Removing XSLT from browsers was long overdue and I'm saying that as ex-maintainer of libxslt who probably triggered (not caused) this removal. What's more interesting is that Chromium plans to switch to a Rust-based XML parser. Currently, they seem to favor xml-rs which only implements a subset of XML. So apparently, Google is willing to remove standards-compliant XML support as well. This is a lot more concerning.
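To make the "subset of XML" concern concrete, here is a rough illustrative sketch (not xml-rs's actual feature matrix) of markers for XML features that minimal parsers often omit or reject: DTD/entity declarations, processing instructions, and namespace declarations.

```javascript
// Hypothetical helper: flags documents that rely on XML features
// beyond a bare-bones subset. Purely illustrative string checks,
// not a real conformance test.
function usesAdvancedXmlFeatures(xml) {
  return /<!DOCTYPE/.test(xml) ||       // DTDs and internal entity declarations
         /<\?(?!xml[\s?])/.test(xml) || // processing instructions other than the XML declaration
         /\sxmlns(:|=)/.test(xml);      // namespace declarations
}

// A plain document passes; one using an internal entity does not.
usesAdvancedXmlFeatures('<?xml version="1.0"?><a/>');                    // false
usesAdvancedXmlFeatures('<!DOCTYPE r [<!ENTITY e "x">]><r>&e;</r>');     // true
```

A standards-compliant parser must handle all of these; a parser that silently drops them will accept a different language than XML.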
replies(11): >>45955239 #>>45955425 #>>45955442 #>>45955667 #>>45955747 #>>45955961 #>>45956057 #>>45957011 #>>45957170 #>>45957880 #>>45977574 #
jillesvangurp No.45955442
> This is a lot more concerning.

I'm not so sure that's problematic. Browsers probably just aren't a great platform for doing a lot of XML processing at this point.

Preserving the half-implemented, frozen state of the early 2000s doesn't really serve anyone except those maintaining legacy applications from that era. I can see why they are pulling out the complex C++ code related to all this.

It's the natural conclusion of XHTML being sidelined in favor of HTML5 about 15-20 years ago. The whole web-services bubble, bloated namespace processing, and all the other complexity that came with it left behind a lot of gnarly libraries. The world has largely moved on since then.

From a security point of view it's probably a good idea to reduce the attack surface a bit by moving to a Rust-based implementation. What use cases remain for XML parsing in a browser if XSLT support is removed? I guess some parsing from JavaScript, in which case you could argue that the usual solution in the JS world, polyfills and e.g. WASM libraries, might provide a valid or good-enough alternative or migration path.
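The polyfill migration path could be sketched roughly like this: feature-detect the native `XSLTProcessor`/`DOMParser` APIs and fall back to a WASM or JS implementation. The polyfill module name below is hypothetical, purely for illustration.

```javascript
// Returns true only in environments that still ship native XSLT support.
function hasNativeXSLT() {
  return typeof XSLTProcessor !== "undefined" && typeof DOMParser !== "undefined";
}

// Sketch of a transform wrapper: native path when available, otherwise a
// hypothetical polyfill module (./xslt-polyfill.js is a placeholder name).
async function transform(xmlText, xslText) {
  if (hasNativeXSLT()) {
    const parser = new DOMParser();
    const proc = new XSLTProcessor();
    proc.importStylesheet(parser.parseFromString(xslText, "application/xml"));
    const result = proc.transformToDocument(
      parser.parseFromString(xmlText, "application/xml"));
    return new XMLSerializer().serializeToString(result);
  }
  const { xsltTransform } = await import("./xslt-polyfill.js"); // hypothetical
  return xsltTransform(xmlText, xslText);
}
```

Pages that only ever call a wrapper like this could switch backends without touching their stylesheets.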

replies(3): >>45962438 #>>45974854 #>>45977639 #
1. pjmlp No.45962438
This is C code: https://gitlab.gnome.org/GNOME/libxslt