
72 points | _vvhw | 1 comment
GlitchMr No.21069669
What does this have to do with E2E? I don't see why filtering HTML would be harder to do client-side - even if somehow server-side algorithms are better (which this presentation seems to imply), couldn't the same algorithm be used client-side?

In a way, the situation is better client-side, because when running code on the client side, you can check exactly how the browser parses the HTML.
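
For example, a minimal sketch, assuming a browser environment, of asking the browser itself how it parses a given markup string (DOMParser is a standard browser API; the sample markup and the check for inline event handlers are just an illustration):

    // The same engine that will eventually render this markup reports the
    // parse tree it actually builds, so the filter and the renderer cannot
    // disagree about how the input is parsed.
    const dirty = '<img src=x onerror=alert(1)>';
    const doc = new DOMParser().parseFromString(dirty, 'text/html');

    // Walk the elements the browser really created and flag inline
    // event-handler attributes.
    doc.body.querySelectorAll('*').forEach((el) => {
      for (const attr of Array.from(el.attributes)) {
        if (attr.name.toLowerCase().startsWith('on')) {
          console.log(`handler attribute on <${el.tagName.toLowerCase()}>: ${attr.name}`);
        }
      }
    });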

replies(2): >>21069915, >>21070003
1. _vvhw No.21070003
"even if somehow server-side algorithms are better (which this presentation seems to imply)"

The slides provide several reasons why server-side algorithms are worse.

"the situation is better client-side, because when running code on the client's side, you can check how exactly the browser parses the HTML code."

Yes, and for this reason, DOMPurify is a client-side sanitizer.
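
For illustration, a minimal sketch of typical client-side use of DOMPurify (this assumes the dompurify npm package is installed; the sample markup, the expected output string, and the 'output' element id are made up for the example):

    import DOMPurify from 'dompurify';

    // DOMPurify parses the untrusted markup with the browser's own DOM,
    // strips dangerous tags and attributes, and returns safe HTML.
    const dirty = '<img src=x onerror=alert(1)><p>hello</p>';
    const clean = DOMPurify.sanitize(dirty); // e.g. '<img src="x"><p>hello</p>'

    // Hypothetical target element for the sanitized output.
    const out = document.getElementById('output');
    if (out) out.innerHTML = clean;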