
72 points by _vvhw | 2 comments
GlitchMr No.21069669
What does this have to do with E2E? I don't see how filtering HTML is harder to do on the client - even if server-side algorithms are somehow better (as this presentation seems to imply), couldn't the same algorithm be used client-side?

In a way, the situation is better client-side, because code running on the client can check exactly how that browser parses the HTML.
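To make the "same algorithm, either side" point concrete, here is a minimal allowlist sanitizer sketch using only Python's standard library. It is not the algorithm from the presentation, and the tag/attribute allowlists and `SAFE_SCHEMES` list are illustrative assumptions; real libraries like DOMPurify do far more (and, as the comment notes, gain accuracy by using the browser's own parser).

```python
# Minimal allowlist HTML sanitizer sketch (illustrative, not production-grade).
# Disallowed tags are dropped; their text content is kept but HTML-escaped.
from html.parser import HTMLParser
from html import escape

ALLOWED_TAGS = {"b", "i", "em", "strong", "p", "br", "a"}
ALLOWED_ATTRS = {"a": {"href"}}          # per-tag attribute allowlist
SAFE_SCHEMES = ("http://", "https://", "mailto:")

class Sanitizer(HTMLParser):
    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag not in ALLOWED_TAGS:
            return  # drop <script>, <iframe>, event handlers' hosts, etc.
        kept = []
        for name, value in attrs:
            if name in ALLOWED_ATTRS.get(tag, set()):
                # Reject javascript: and other unexpected URL schemes.
                if name == "href" and not (value or "").lower().startswith(SAFE_SCHEMES):
                    continue
                kept.append(f' {name}="{escape(value or "", quote=True)}"')
        self.out.append(f"<{tag}{''.join(kept)}>")

    def handle_endtag(self, tag):
        if tag in ALLOWED_TAGS:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(escape(data))  # escape all text content

def sanitize(html: str) -> str:
    s = Sanitizer()
    s.feed(html)
    s.close()
    return "".join(s.out)
```

The same function could run on a server or ship to the client; the catch the thread discusses is parser mismatch - a server-side sanitizer must guess how the receiving browser will parse the result, while a client-side one can use the real parser.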

replies(2): >>21069915 #>>21070003 #
bugmen0t No.21069915
It's on page 18. If you have end-to-end encryption you can't sanitize in the client.

I mean, you're really just summarizing the presentation. It should be an API that's built into the browser. It isn't, so people need to use a library. That's OK, but not great.

replies(1): >>21069991 #
_vvhw No.21069991
"If you have end-to-end encryption you can't sanitize in the client."

I think you meant to type that you can't sanitize on the "server"? With end-to-end encryption the server has no access to the plaintext to be sanitized. Only the client has the plaintext, so only the client can sanitize.
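The constraint can be shown with a toy: below, a one-time-pad XOR stands in for real E2E encryption (this is NOT real cryptography, just an illustration). The server relaying `ciphertext` has no plaintext to run a sanitizer over; only the endpoints holding the key do.

```python
# Toy E2E illustration: XOR one-time pad standing in for real encryption.
# NOT real crypto - only meant to show the server never sees plaintext HTML.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

html = b"<b>hi</b><script>evil()</script>"
key = secrets.token_bytes(len(html))     # shared only by the two endpoints

ciphertext = xor(html, key)              # what the server relays and stores
# Server-side sanitization is impossible here: there is no "<script>"
# in `ciphertext` to find. Only the receiving client can decrypt...
plaintext = xor(ciphertext, key)         # ...and must sanitize after this step.
```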

replies(1): >>21071442 #
bugmen0t No.21071442
Oops, yes.