
72 points by _vvhw | 1 comment
zawerf
I am always irrationally(?) scared of using these sanitizers despite their successful history. As soon as new HTML/JS/CSS syntax or features are introduced, won't your security model need to be reevaluated? That seems like a lost cause given the rate at which new capabilities are added to the web. E.g., when CSS Shaders land, you might be able to execute arbitrary GPU code with just CSS (hypothetically speaking; I don't actually know how it will work, and I'm sure it'll be sandboxed pretty well, but the problem remains that there are too many new possibilities to keep up with!).
megous
Make it a whitelist. :)
zawerf
It wouldn't help if new features extend the capabilities of existing stuff (which is done all the time). For example, the CSS Shaders example from before would add new syntax to the existing 'filter' CSS property, which you might've already whitelisted because it is safe today.
ShaneCurran
I guess a nested, parameter-granularity whitelist would work in that case :)
_vvhw
You can do that with DOMPurify using hooks.
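For instance, here's a minimal sketch of such a hook (assuming an npm/bundler setup; 'uponSanitizeAttribute' is a real DOMPurify entry point, but the SAFE_FILTERS pattern and the naive semicolon-splitting of the style attribute are illustrative assumptions, not production-grade CSS parsing):

    // Parameter-level whitelist via DOMPurify's hook API: only allow
    // 'filter' values that match a known-safe pattern, and pass every
    // other style declaration through untouched.
    import DOMPurify from 'dompurify';

    // Hypothetical whitelist: only these single filter() functions survive.
    const SAFE_FILTERS = /^(blur|brightness|contrast|grayscale|sepia)\([^()]*\)$/i;

    DOMPurify.addHook('uponSanitizeAttribute', (node, data) => {
      if (data.attrName !== 'style') return;

      // Naive parse: split the inline style into declarations and drop
      // any 'filter' whose value falls outside the whitelist.
      data.attrValue = data.attrValue
        .split(';')
        .map((decl) => decl.trim())
        .filter(Boolean)
        .filter((decl) => {
          const [prop, ...rest] = decl.split(':');
          if (prop.trim().toLowerCase() !== 'filter') return true;
          return SAFE_FILTERS.test(rest.join(':').trim());
        })
        .join('; ');
    });

    // 'blur(2px)' passes; 'url(#evil)' is stripped.
    const clean = DOMPurify.sanitize(
      '<div style="filter: blur(2px)">ok</div>' +
      '<div style="filter: url(#evil)">stripped</div>'
    );

A nice property of this approach is that it fails closed: if a future spec extends filter() with new syntax, the new values get stripped until you explicitly add them to the whitelist, which is exactly the worry upthread.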