
157 points by lladnar | 3 comments
1. imiric No.41864401
These findings are so unsurprising that the research is borderline boring.

What I would like to see are similar efforts directed at the tower of complexity that is the modern TLS stack. From the Snowden leaks we know that the NSA has worked for decades to weaken cryptographic standards via its Bullrun program, and that it paid RSA Security to make the backdoored Dual_EC_DRBG generator the default in its BSAFE library. From the recent XZ Utils backdoor we also know that supply chain attacks can be very sophisticated and difficult to detect.

How likely is it that the protocols we consider secure today are silently compromised by an undetected actor? Should we just assume that they are, as a sibling comment suggested?

I'm frankly more interested in knowing whether there is oversight of these complex technologies that could alert us to anomalies of this kind, so that we don't have to rely on whistleblowers, or on people who happen to notice strange behavior and look into it out of curiosity. Too much is at stake to leave this to chance.

replies(2): >>41870473 >>41871553
2. lazide No.41870473
Oversight exists, mostly. The issue is that the stack is very complex, and who watches (and who pays) the watchers?
3. toast0 No.41871553
Most of the things people get dinged for in this kind of report were already fixed in modern TLS.

If you set your clients and servers to TLS 1.3 only (which is what I consider the modern TLS stack), you only have a handful of cipher suites to choose from (AES-128-GCM, AES-256-GCM, and ChaCha20-Poly1305), which avoids any issues with CBC constructions. Most of your remaining issues will be around X.509 certificate processing: the TLS protocol and ciphers are easier to use correctly than in the past, but X.509 hasn't changed significantly.
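
For illustration, here's a minimal sketch of that "TLS 1.3 only" setup using Go's standard crypto/tls package (the URL is a placeholder; a server would set the same MinVersion on its tls.Config). Go ignores the CipherSuites field once TLS 1.3 is negotiated, so you get exactly the three AEAD suites above and nothing CBC-based:

    package main

    import (
        "crypto/tls"
        "fmt"
        "net/http"
    )

    func main() {
        // Refuse to negotiate anything below TLS 1.3. For TLS 1.3,
        // Go's crypto/tls only offers AES-128-GCM, AES-256-GCM, and
        // ChaCha20-Poly1305, so legacy CBC suites can never appear.
        client := &http.Client{
            Transport: &http.Transport{
                TLSClientConfig: &tls.Config{
                    MinVersion: tls.VersionTLS13,
                },
            },
        }

        resp, err := client.Get("https://example.com/") // placeholder URL
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        fmt.Println("negotiated TLS 1.3:", resp.TLS.Version == tls.VersionTLS13)
    }

Note that certificate verification (the X.509 part) still happens with the default system roots here, which is exactly the surface that hasn't gotten any simpler.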