
182 points | yarapavan | 1 comment
neuroelectron ◴[] No.43616167[source]
Very suspicious article. Sounds like the "nothing to see here folks, move along" school of security.

Reproducibility is more like a security smell; a symptom that you're doing things right. Determinism is the correct target, and it is subtly different.

The focus on supply chain is a distraction from a variant of the real problem: the "trusting trust" attack Ken Thompson described in 1984 is still among the most elegant and devastating. Infected development toolchains can spread horizontally to "secure" builds.

Just because it’s open doesn’t mean anyone’s been watching closely. "50 years of security"? Important pillars of OSS have been touched by thousands of contributors with varying levels of oversight. Many commits predate strong code-signing or provenance tracking. If a compiler was compromised at any point, everything it compiled—including future versions of itself—could carry that compromise forward invisibly. This includes even "cleanroom" rebuilds.
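The self-propagation step is easy to model. A toy sketch, where the "compiler" is just a function from source text to a "binary" string, and all names are invented for illustration:

```python
# Toy model of Thompson's "trusting trust" attack. The "compiler" maps
# source text to a "binary" (here just another string); none of these
# names come from a real toolchain.
PAYLOAD = "backdoor();"

def infected_compile(source: str) -> str:
    # Case 1: compiling the login program -> silently insert the backdoor.
    if "def login():" in source:
        return source.replace("def login():", "def login():\n    " + PAYLOAD)
    # Case 2: compiling the compiler itself -> re-insert the whole trick,
    # so even pristine compiler source yields an infected compiler binary.
    if "def compile(" in source:
        return source + "\n# (injected) infected_compile's cases 1 and 2"
    return source

clean_login = "def login():\n    check_password()"
clean_compiler = "def compile(source):\n    return source"

assert PAYLOAD in infected_compile(clean_login)          # backdoored output
assert "(injected)" in infected_compile(clean_compiler)  # attack propagates
```

The point of case 2 is exactly the "carries forward invisibly" property: the backdoor survives a rebuild from clean source, because the rebuild is performed by the infected binary.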

replies(4): >>43616257 #>>43617725 #>>43621870 #>>43622202 #
lrvick ◴[] No.43616257[source]
The best defense we have against the Trusting Trust attack is full source bootstrapping, now done by two distros: Guix and Stagex.
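The property bootstrapping buys can be sketched in a few lines: trust is anchored in a tiny, hand-auditable seed, every later stage is a deterministic function of readable sources, and independent rebuilds from the same seed must agree bit-for-bit. Stage names and the hash-as-build stand-in are hypothetical, not how Guix or Stagex actually build:

```python
# Minimal sketch of full-source bootstrapping: stage N+1 is a
# deterministic function of stage N plus auditable source. We fake the
# "build" with a hash so determinism is trivially visible.
import hashlib

def build(prev_stage: bytes, source: bytes) -> bytes:
    # Stand-in for a real deterministic build step.
    return hashlib.sha256(prev_stage + source).hexdigest().encode()

seed = b"hex0-seed"                          # tiny, hand-auditable root of trust
sources = [b"stage1.c", b"tinycc.c", b"gcc.tar"]

chain = [seed]
for src in sources:
    chain.append(build(chain[-1], src))

# An independent party starting from the same seed and sources must
# reach bit-identical artifacts -- that is the reproducibility check.
chain2 = [seed]
for src in sources:
    chain2.append(build(chain2[-1], src))

assert chain == chain2
```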
replies(2): >>43616330 #>>43625793 #
AstralStorm ◴[] No.43616330[source]
No, you do not. If you have not actually validated each and every source package, your trust extends only to the generated binaries corresponding to the sources you had. The trusting trust attack was deployed against the source code of the compiler, poisoning specific binaries. Do you know whether GCC 6.99 or 7.0 doesn't insert a backdoor under some specific condition?

There's no static or dynamic analysis deployed to enhance this level of trust.

The initial attempts are simulated execution, as in Valgrind; all the sanitizer work; and perhaps diffing at a functional level beyond the text of the source code, where it's too easy to smuggle things through (e.g. on an abstracted conditional graph).

We cannot even compare binaries or executables reliably given differing compiler revisions.

replies(4): >>43616446 #>>43616959 #>>43617254 #>>43618041 #
neuroelectron ◴[] No.43616446[source]
Besides full source bootstrapping (which could adopt progressive verification of hardware features and assume untrusted hardware), integrating formal verification into the lowest levels of bootstrapping is a must. Bootstrap security with the compiler.

This won't protect against more complex attacks like ROP or unverified state. For that we need to implement simple artifacts that are verifiable and mapped: return to simpler return states (pass/error), do error handling external to the compiled binaries, and automate state mapping combined with targeted fuzzing. Systemd is a perfect example of what not to do: internal logs and error states handled by a web of interdependent systems.
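The "simple, verifiable artifact" idea can be sketched briefly: the tool itself exposes only a pass/error exit status, and all interpretation, logging, and state mapping live in an external supervisor that can be audited on its own. Everything named here is illustrative:

```python
# Sketch: the audited binary reports only pass/error via its exit code;
# error handling lives outside it, in a separately auditable supervisor.
import subprocess
import sys

def run_step(cmd: list[str]) -> bool:
    # The tool communicates exactly one bit: success or failure.
    return subprocess.run(cmd).returncode == 0

# External supervisor does the interpretation, not the tool itself.
ok = run_step([sys.executable, "-c", "raise SystemExit(0)"])
print("pass" if ok else "error")
```

The verifiable surface of each binary is then tiny (one exit code), which is what makes automated state mapping over many such artifacts tractable.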

replies(1): >>43616594 #
AstralStorm ◴[] No.43616594[source]
ROP and unverified state would at least be highlighted by such an analysis. Generally it's a lot of work, and we cannot quite trust fully automated systems to flag it for us, especially when some optimizer changes between versions of the compiler. Even a single compile flag can turn the abstract language upside down, much less the execution graph.

Fuzzing is good but probabilistic; it is unlikely to hit a deliberate backdoor. Solid for finding bugs, though.
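The probability argument is concrete: a backdoor gated on an 8-byte magic value hides behind a 1-in-2^64 trigger, so uniformly random inputs essentially never find it. A toy sketch:

```python
# Why random fuzzing rarely trips a deliberate trigger: the guard is a
# single point in a 2^64-sized input space.
import os

MAGIC = b"\xde\xad\xbe\xef\xca\xfe\xba\xbe"  # attacker-chosen 8-byte trigger

def process(data: bytes) -> str:
    if data == MAGIC:          # deliberate, deliberately narrow condition
        return "backdoor"
    return "ok"

hits = sum(process(os.urandom(8)) == "backdoor" for _ in range(100_000))
assert hits == 0               # ~100k/2^64 chance of a hit: effectively never
```

Bugs, by contrast, tend to occupy large regions of the input space (any overlong string, any malformed length field), which is why fuzzing finds them readily.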

replies(1): >>43616968 #
lrvick ◴[] No.43616968[source]
I agree here. Use automated tools to find low-hanging fruit and mistakes.

There is unfortunately no substitute for a coordinated effort to document review of our toolchain sources by capable security researchers.