To make an analogy: my car doesn't have bulletproof glass, so someone could easily shoot it up and I'd be dead. But nobody really goes around shooting up cars, so is it an issue?
Those platform issues may not be a problem for Jane Doe on Windows 10, but when users decide that they need more security than that (and Qubes points in the right direction, although there are still some miles to go), they may have a reason (or just paranoia).
In either case, they won't be very happy with the sad state of x86 "security", because there are far too many places where undue trust in Intel is implied.
E.g., the SGX feature, which can run userland code in a way that even the kernel (or SMM) can't read: the keys are likely mediated by the Management Engine (ME), which also comes with network access and a huge operating system (huge by embedded-system standards: the smallest version is 2 MB) that you, the user, can't get rid of.
So who is SGX protecting you from if you fear involvement by nation-state actors? x86 isn't for you in that case (Intel's version in particular, but pretty much all the alternatives are just as bad) - and that's what this paper points out.
This already suggests that the owner of the CPU isn't who SGX is protecting, but it gets worse (even before we consider the risk from AMT). Starting an SGX enclave seems to require[3] a "launch key" known only to Intel, allowing Intel to control what software is allowed to be protected by SGX.
[1] https://software.intel.com/en-us/blogs/2013/09/26/protecting...
[2] Before the term "DRM" was coined, the same crap used to be called "trusted computing" (back when Microsoft was pushing Palladium/NGSCB).
[3] https://jbeekman.nl/blog/2015/10/intel-has-full-control-over...
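To make the launch-control point concrete, here's roughly what starting an enclave looks like with Intel's SGX SDK (untrusted runtime). A minimal sketch: "enclave.signed.so" is a placeholder for a signed enclave binary, and error handling is trimmed. The detail to notice is the launch token, which historically had to be blessed by an Intel-signed Launch Enclave before the CPU would start your enclave at all - that's the control described in [3].

    /* Minimal sketch of launching an enclave with Intel's SGX SDK
       (untrusted runtime, sgx_urts.h). */
    #include <stdio.h>
    #include <sgx_urts.h>

    int main(void)
    {
        sgx_launch_token_t token = {0};  /* filled in by the launch process */
        int token_updated = 0;
        sgx_enclave_id_t eid = 0;

        sgx_status_t ret = sgx_create_enclave("enclave.signed.so",
                                              SGX_DEBUG_FLAG,
                                              &token, &token_updated,
                                              &eid, NULL);
        if (ret != SGX_SUCCESS) {
            /* e.g. launch refused: no valid token for this enclave */
            fprintf(stderr, "sgx_create_enclave failed: 0x%x\n", ret);
            return 1;
        }

        /* ECALLs into the now-running enclave would go here. */
        sgx_destroy_enclave(eid);
        return 0;
    }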
If I could provide all the keys, my machine could be completely locked down and damn near impossible to break into, even with complete physical access and an ECE degree.
That would require all hardware to be secure against all attackers. As soon as one attacker breaks one hardware model, they can start extracting and selling private keys that allow anyone to emulate that piece of hardware in software.
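To illustrate with a toy model (deliberately fake "crypto", just to show the trust structure): a remote verifier only ever sees a signed quote, so a software emulator holding an extracted device key produces exactly the same accepted quotes as genuine silicon.

    /* Toy model of hardware attestation -- NOT real crypto. The
       "signature" is a plain XOR to keep the example short; a real
       TEE uses ECDSA/EPID, but the trust structure is the same. */
    #include <stdint.h>
    #include <stdio.h>

    typedef struct {
        uint64_t measurement;  /* claimed software state */
        uint64_t tag;          /* "signature" over the measurement */
    } quote_t;

    /* Stand-in for the signing a real TEE does with its fused key. */
    static uint64_t toy_sign(uint64_t msg, uint64_t key) { return msg ^ key; }

    static int verify(const quote_t *q, uint64_t key, uint64_t expected)
    {
        return q->measurement == expected
            && toy_sign(q->measurement, key) == q->tag;
    }

    int main(void)
    {
        const uint64_t device_key = 0xDEADBEEFCAFEF00D;  /* fused into the CPU */
        const uint64_t expected   = 0x1234;              /* approved software */

        /* Genuine hardware produces a quote... */
        quote_t real = { expected, toy_sign(expected, device_key) };

        /* ...and an emulator holding the extracted key produces an
           indistinguishable one. Nothing the verifier can check
           tells them apart. */
        quote_t fake = { expected, toy_sign(expected, device_key) };

        printf("real quote accepted: %d\n", verify(&real, device_key, expected));
        printf("fake quote accepted: %d\n", verify(&fake, device_key, expected));
        return 0;
    }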
I'm also having a hard time seeing the use case. What kind of thing has hard secrecy requirements but demands so much hardware that you can't justify owning it?
The "Protected A/V Path" could be a neat feature for high security computers (consider the GPU driver, a horrible, buggy, complex piece of software, being unable to grab pixels of a security-labelled window) - but that's not what this was built for. SGX, the same.
Non-DRM use cases seem to be an afterthought, if they're possible at all (typically they aren't).