Such keys should be in the hands of users, not Intel.
I agree that master private keys are bad security design, and we can and should do better. I'm just not willing to say that all past security value is retroactively nullified. That feels more polemical than realistic.
Security is about tradeoffs, most notably security vs convenience, but also many others.
Anyone who suggests that their personal preferences in these tradeoffs are not just universally correct but also the only reasonable position to hold is being silly.
Real but temporary security -> This 2048-bit key you generated will provide commercial-grade protection until at least 2030. Sometime after that, computers will be strong enough to brute-force it, so do not protect anything with this key that will still be highly sensitive in 7 years. It's also possible that the underlying algorithm gets cracked, or that a leap in quantum computing happens, making the key obsolete sooner.
Security theater -> All software running on this chip must be signed with our master key. Please trust all software we sign with this key, and trust that no malicious party will ever have access to it. You are not allowed to run arbitrary software on your own hardware because it is not signed with our key.
In the first case, the security is real: you own the lock, you own the key, and you control the entire security process. In the second case, you own neither the lock nor the key, and you have only limited access to your own hardware.
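For the first case, here's roughly what "owning the lock and the key" looks like in practice, as a minimal sketch using Python's cryptography package (the library choice and the passphrase are my own illustration, not anything from the warning label above):

```python
# Generate a 2048-bit RSA key pair locally; the private key never leaves
# your machine. Per common guidance (e.g. NIST), 2048-bit RSA is considered
# adequate only until roughly 2030, so don't use it for secrets that must
# stay confidential much longer than that.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Store the private key encrypted with a passphrase you control.
pem = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.BestAvailableEncryption(b"my-passphrase"),
)
print(pem.decode())
```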
Part of the blame, imo, lies with how clunky tools are at the lower levels. I've seen plenty of hardware based signing protocols that don't allow for key hierarchies.
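For what it's worth, a key hierarchy isn't conceptually hard. Here's a minimal two-level sketch using Ed25519 via Python's cryptography package (all names are illustrative; this is not any particular vendor's signing protocol):

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

# Root key: kept offline, only ever used to sign intermediate public keys.
root_key = ed25519.Ed25519PrivateKey.generate()
root_pub = root_key.public_key()

# Intermediate key: used day to day, replaceable without touching the root.
intermediate_key = ed25519.Ed25519PrivateKey.generate()
intermediate_pub_bytes = intermediate_key.public_key().public_bytes(
    serialization.Encoding.Raw, serialization.PublicFormat.Raw
)
intermediate_cert = root_key.sign(intermediate_pub_bytes)  # a bare-bones "certificate"

# Sign a firmware image with the intermediate key only.
firmware = b"firmware-image-v1.2"
firmware_sig = intermediate_key.sign(firmware)

# A verifier that trusts only the root public key can still check the chain.
root_pub.verify(intermediate_cert, intermediate_pub_bytes)  # raises if forged
intermediate_pub = ed25519.Ed25519PublicKey.from_public_bytes(intermediate_pub_bytes)
intermediate_pub.verify(firmware_sig, firmware)             # raises if forged
print("chain verifies: root -> intermediate -> firmware")
```

The point being that the intermediate key can be rotated or revoked without the root ever leaving cold storage, which is exactly what many hardware signing protocols make impossible.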
Higher level tools push this along as well. HashiCorp Vault, last I checked, also can't act as a front end to an HSM: you can store the master unseal key for a Vault in an HSM, but all of the keys Vault works with will still be in Vault, in memory.
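To illustrate where the working keys live, here's a sketch with the hvac client against a dev-mode Vault (the address, token, and key name are made up). The transit key created here is generated and held inside Vault's own process memory; an HSM-backed auto-unseal would only protect the key that unseals Vault, not this one:

```python
import base64

import hvac  # HashiCorp Vault API client

client = hvac.Client(url="http://127.0.0.1:8200", token="dev-root-token")

# Enable the transit secrets engine and create a named encryption key.
client.sys.enable_secrets_engine(backend_type="transit")
client.secrets.transit.create_key(name="app-key")

# Encrypt data without the key ever leaving Vault -- but the key itself
# lives in Vault's memory, not in an HSM.
resp = client.secrets.transit.encrypt_data(
    name="app-key",
    plaintext=base64.b64encode(b"some secret").decode(),
)
print(resp["data"]["ciphertext"])  # e.g. vault:v1:...
```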
IT admins are thrilled to have limited access to their own hardware, as long as adversaries do too.
In corporate IT, the greatest fear is insider attacks, whether deliberate or simply because, statistically, some users will inevitably make mistakes. Secure boot is fantastic in this context, even if it feels like an unreasonable impingement to gamers / tech enthusiasts.
For most people there are, in fact, no legitimate reasons to run "their own" software on "general purpose" (read: household appliance) computing hardware. Almost nobody runs custom software on their washing machine or toaster.