    253 points pabs3 | 13 comments
    greatgib ◴[] No.44601921[source]
    It's totally crazy that we have to go through Microsoft to get things signed so that our OS can run on third-party computers, and that Microsoft managed to win this so easily, since it was never seriously challenged.
    replies(7): >>44601962 #>>44602085 #>>44602088 #>>44602288 #>>44602373 #>>44602674 #>>44615523 #
    1. nine_k ◴[] No.44602088[source]
    Basically every x64 computer is intended to be able to run Windows. Hence MS had to be involved, and I suppose nobody else with serious money wanted the burden.

    AFAICT you can still disable Secure Boot in most UEFI firmware, and boot anything you like (or not like, if an attacker tampers with your system).

    replies(3): >>44602233 #>>44602369 #>>44604472 #
    2. oakwhiz ◴[] No.44602233[source]
    We don't even reap the benefits of autocratic decisions from Microsoft in this area: boards still come out with things like messed-up ACPI tables, etc.
    replies(1): >>44602448 #
    3. blkhawk ◴[] No.44602369[source]
    Secure Boot belongs to a class of security measures that, while offering a clear theoretical benefit, in practice falls far short of providing any benefit at all, at least to the user of the system. Its introduction was mostly part of a wider strategy (probably now partially defunct, having failed on mobile x86) to lock down the PC so that the Microsoft Store, and apps purchased through it, would be more secure from the end user. A secondary goal, in my opinion, was better security for phones and tablets running x86, but there the "app store" aspect is even clearer.

    "attacker tampers with your system" does not happen at least in the way you think it does or it does not protect you against meaningful attack at all.

    replies(2): >>44602686 #>>44603806 #
    4. p_l ◴[] No.44602448[source]
    Boards' ACPI etc. is still better than it would be without the "Certified for Windows" program (whatever its name of the hour is).
    5. pdimitar ◴[] No.44602686[source]
    What kinds of attacks was Secure Boot designed to mitigate? Is it the evil maid attack? Or can a program accidentally run with `sudo` indeed screw up your entire boot process and inject rootkits, etc.? Or is it something else?
    replies(3): >>44602757 #>>44603596 #>>44603741 #
    6. jeroenhd ◴[] No.44602757{3}[source]
    Evil maid and rootkits, mostly. It's also part of the trust chain that unlocks an encrypted disk without having to enter a password.

    On Windows, secure boot has worked pretty well when it comes to rootkits. MBR rootkits were trivial to write, but UEFI rootkits require UEFI firmware changes or exploiting the bootloader process itself, both of which are much more complex. If malware uses the Linux shim, the TPM will notice and refuse to provide the Bitlocker key, so your computer won't boot without going to the IT office and asking for the recovery key (which should prompt more investigation).

    replies(1): >>44602945 #
    7. blkhawk ◴[] No.44602945{4}[source]
    That is sorta the rub: the threat profile of "evil maid" is mainly governmental actors, for most people and even for orgs. Your example mostly shows how an org can secure its own devices against casual misuse by unprivileged users. This does not help against any serious attack; it only protects against stuff you generally don't need to worry about.
    8. cesarb ◴[] No.44603596{3}[source]
    > What kinds of attacks was Secure Boot designed to mitigate?

    Boot sector viruses, or their modern equivalents. Basically, anything which injects itself into the boot chain before the antivirus can start; after that point, the antivirus is supposed to be able to stop any malware. That is, they wanted to prevent malware from being able to hide from the antivirus by loading before it.

    replies(1): >>44606590 #
    9. magicalhippo ◴[] No.44603741{3}[source]
    > Or can a program accidentally run with `sudo` indeed screw up your entire boot process and inject rootkits, etc.?

    The more realistic scenario would be exploiting a privilege escalation bug, of which there have been, and will be, plenty on both Windows and Linux.

    10. msgodel ◴[] No.44603806[source]
    Anything that locks you out of your own computer is at absolute best an availability failure, but more often than not it forces you to use compromised system software.
    11. somat ◴[] No.44604472[source]
    MS did not "have" to be involved. The problem is that doing it right is hard: not hard as in "it was tricky to figure out, but once we did, everything works", but hard as in "every single user now has an additional, impossible-to-remember key they have to keep track of, or they get locked out of their system". That is basically the mother of all support nightmares. So Microsoft took the easy (perhaps, realistically, the only) way out: they said "we are not going to have the end user own their keys; we will own the keys".

    Honestly, I wish they (where "they" means whoever designed this whole broken system) had done it right. On first boot you would set up some keys, and now you are your own trust root; when you want Microsoft to manage your system (perfectly reasonable, managing systems is scary), you sign their keys and add them to the store. At a low level it all sort of just works; the problem is that nobody wants to design that user interface. Nobody wants to write the documentation required to explain it to Joe Random User. Nobody wants to run the call center, patiently walking people through a complicated process 24/7, getting them unstuck when they lose their keys, and explaining what a trust root is and why they now have to jump through hoops to set one up.

    I like to believe that had they done it right initially, the UI would have been molded into something that just works, and the client base would likewise have been molded into expecting these key-generation steps. But I am also an optimist, so perhaps not, and it is exactly as scary and thankless a task as I described above. We will never know: Microsoft took the easy way out and said "we will hold the keys", and now you are a serf on your own machine. Theoretically there is a method to install your own keys, and it may even work, but the process is awkward (it was never really meant for mass use) and you are dependent on the vendor caring enough to enable it. Many don't.

    replies(1): >>44637418 #
    12. danudey ◴[] No.44606590{4}[source]
    Another example: a custom kernel build or kernel module that backdoors your system or steals data at the kernel level. Secure boot provides the opportunity for a chain of trust that goes from the firmware manufacturer all the way down to your individual kernel modules.

    The firmware validates the shim. The shim validates the boot loader. The boot loader validates the kernel. The kernel validates the kernel modules.
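A minimal sketch of that hand-off, assuming each stage carries an allow-list of SHA-256 hashes for the stage it loads next (the UEFI db can hold image hashes as well as certificates; in reality these are signature checks on PE binaries, and all names below are made up for illustration):

```python
import hashlib


def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


# Stand-ins for the real boot components.
shim = b"shim image"
bootloader = b"grub image"
kernel = b"kernel image"

# Each stage ships with the hashes it is willing to hand control to,
# analogous to verifying a signature against an embedded certificate.
firmware_db = {sha256(shim)}
shim_allowed = {sha256(bootloader)}
bootloader_allowed = {sha256(kernel)}


def boot(chain):
    """Walk the chain; refuse to continue if any stage fails verification."""
    for allowed, image, name in chain:
        if sha256(image) not in allowed:
            return f"refused to boot: {name} failed verification"
    return "booted"


chain = [
    (firmware_db, shim, "shim"),
    (shim_allowed, bootloader, "bootloader"),
    (bootloader_allowed, kernel, "kernel"),
]
print(boot(chain))  # booted

# A tampered bootloader no longer matches the shim's allow-list.
evil_chain = [
    (firmware_db, shim, "shim"),
    (shim_allowed, b"grub image + rootkit", "bootloader"),
    (bootloader_allowed, kernel, "kernel"),
]
print(boot(evil_chain))  # refused to boot: bootloader failed verification
```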

    Once you have that chain of trust, you can also add in other factors; encrypt your disk using a key, seal the key in the TPM, and lock that key behind validation of the firmware and the boot loader. Your system boots, those different components are measured into the PCRs, and if the combination is correct the key is released and your disk can be decrypted automatically. Now if someone boots your system using a different firmware or boot loader, the TPM won't release the key, and your disk can't be decrypted except by someone with the passphrase, recovery key, etc.
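The measure-and-extend step can be sketched as a toy model (simplified TPM semantics, collapsing the PCR bank to a single register; the real operation extends per-PCR and releases the key via a sealing policy):

```python
import hashlib


def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM-style PCR extend: new = H(old || H(measurement)).
    # Values only accumulate; software can't rewind a PCR to a chosen value.
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()


def measure_boot(components):
    pcr = b"\x00" * 32  # PCRs start zeroed at power-on
    for component in components:
        pcr = extend(pcr, component)
    return pcr


good_boot = [b"firmware", b"shim", b"bootloader"]
sealed_pcr = measure_boot(good_boot)  # policy recorded when the key was sealed
disk_key = b"very secret volume key"


def unseal(current_pcr):
    # The TPM releases the key only if the accumulated measurements
    # match the policy the key was sealed against.
    return disk_key if current_pcr == sealed_pcr else None


print(unseal(measure_boot(good_boot)) is not None)  # True
print(unseal(measure_boot([b"firmware", b"evil shim", b"bootloader"])))  # None
```

Because the extend operation is one-way, a component swapped anywhere in the chain changes the final PCR value and the key stays sealed, which is exactly the "boots to the recovery-key prompt" behavior described above.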

    Without secure boot, you can't trust that those components aren't reporting falsified measurements to the PCRs, lying to the TPM, and getting access to the key to decrypt your disk despite booting from a compromised USB drive. You could, of course, instead encrypt your disk using only a passphrase that you manually enter, but for a lot of users (sadly) that's too complex, and they'll choose not to use disk encryption at all.

    Case in point, TouchID and FaceID are seen as alternatives to using a PIN or passphrase to unlock your iPhone, but they're actually meant as alternatives to not locking your phone at all - a way to make device security transparent enough that everyone will use it. Without a secure chain of trust from the firmware to the kernel, that's not really an option.

    13. tiberious726 ◴[] No.44637418[source]
    Eh, that's basically what we have now with boards where you can delete the MS keys and enroll your own; just with different defaults and no support nightmare.