
596 points pimterry | 10 comments
sam0x17 ◴[] No.36862821[source]
But signing necessarily is happening on the user's device... what is to stop brave/etc from also signing their outgoing requests with the same key your local Chrome install is using? On a mobile device I can see how this would work but how would this ever work on (non-apple) PCs without exposing the key to anyone willing to poke around a bit?
replies(2): >>36863017 #>>36864099 #
1. ReactiveJelly ◴[] No.36863017[source]
I think the idea is, there is a chain of trust from a TPM (So you don't have access to the private key, ever) through the bootloader, OS kernel, Windows Update, and vendor-blessed web browser, to the server.

So Brave would fail when Windows says, "hm, your hash doesn't match any recent Edge version, so you don't get to issue a key signing request to the TPM."

Or it will allow the request but when it arrives at the server as "Windows, non-Edge browser" they'll hit you with the endless CAPTCHAs or just boot you out as a hacker.
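That challenge/verify flow can be sketched as a toy model (all names and hashes here are invented; a real TPM signs with an asymmetric attestation key whose certificate chains to the chip manufacturer, while this sketch uses a shared HMAC key as a stand-in for that trust anchor):

```python
import hashlib
import hmac
import secrets

# Toy model of the chain of trust described above. The HMAC key stands
# in for the TPM's attestation key; software outside the "TPM" never
# reads it.
TPM_KEY = secrets.token_bytes(32)
# The server's allowlist of blessed browser build hashes (made up).
APPROVED = {hashlib.sha256(b"edge-v123-binary").hexdigest()}

def tpm_attest(challenge: bytes, browser_binary: bytes) -> tuple[str, bytes]:
    """The 'TPM' measures the running browser and signs (challenge, measurement)."""
    measurement = hashlib.sha256(browser_binary).hexdigest()
    sig = hmac.new(TPM_KEY, challenge + measurement.encode(), hashlib.sha256).digest()
    return measurement, sig

def server_verify(challenge: bytes, measurement: str, sig: bytes) -> bool:
    """Server checks the signature AND that the measured hash is on the list."""
    expected = hmac.new(TPM_KEY, challenge + measurement.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected) and measurement in APPROVED

challenge = secrets.token_bytes(16)  # fresh server nonce, prevents replay
print(server_verify(challenge, *tpm_attest(challenge, b"edge-v123-binary")))  # True
print(server_verify(challenge, *tpm_attest(challenge, b"brave-binary")))      # False
```

A Brave binary produces a valid signature over an honest measurement, but the measurement isn't on the allowlist, which is exactly the "your hash doesn't match any recent Edge version" failure.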

It's not the web I grew up in.

replies(1): >>36863116 #
2. sam0x17 ◴[] No.36863116[source]
Right, but how does Edge prove itself to the TPM? What's to stop [insert alt browser here] from performing the exact same actions [insert blessed browser here] performs when it interacts with the TPM? It could even emulate a legitimate browser internally for the sake of argument, but it seems like anything could just pretend to be a blessed browser. Sure, you can hash binaries, but you can just as easily mess with their memory space at runtime after the fact, so to the TPM (or whatever system checks the hash) the binary checks out, because all the modifications are side-loaded after the binary runs.

It seems to me like you can only guarantee no tampering in an actually locked down system, like modern mobile devices.

replies(4): >>36863197 #>>36863550 #>>36863712 #>>36863811 #
3. jsnell ◴[] No.36863197[source]
The browser doesn't interface directly with any of the hardware, the operating system does. And the integrity of the operating system can be attested to by the hardware via a chain of trust all the way to the secure bootloader.
replies(1): >>36863505 #
4. sam0x17 ◴[] No.36863505{3}[source]
Yeah, but what's to stop me from spawning a hidden instance of Edge, sending keys etc. to it to get it to visit some page, and using either window sub-classing (to hack its memory space and read the request directly) or a local proxy server to steal the attestation it generates before terminating the request?

Likewise what's to stop you from patching the operating system directly (ok secure boot)

You could also just emulate an entire Windows OS + TPM and have the emulator do it, it sounds like.

Any scenario where I'm allowed to run arbitrary code within the OS with administrator privileges sounds like one you could use to escape this.

replies(1): >>36863786 #
5. alex7734 ◴[] No.36863550[source]
> but you can just as easily mess with their memory space at runtime after the fact

You can only do that because Windows lets you do that. That's something that can change.

> It seems to me like you can only guarantee no tampering in an actually locked down system, like modern mobile devices.

Yes, the whole point of remote attestation is to be able to prove to the other party that your device is running an approved and fully locked down OS+browser combo before it sends you any content.

It does this by putting the code that creates this guarantee in the only place that you can't (easily) change: in the silicon of your CPU.

6. mike_hearn ◴[] No.36863712[source]
Apple's attestation infrastructure works for macOS, so being mobile isn't required.

What is required:

1. Code signing. The operating system must be able to link files together such that it knows they come from one version of one app, and the app must be code signed so there's a precise way to state "this token was issued to Microsoft Edge v123".

2. Debugger API protections. A program must be able to opt out of ptrace and similar APIs. macOS offers this via the "get-task-allow" entitlement.

3. Program file anti-tampering. The OS must stop one app fiddling with the files of another. macOS does this since Ventura (or rather, apps must have the "app management" permission to do so, and the OS can detect when it's been done).

4. Signing-aware IPC. The OS must allow two processes to connect to one another across privilege levels, such that each side is aware of the signing identity of the other.

5. An attestation service. The OS must offer an RPC server that, given challenge bytes, generates and signs a data structure containing the challenge along with the code signing identity, and enough information to link the attestation key to a secure root of trust.

6. The root of trust. Usually a secure processor that's integrated with the motherboard and firmware. It "measures" the boot process and can deterministically derive keys that are only accessible in certain configurations.

7. All the above must either be protected from administrator access, or elevating to admin/fiddling with any of the components must alter the perceived configuration of the system so it can be detected remotely. On Macs this is done via SIP which de-privileges root, and if you disable SIP so you can modify OS files, then the Secure Element won't give you the same attestation as a normal device.
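The "measures the boot process" part of item 6 can be sketched as a toy PCR-extend loop (illustrative only; real TPMs keep many Platform Configuration Registers and hash actual firmware and kernel images, and the stage names below are made up):

```python
import hashlib

# Toy model of TPM-style measured boot: each boot stage extends a PCR
# with its hash. PCRs can only be extended, never set directly, so the
# final value commits to every stage AND their order.

def extend(pcr: bytes, event: bytes) -> bytes:
    # The TPM's extend operation: new_pcr = H(old_pcr || H(event))
    return hashlib.sha256(pcr + hashlib.sha256(event).digest()).digest()

def measure_boot(stages):
    pcr = b"\x00" * 32  # PCRs reset to zeros at power-on
    for stage in stages:
        pcr = extend(pcr, stage)
    return pcr

clean = measure_boot([b"bootloader", b"kernel", b"attestation-ipc-server"])
tampered = measure_boot([b"bootloader", b"patched-kernel", b"attestation-ipc-server"])
print(clean == tampered)  # False: any change anywhere yields a different PCR
```

This is what lets the root of trust "deterministically derive keys that are only accessible in certain configurations": keys sealed against the clean PCR value simply never materialize on a machine that booted anything else.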

The key to all of this is that you don't have to lock down the device to do this. Firstly, you can allow arbitrary changes as long as they get measured and honestly reported. Apple lets you disable SIP and then you can hack macOS to your heart's content; you just can't pretend to Apple that you didn't do it. Secondly, it's only a very small part of the OS that has to be measured: basically the bits that enforce address space isolation and secret protection, so the kernel, the boot process and a few userland IPC servers.

But for example if you install other apps, or customize your OS configuration in most ways, then that's just irrelevant for the purposes of identifying what app is running.

Now, you can build extra and more restrictive rules on top of that. For example once you establish that the TLS session key is owned by a protected app, and that the app is Chrome or Edge or Safari, you can then ask it to answer honestly whether there are certain extensions installed or whether the browser is being automated or whatever else there's a protocol for. But the core infrastructure doesn't know anything about that, its job is over once the core attestation of identity+protection is done.

Windows is (as usual) behind in this tech. The pieces are there but nothing really lines up and nothing is using it; for example, Windows has a notion of package identity and app tamperproofing, but Edge doesn't use it. I don't know of any way to opt out of code injection or debugger APIs on that platform either. But macOS has all the pieces and it hangs together.

Interestingly, so does Linux! There are configurations that can match what macOS does there, and which can also drive the TPM to remotely attest. What's missing is any organizational or community will to audit distributions and figure out which ones implement the criteria above. But if someone were to do that, you could issue TLS-style certificates to the OS that lets third parties reason about the identity of apps on the remote machine.

7. alex7734 ◴[] No.36863786{4}[source]
> You could also just emulate an entire windows OS + TPM and have the emulator do it it sounds like

Yes, but your emulated TPM is not on the approved list. To impersonate an approved TPM you would need to pull the keys from a real TPM which requires (probably very expensive) semiconductor lab tools and trashing the chip.
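A toy version of that approval check (names and batch identifiers invented; real verifiers validate an X.509 endorsement certificate chain to the manufacturer rather than a bare hash allowlist):

```python
import hashlib
import secrets

# The verifier only trusts attestation keys whose endorsement material
# matches a known manufacturer batch. An emulator can mint keys freely,
# but getting one onto this list means extracting the endorsement key
# from a real chip's silicon.
TRUSTED_ENDORSEMENTS = {hashlib.sha256(b"vendor-ek-batch-7").hexdigest()}

def endorsement_trusted(ek_material: bytes) -> bool:
    return hashlib.sha256(ek_material).hexdigest() in TRUSTED_ENDORSEMENTS

print(endorsement_trusted(b"vendor-ek-batch-7"))     # True: key from a real chip
print(endorsement_trusted(secrets.token_bytes(32)))  # False: emulator-minted key
```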

replies(2): >>36867133 #>>36868790 #
8. flangola7 ◴[] No.36863811[source]
The TPM gathers various data about the system, including if any user process is running with access permissions that could tamper with memory space. It trusts the OS and drivers to do this because the entire stack is cryptographically verified from boot onwards. If the environment is one where an app could be spoofed, this will be included in the attestation request and the attest will fail.

You might be able to get around it by finding a zero day in the Windows kernel, but as soon as Microsoft discovers and patches it their attest server will stop providing attestations for devices until they install the OS update and reboot to reestablish a trust chain.
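A toy sketch of that server-side policy (all names and build numbers are invented, not a real Microsoft API): the attest server refuses to vouch for machines until the known-vulnerable kernel build is replaced.

```python
# Hypothetical minimum build containing the kernel fix.
MIN_PATCHED_BUILD = 22631

def grant_attestation(os_build: int, boot_chain_ok: bool,
                      tamper_capable_process: bool) -> bool:
    if not boot_chain_ok:
        return False  # measured boot failed: no trust chain to stand on
    if tamper_capable_process:
        return False  # a process could rewrite another app's memory
    return os_build >= MIN_PATCHED_BUILD  # stale kernels get nothing

print(grant_attestation(22631, True, False))  # True
print(grant_attestation(22000, True, False))  # False until the OS updates
print(grant_attestation(22631, True, True))   # False while tampering is possible
```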

9. sam0x17 ◴[] No.36867133{5}[source]
Such an evil pattern. We need to eliminate this at all costs.

Luckily I think if Chrome were to move forward with this they'd face extreme anti-trust stuff as a result

10. hellojesus ◴[] No.36868790{5}[source]
If you did trash the chip while managing to successfully pull the TPM keys, could you then use that key to sign requests in an unapproved VM, or on metal with a different root TPM?