
596 points pimterry | 1 comments
sam0x17 ◴[] No.36862821[source]
But the signing necessarily happens on the user's device... what is to stop Brave etc. from also signing their outgoing requests with the same key your local Chrome install uses? On a mobile device I can see how this would work, but how would this ever work on (non-Apple) PCs without exposing the key to anyone willing to poke around a bit?
replies(2): >>36863017 #>>36864099 #
ReactiveJelly ◴[] No.36863017[source]
I think the idea is, there is a chain of trust from a TPM (So you don't have access to the private key, ever) through the bootloader, OS kernel, Windows Update, and vendor-blessed web browser, to the server.

So Brave would fail when Windows says, "hm, your hash doesn't match any recent Edge version, so you don't get to issue a key signing request to the TPM."

Or it will allow the request, but when it arrives at the server as "Windows, non-Edge browser," they'll hit you with endless CAPTCHAs or just boot you out as a hacker.

It's not the web I grew up in.

replies(1): >>36863116 #
sam0x17 ◴[] No.36863116[source]
Right, but how does Edge prove itself to the TPM? What's to stop [insert alt browser here] from performing the exact same actions [insert blessed browser here] performs when it interacts with the TPM? It could even emulate a legitimate browser internally, for the sake of argument, but it seems like anything could just pretend to be a blessed browser. Sure, you can hash binaries, but you can just as easily mess with their memory space at runtime, after the fact, so to the TPM (or whatever system checks the hash) the binary checks out, because all the modifications are side-loaded after the binary runs.

It seems to me like you can only guarantee no tampering in an actually locked down system, like modern mobile devices.
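The loophole described above can be sketched in a few lines (a toy illustration of why hashing the on-disk binary alone proves nothing about the running process; not any real loader):

```python
import hashlib

# Pretend program image on disk.
binary = b"\x90" * 16
disk_hash = hashlib.sha256(binary).hexdigest()

# A naive integrity check hashes the file at load time and passes...
loaded = bytearray(binary)
assert hashlib.sha256(bytes(loaded)).hexdigest() == disk_hash

# ...but a runtime patch changes behavior without touching the file.
loaded[0:4] = b"\xcc\xcc\xcc\xcc"
assert hashlib.sha256(binary).hexdigest() == disk_hash        # disk unchanged
assert hashlib.sha256(bytes(loaded)).hexdigest() != disk_hash  # memory diverged
```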

replies(4): >>36863197 #>>36863550 #>>36863712 #>>36863811 #
mike_hearn ◴[] No.36863712[source]
Apple's attestation infrastructure works on macOS, so being mobile isn't required.

What is required:

1. Code signing. The operating system must be able to link files together such that it knows they come from one version of one app, and the app must be code signed so there's a precise way to state "this token was issued to Microsoft Edge v123".

2. Debugger API protections. A program must be able to opt out of ptrace and similar APIs. macOS offers this via the "get-task-allow" entitlement.

3. Program file anti-tampering. The OS must stop one app fiddling with the files of another. macOS does this since Ventura (or rather, apps must have the "app management" permission to do so, and the OS can detect when it's been done).

4. Signing-aware IPC. The OS must allow two processes to connect to one another across privilege levels, such that each side is aware of the signing identity of the other.

5. An attestation service. The OS must offer an RPC server that, given challenge bytes, generates and signs a data structure containing the challenge along with the code signing identity, and enough information to link the attestation key to a secure root of trust.

6. The root of trust. Usually a secure processor that's integrated with the motherboard and firmware. It "measures" the boot process and can deterministically derive keys that are only accessible in certain configurations.

7. All the above must either be protected from administrator access, or elevating to admin/fiddling with any of the components must alter the perceived configuration of the system so it can be detected remotely. On Macs this is done via SIP which de-privileges root, and if you disable SIP so you can modify OS files, then the Secure Element won't give you the same attestation as a normal device.
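Pieces 5 and 6 above fit together roughly as follows (a toy sketch: a real attestation service signs with an asymmetric key rooted in the secure element rather than an HMAC, and every field name and value here is invented for illustration):

```python
import hashlib
import hmac
import json

# Stand-in for a key that, in reality, never leaves the secure element.
DEVICE_KEY = b"sealed-in-hardware"  # hypothetical

def attest(challenge: bytes, code_identity: str) -> dict:
    """Build and 'sign' an attestation statement over the server's
    challenge plus the code-signing identity of the requesting app."""
    statement = {
        "challenge": challenge.hex(),
        "identity": code_identity,  # e.g. "Microsoft Edge v123"
    }
    payload = json.dumps(statement, sort_keys=True).encode()
    statement["sig"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return statement

def verify(statement: dict, expected_challenge: bytes) -> bool:
    """Server side: check the signature and that the challenge is ours,
    so a captured statement can't be replayed in a different session."""
    payload = json.dumps(
        {k: v for k, v in statement.items() if k != "sig"}, sort_keys=True
    ).encode()
    good_sig = hmac.compare_digest(
        statement["sig"],
        hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest(),
    )
    return good_sig and statement["challenge"] == expected_challenge.hex()

challenge = b"\x01\x02\x03\x04"
stmt = attest(challenge, "Microsoft Edge v123")
assert verify(stmt, challenge)
assert not verify(stmt, b"\xff\xff\xff\xff")  # wrong session's challenge
```

The challenge bytes are what makes the statement fresh: without them, an attacker could record one valid attestation and replay it forever.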

The key to all of this is that you don't have to lock down the device. Firstly, you can allow arbitrary changes as long as they get measured and honestly reported: Apple lets you disable SIP and then you can hack macOS to your heart's content, you just can't pretend to Apple that you didn't do it. Secondly, only a very small part of the OS has to be measured: basically the bits that enforce address space isolation and secret protection, so the kernel, the boot process, and a few userland IPC servers.
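The "measure, then derive" idea can be sketched as a hash chain over boot components, in the style of TPM PCR extension (a toy sketch in Python; real hardware seals keys to measurement registers rather than deriving them this way, and the component names are made up):

```python
import hashlib

def measure_chain(components: list) -> bytes:
    """Extend a measurement register with each boot component's hash,
    PCR-style: new = H(old || H(component)). Order matters."""
    register = b"\x00" * 32
    for component in components:
        register = hashlib.sha256(
            register + hashlib.sha256(component).digest()
        ).digest()
    return register

def derive_key(register: bytes) -> bytes:
    """Key material only reproducible in this exact configuration."""
    return hashlib.sha256(b"attestation-key" + register).digest()

stock = [b"bootloader v5", b"kernel 23.1", b"ipc-daemons"]
tampered = [b"bootloader v5", b"patched kernel", b"ipc-daemons"]

# Disabling protections (e.g. SIP) changes the measurements, so the
# derived key -- and hence the attestation -- comes out different.
assert derive_key(measure_chain(stock)) != derive_key(measure_chain(tampered))
assert derive_key(measure_chain(stock)) == derive_key(measure_chain(stock))
```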

But for example if you install other apps, or customize your OS configuration in most ways, then that's just irrelevant for the purposes of identifying what app is running.

Now, you can build extra and more restrictive rules on top of that. For example once you establish that the TLS session key is owned by a protected app, and that the app is Chrome or Edge or Safari, you can then ask it to answer honestly whether there are certain extensions installed or whether the browser is being automated or whatever else there's a protocol for. But the core infrastructure doesn't know anything about that, its job is over once the core attestation of identity+protection is done.
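That layering can be sketched as a server-side policy function sitting on top of the core attestation (a toy sketch; the browser names, fields, and decision values are all invented for illustration):

```python
TRUSTED_BROWSERS = {"Chrome", "Edge", "Safari"}

def core_attestation_ok(identity: str) -> bool:
    """Stand-in for verifying the signed identity+protection claim.
    The core infrastructure's job ends here."""
    return identity.split()[0] in TRUSTED_BROWSERS

def policy(identity: str, browser_report: dict) -> str:
    """Extra, more restrictive rules built on top: only consulted once
    the core attestation has established who we're talking to."""
    if not core_attestation_ok(identity):
        return "captcha"
    if browser_report.get("automated"):
        return "deny"
    return "allow"

assert policy("Edge v123", {"automated": False}) == "allow"
assert policy("Edge v123", {"automated": True}) == "deny"
assert policy("Brave v1.56", {"automated": False}) == "captcha"
```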

Windows is (as usual) behind in this tech. The pieces are there, but nothing really lines up and nothing is using them: for example, Windows has a notion of package identity and app tamper-proofing, but Edge doesn't use it. I don't know of any way to opt out of code injection or debugger APIs on that platform either. But macOS has all the pieces, and it hangs together.

Interestingly, so does Linux! There are configurations that can match what macOS does, and that can also drive the TPM to remotely attest. What's missing is any organizational or community will to audit distributions and figure out which ones implement the criteria above. But if someone were to do that, you could issue TLS-style certificates to the OS that let third parties reason about the identity of apps on the remote machine.