
596 points by pimterry | 11 comments
toyg ◴[] No.36863175[source]
This might be where the internet really gets forked, as it's been predicted over and over since the '90s.

On one side, we'll have a "clean", authority-sanctioned "corpweb", where everyone is ID'ed to the wazoo; on the other, a more casual "greynet" galaxy of porn and decentralized communities will likely emerge, once all tinkerers get pushed out of corpnet. It could be an interesting opportunity to reboot a few long-lost dreams.

replies(16): >>36863389 #>>36863444 #>>36863448 #>>36863559 #>>36863564 #>>36863569 #>>36863656 #>>36863710 #>>36863719 #>>36863948 #>>36864147 #>>36865104 #>>36865427 #>>36865627 #>>36866079 #>>36871323 #
1. TheNewsIsHere ◴[] No.36863656[source]
The problem I have with this web attestation concept generally is that I really want it _inside_ my shiny SSO-everywhere-Zero-Trust-at-the-edge-mTLS-everywhere business network.

I also kind of want it in the public-cloud-meets-private-use home environment (that is, my Cloudflare Access tunnels and MS365 business tenant I use for private stuff).

I don’t want it to touch my personal browsing experience or to be involved in any way in my personal-use browser environments.

These are effectively opposed desires at this point, and it’s a cat-out-of-the-bag technology.

replies(3): >>36863783 #>>36865952 #>>36873614 #
2. MayeulC ◴[] No.36863783[source]
Are you sure you don't just want client certs?

I can also imagine an IPv7 with ephemeral addresses based on private keys (like on yggdrasil), and a way for the browser to remember keys if the user wants. Sessions would then be authenticated by the "IP address" itself.
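
A minimal sketch of the address idea, assuming something yggdrasil-like where the address is just derived from a hash of the node's public key (the prefix and truncation here are purely illustrative):

    import hashlib
    import ipaddress
    import os

    # Stand-in for a node's long-term public key (e.g. an Ed25519 key).
    public_key = os.urandom(32)

    # Hash the key and use it as the host part of a self-certifying address.
    digest = hashlib.sha256(public_key).digest()
    address = ipaddress.IPv6Address(bytes([0x02]) + digest[:15])

    print(address)  # a server could key sessions off this address, since only
                    # the key holder can prove ownership of it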

replies(1): >>36863901 #
3. mschuster91 ◴[] No.36863901[source]
Client certs don't guarantee that there isn't a rootkit running in kernel space, sniffing session tokens, other credentials, or user-space memory in general.

Attestation does a reasonably good job at that, since you'd now need a kernel or bootloader exploit.

replies(1): >>36864609 #
4. saurik ◴[] No.36864609{3}[source]
A "rootkit in kernel space" already requires a kernel exploit, unless what you are really up in arms about is a lack of a verified boot chain (which absolutely does not require remote attestation).
replies(1): >>36865228 #
5. mschuster91 ◴[] No.36865228{4}[source]
> A "rootkit in kernel space" already requires a kernel exploit

On desktop? Nope, which is the point. Placing a piece of malware is easy without a kernel exploit. On standard Linux distributions that do not use dm-verity and friends, local root is enough - modify the kernel image or initrd in /boot, and you can do whatever you want, with very few ways for a system administrator to detect it upon the next boot. The bigger challenge is getting local root in the first place, especially as a lot of systems now use SELinux or at least have daemons drop privileges.
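
For a concrete sense of how little an admin can do after the fact, here's roughly the naive check one might script (the baseline path is made up) - and why it falls short: once the kernel or initrd is trojaned, anything read back from the running system can be lied to, which is the gap dm-verity and measured boot are meant to close:

    import hashlib
    import json
    import pathlib

    BASELINE = pathlib.Path("/var/lib/boot-hashes.json")  # hypothetical baseline location

    def hash_boot() -> dict[str, str]:
        # Hash every file under /boot (kernel images, initrds, bootloader bits).
        return {
            str(p): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(pathlib.Path("/boot").rglob("*"))
            if p.is_file()
        }

    current = hash_boot()
    if not BASELINE.exists():
        BASELINE.write_text(json.dumps(current, indent=2))
        print("baseline recorded")
    else:
        baseline = json.loads(BASELINE.read_text())
        changed = [p for p, h in current.items() if baseline.get(p) != h]
        print("changed files:", changed or "none")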

Windows is a bit harder, since it has refused to load unsigned drivers since the Win7 x64 days (x86 IIRC didn't mandate the checks), but that's less of a hurdle than most think - just look at the boatload of cases where someone managed to steal (or legitimately acquire) a signing certificate to ship malware. Getting local root here is probably the easiest of the three OSes IMO, given the absurd number of update helpers and other bloatware regularly caught allowing privilege escalation.

The hardest IMO/E is macOS: you have to manually boot into recovery to deactivate SIP, kexts have pretty much been phased out already, and you get a crapton of very strong warnings if you mess around with them - plus you have to load them manually.

With attestation and code signing done right, it's all but impossible to get your code running in kernel space on Linux or macOS without a kernel exploit; the Achilles heel will always be who gets the signing certificates that allow loading a module.

replies(1): >>36866281 #
6. mindslight ◴[] No.36865952[source]
These desires are not mutually opposed!

The fundamental problem with current remote attestation schemes is the corporate-owned attestation key baked in at the factory [0]. This allows the manufacturer to create a known class of attestation keys that correspond to their physical devices, which is what prevents a user from just generating mock attestations when needed.

If manufacturers were prohibited from creating these privileged keys [1], then the uniform-corporate-control attestation fears would mostly vanish, while your use cases would remain.

A business looking to secure employee devices could record the attestation key of each laptop in their fleet. Cloud host auditors could do the same thing to all their hardware. Whereas arbitrary banks couldn't demand that your hardware betray what software you're running, since they'd have no way of tying the attestation key to a known instance of hardware.
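
A toy sketch of that enrollment model, using Ed25519 (via the cryptography package) as a stand-in for whatever the TPM or secure element actually signs with - all names here are made up:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey, Ed25519PublicKey,
    )

    enrolled: dict[str, Ed25519PublicKey] = {}  # asset tag -> attestation key

    def enroll(asset_tag: str, key: Ed25519PublicKey) -> None:
        # IT records the key while physically provisioning the laptop.
        enrolled[asset_tag] = key

    def verify_quote(asset_tag: str, measurement: bytes, signature: bytes) -> bool:
        key = enrolled.get(asset_tag)
        if key is None:
            return False  # unknown hardware: an arbitrary bank lands here
        try:
            key.verify(signature, measurement)
            return True
        except InvalidSignature:
            return False

    # Demo: only a device the fleet owner enrolled themselves can attest.
    device = Ed25519PrivateKey.generate()
    enroll("LAPTOP-0042", device.public_key())
    quote = b"whatever the platform actually measures"
    print(verify_quote("LAPTOP-0042", quote, device.sign(quote)))  # True
    print(verify_quote("LAPTOP-9999", quote, device.sign(quote)))  # False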

(The intuition here is similar to secure boot, and what is required for good user-empowering secure boot versus evil corporate-empowering secure boot. Because they're roughly duals.)

[0] actually it's something like a chained corporate signing key that signs any attestation key generated on the hardware, but same effect.

[1] or if the user could import/export any on-chip attestation keys via a suitable maintenance mode. Exporting would need a significant delay of sitting in maintenance mode to protect against evil maid attacks and the like.

replies(1): >>36866870 #
7. saurik ◴[] No.36866281{5}[source]
As I said: "unless what you are really up in arms about is a lack of a verified boot chain (which absolutely does not require remote attestation)". None of what you are talking about requires the attestation piece, only the verified boot chain (which supports codesign through to whatever layer you wish to protect).

The goal of remote attestation is only to be able to prove to a third party that your device is "secured", which does not benefit the user in any way other than awkward/indirect stuff like the Google proposal's argument that users have a "need" to prove to a website that they saw an ad (to get free content).

replies(1): >>36867420 #
8. TheNewsIsHere ◴[] No.36866870[source]
I agree with you, but I’m not sure that making it taboo/criminal/a regulatory violation/prohibited/etc. for device manufacturers to embed keys at manufacture and enable the resulting attestation capabilities is the right move either.

If I’m Apple, or Google, or Samsung, then I have a genuine interest in device attestation in my own ecosystem for various good reasons. Apple makes extensive use of this capability in servicing, for example. That makes sense to me.

That’s what I mean by a cat-out-of-the-bag technology. Threat actors, counterfeits, and exploits being what they are in this era, it’s almost an inevitability that these capabilities become a sort of generalized device hygiene check. Device manufacturers don’t have to provide these APIs of course, or allow the use of their device attestation mechanisms, but they’d be pressured to by industry anyway. And then we would have something else.

I do like your idea of having the platform bring keys to the table and requiring some kind of admin-privileged action to make them useful. But I wonder: if we had started that way with web attestation, would it inevitably have turned into this anyway?

replies(1): >>36867336 #
9. mindslight ◴[] No.36867336{3}[source]
There are always genuine interests for various good reasons. The problem is that the limitless logic of software creates an all-or-nothing power dynamic. Situations involve multiple parties, and one party's "good reasons" end up creating terrible results for the others. For example, Apple's attestation on hardware they produced now becomes a method to deny you the ability to replace part of your phone with an aftermarket part, or to unjustly deny warranty service for an unrelated problem.

So no, I do not buy the argument that we should just let manufacturers implement increasingly invasive privileged backdoors into the hardware they make, as if it's inevitable. With the mass-production economics of electronics manufacturing, the end result of that road can only be extreme centralization, where a handful of companies outright control effectively all computing devices. If we want to live in a free society, this must not be allowed to happen!

> But I wonder if we had started that way with web attestation, would it inevitably turn into this anyway?

The main threat with web attestation is that a significant number of devices/customers/visitors are presumed to have the capability, so a company can assert that all users must have it, forgoing only a small amount of business (similar to how they've done with snake-oil 2FA and VoIP phone numbers, CAPTCHAs for browsing from less-trackable IPs, etc). So creating some friction, such that most devices don't come with the capability to betray their users by default, would likely be enough to prevent the dynamic from taking off.

But ultimately, the point of being able to export attestation keys from a device is so that the owner of a device can always choose to forgo hardware attestation and perform mock attestations in their place, regardless of having been coerced into enrolling their device into an attestation scheme.

10. mschuster91 ◴[] No.36867420{6}[source]
Verified boot chains are one thing, but say you're a bank and you wish to reduce the rate of people falling victim to malware that uses kernel-level privileges to snoop credentials. The user benefits (at least from your perspective as the bank) by being less exposed to fraud, since the banking website won't even let them enter their credentials on a device that fails attestation.

Either you build a massive database of "known good" combinations of hardware, OS, and kernel module versions with their corresponding TPM checksums, or you leave that job to a third party - and that is what remote attestation is at its core. Apple has it easiest there, since they control everything in the entire path, while Google has to deal with a myriad of device manufacturers.
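
As a toy illustration of what that "known good" database boils down to (the digests are fabricated; a real scheme compares PCR values from a signed TPM quote):

    import hashlib

    def fake_measurement(desc: str) -> str:
        # Placeholder for a real boot measurement (e.g. a PCR digest).
        return hashlib.sha256(desc.encode()).hexdigest()

    # (hardware, OS, kernel) -> expected measurement, maintained by the verifier
    KNOWN_GOOD = {
        ("VendorLaptop 14", "SomeDistro 24.04", "kernel 6.8"):
            fake_measurement("VendorLaptop 14 / SomeDistro 24.04 / 6.8 / pristine"),
    }

    def is_attested(platform: tuple[str, str, str], reported: str) -> bool:
        return KNOWN_GOOD.get(platform) == reported

    # A tampered kernel yields a different measurement and fails the check.
    ok = is_attested(("VendorLaptop 14", "SomeDistro 24.04", "kernel 6.8"),
                     fake_measurement("VendorLaptop 14 / SomeDistro 24.04 / 6.8 / pristine"))
    bad = is_attested(("VendorLaptop 14", "SomeDistro 24.04", "kernel 6.8"),
                      fake_measurement("VendorLaptop 14 / SomeDistro 24.04 / 6.8 / rootkit"))
    print(ok, bad)  # True False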

Note I massively dislike the path that more and more applications take to restrict user freedom, but I do see why corporations find it appealing.

11. hakfoo ◴[] No.36873614[source]
This is a "have your cake and eat it" problem.

You can build devices around being unbreachable and self-attesting. Go build an SBC and sink it in a block of epoxy.

But they also want the appeal of the open, hackable world-- cheap kit that's advancing quickly, commodity technology and infrastructure.

I am actually sort of disappointed we never ended up with a world of special-purpose sealed devices-- put a proper payment terminal on everyone's desk instead of trusting that nobody slapped a keylogger into your browser while you're typing card numbers, for example.