656 points by EthanHeilman | 10 comments
staticassertion:
This is pretty incredible. These aren't just good practices; they're bleeding-edge best practices.

1. No more SMS and TOTP. FIDO2 tokens only.

2. No more unencrypted network traffic - including DNS, which is such a recent development and they're mandating it. Incredible.

3. Context-aware authorization. So not just "can this user access this?" but attestation about device state! That's extremely cutting edge - almost no one does that today.

My hope is that this makes things more accessible. We do all of this today at my company, except where we can't - for example, a lot of our vendors don't offer FIDO2 2FA or WebAuthn, so we're stuck with TOTP.
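For a rough idea of what point 3 means in practice, here's a toy sketch of a context-aware policy check. All names are invented, and a real deployment would verify the device's posture claims cryptographically rather than trust a struct:

    # Hypothetical sketch of context-aware authorization: the decision
    # combines *who* authenticated with *what state* their device attests to.
    from dataclasses import dataclass

    @dataclass
    class DevicePosture:
        disk_encrypted: bool
        os_patch_level: str      # e.g. "2022-01"
        attestation_valid: bool  # signed device-state claim, verified elsewhere

    def authorize(fido2_verified: bool, sensitivity: str, posture: DevicePosture) -> bool:
        if not fido2_verified:    # phishing-resistant 2FA is table stakes
            return False
        if sensitivity == "high": # sensitive resources also need a healthy device
            return (posture.attestation_valid
                    and posture.disk_encrypted
                    and posture.os_patch_level >= "2022-01")
        return True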

c0l0:
I think 3. is very harmful for actual, real-world use of Free Software. If only specific builds of software on a vendor-sanctioned allowlist - entry to which is governed by the signature of a "trusted" party - can meaningfully access networked services, then everyone who compiles their own artifacts (even from completely identical source code) will be excluded from that remote site/service.

Banks and media corporations are doing it today by requiring a vendor-sanctioned Android build/firmware image, attested and allowlisted by Google's SafetyNet (https://developers.google.com/android/reference/com/google/a...), and it will only get worse from here.

Remote attestation really is killing practical software freedom.

seibelj:
Reproducible builds are a thing, though I don't know how widespread they are. I know the Monero project has them built in, so everyone compiles the exact same executable regardless of environment and can verify the hash against the official version: https://github.com/monero-project/monero
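The verification step it enables is trivial - build it yourself, hash both binaries, compare. A sketch (the path and digest are placeholders):

    # Compare a self-compiled binary against the project's published digest.
    import hashlib

    def sha256_of(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    OFFICIAL_DIGEST = "..."  # from the project's signed release notes

    if sha256_of("./my-own-build/monerod") == OFFICIAL_DIGEST:
        print("bit-for-bit identical to the official release")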
nybble41:
Reproducible builds allow the user of the software to verify the version that they are using or installing. They do not, by themselves, allow the sort of remote attestation which would permit a service to verify the context for authentication—the user, or a malicious actor, could simply modify the device to lie about the software being run.

Secure attestation about device state requires something akin to Secure Boot (with a TPM), and in the context of a BYOD environment it precludes the device owner from having full control of their own hardware. Obviously this is not an issue if the organization only permits access to its services from devices it owns, but no organization should have that level of control over devices owned by employees, vendors, customers, or anyone else who requires access to the organization's services.
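To make the first point concrete: if the measurement is self-reported, lying is free. A hypothetical protocol sketch - hardware roots of trust (TPMs, SafetyNet) exist precisely to close this gap:

    # Without hardware attestation, the server can't distinguish these two.
    KNOWN_GOOD_HASH = "ab12..."  # placeholder for the blessed build's digest

    def honest_report(actual_hash: str) -> dict:
        return {"binary_sha256": actual_hash}

    def tampered_report(actual_hash: str) -> dict:
        # A modified client simply claims the known-good hash.
        return {"binary_sha256": KNOWN_GOOD_HASH}

    assert honest_report(KNOWN_GOOD_HASH) == tampered_report("something-else")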

c0l0:
Let me elaborate on the problem I have with remote attestation, regardless of whether I can verify that the signed binary is identical to something I can build on my own.

I use LineageOS on my phone, and do not have Google Play Services installed. The phone only meaningfully interacts with a few of the most basic Google services, like an HTTP server for captive-portal detection on Wi-Fi networks, an NTP server for setting the clock, etc. All other "high-level" services that I am aware of - Mail, Calendaring, Contacts, Phone, Instant Messaging, etc. - are either provided by other parties that I feel more comfortable with, or are services that I host myself.

Now let's assume that I want or have to do online/mobile banking on my phone - that will generally only work with the proprietary app my bank provides. Even if I install their unmodified APK, SafetyNet will not attest my LineageOS-powered phone as "kosher" (or "safe and secure", or "healthy", or whatever Google prefers calling it these days), and the app might refuse to work. As a consequence, I'm effectively unable to use the remote service provided by my bank, because they believe they've got to protect me from the OS/firmware build that I personally chose to use.
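For reference, the server-side check that locks a phone like mine out is roughly this - a sketch that decodes the SafetyNet verdict (a JWS signed by Google; signature verification is omitted here). A LineageOS device without Google's blessing typically fails ctsProfileMatch even when basicIntegrity passes:

    import base64, json

    def safetynet_verdict(jws: str) -> dict:
        payload = jws.split(".")[1]
        payload += "=" * (-len(payload) % 4)  # restore base64 padding
        return json.loads(base64.urlsafe_b64decode(payload))

    def device_allowed(jws: str) -> bool:
        v = safetynet_verdict(jws)
        return v.get("ctsProfileMatch", False) and v.get("basicIntegrity", False)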

Sure, "just access their website via the browser, and do your banking on their website instead!", you might say, and you'd be right for now. But with remote attestation broadly available, what prevents anyone from also using that for the browser app on my phone, esp. since browser security is deemed so critical these days? I happen to use Firefox from F-Droid, and I doubt any hypothetical future SafetyNet attestation routine will have it pass with the same flying colors that Google's own Chrome from the Play Store would. I'm also certain that "Honest c0l0's Own Build of Firefox for Android" wouldn't get the SafetyNet seal of approval either, and with that I'd be effectively shut off from interacting with my bank account from my mobile phone altogether. The only option I'd have is to revert back to a "trusted", "healthy" phone with a manufacturer-provided bootloader, firmware image, and the mandatory selection of factory-installed, non-removable crapware that I am never going to use and/or (personally) trust that's probably exfiltrating my personal data to some unknown third parties, sanctified by some few hundreds of pages of EULA and "Privacy" Policy.

With app stores on all mainstream and commercially successful desktop OSes, the recent Windows 11 "security and safety" advances Microsoft introduced by (as of today, apparently still mildly) requiring TPM support, and Microsoft supplying manufacturers with "secure enclave"-style add-on chips of its own design ("Pluton", see https://www.techradar.com/news/microsofts-new-security-chip-...), I can see this happening to desktop computing as well. Then I could probably still compile all the software I want on my admittedly fringe GNU/Linux system (or let the Debian project compile it for me), but it wouldn't matter much - any online service that isn't made by and for software-freedom enthusiasts/zealots would refuse to interact with the non-allowlisted software builds on my machine.

It's shaping up to be the future that NoTCPA et al. campaigned against in the early 00s, and I really do dread it.

InitialLastName:
> no organization should have that level of control over devices owned by employees, vendors, customers, or anyone else who requires access to the organization's services.

It seems like the sensible rule of thumb is: If your organization needs that level of control, it's on your organization to provide the device.

kelnos:
I hadn't thought about extending this attestation to the browser build as a way to lock down web banking access. That's truly scary, as my desktop Linux build of Firefox might not qualify if this sort of thing comes to pass.
jacobr1:
Or we could better adopt secure/confidential computing enclaves. This would allow the organization to control the siloed apps and validate some degree of security (protection against code tampering, memory encryption, etc.), without needing to trust that other apps on the device - or even the OS - weren't compromised.
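Roughly, the relying party would pin a measurement of the enclave code instead of trusting the whole device. A toy sketch with invented names - real schemes (SGX remote attestation, etc.) also involve a hardware-rooted signature over the report:

    from dataclasses import dataclass

    @dataclass
    class AttestationReport:
        code_measurement: str  # hash of the code loaded into the enclave
        signature_valid: bool  # vendor signature over the report, checked elsewhere

    TRUSTED_MEASUREMENTS = {"9f8e..."}  # enclave builds the org has audited

    def enclave_trusted(report: AttestationReport) -> bool:
        # The host OS and other apps stay outside the trust boundary;
        # only the measured enclave code has to match.
        return report.signature_valid and report.code_measurement in TRUSTED_MEASUREMENTS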
wizzwizz4:
I'm uncomfortable letting organisations have control over the software that runs on my hardware. (Or, really, any hardware I'm compelled to use.)

Suppose the course I've been studying for the past three years now uses $VideoService, but $VideoService uses remote attestation and gates the videos behind a retinal scan, ten distinct fingerprints, the last year's GPS history and the entire contents of my hard drive?¹ If I could spoof the traffic to $VideoService, I could get the video anyway, but every request is signed by the secure enclave. (I can't get the video off somebody else, because it uses the webcam to identify when a camera-like object is pointed at the screen. They can't bypass that, because of the remote attestation.)

If I don't have ten fingers, and I'm required to scan ten fingerprints to continue, and I can't send fake data because my computer has betrayed me, what recourse is there?

¹: exaggeration; no real-world company has quite these requirements, to my knowledge

chaxor:
Wow, the Monero project looks like they have some great ideas. I like this reproducible-build approach - I may try to get my team to work towards that. It seems like Monero has more of a focus on use as a real currency, so hopefully it isn't drawing in the speculators and maintains its real use.
nybble41:
Secure enclaves are still dependent on someone other than the owner (usually the manufacturer) having ultimate control over the device. Otherwise the relying party has no reason to believe that the enclave is secure.
jacobr1:
So there are two levels of tradeoffs:

1) The requirements themselves. These are different for consumer vs. employee scenarios. In general, I'd prefer we err on the side of DRM-free for things like media, but there are legitimate concerns around things like data privacy when you are an employee of an organization handling sensitive data.

2) Presuming there are legitimate reasons to have strong validation of the user and untampered software, we have the choice of A) using only organization-supplied hardware in those cases, or B) using your own with some kind of restriction. I'd much prefer to use my own as much as possible... if I can be assured that it won't spy on me or limit what I can do outside the organization-specific purposes I've explicitly opted in to enable.

> I'm uncomfortable letting organisations have control over the software that runs on my hardware.

I'm not, if we can sandbox. I'm fine with organizations running JavaScript in my browser, for instance, or running mobile apps that can access certain data with explicit permissions (like granting access to my photos so that I can share them in-app). I think we can do better with more granular permissions, better UX, and cryptographic guarantees to both the user and the organization that the computation and data are operating at the agreed level.
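Something like this deny-by-default grant model, as a toy sketch (all names invented):

    # The organization's code sees only what the user explicitly opted in to.
    GRANTS = {"photos": True, "location": False}
    DATA = {"photos": ["img1.jpg"], "location": (52.5, 13.4)}

    def read_resource(name: str):
        if not GRANTS.get(name, False):
            raise PermissionError(f"no user grant for {name!r}")
        return DATA[name]

    print(read_resource("photos"))  # allowed: explicit grant
    # read_resource("location")    # raises PermissionError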