656 points by EthanHeilman | 124 comments
staticassertion ◴[] No.30102061[source]
This is pretty incredible. These aren't just good practices, they're the fairly bleeding edge best practices.

1. No more SMS and TOTP. FIDO2 tokens only.

2. No more unencrypted network traffic - including DNS, which is such a recent development and they're mandating it. Incredible.

3. Context aware authorization. So not just "can this user access this?" but attestation about device state! That's extremely cutting edge - almost no one does that today.

My hope is that this makes things more accessible. We do all of this today at my company, except where we can't - for example, a lot of our vendors don't offer FIDO2 2FA or webauthn, so we're stuck with TOTP.
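To make point 2 concrete, here's a minimal sketch of an encrypted DNS lookup over DoH, using Cloudflare's public JSON API (endpoint and fields to the best of my recollection):

  import requests

  # Resolve a name over HTTPS (DoH) instead of plaintext UDP port 53.
  resp = requests.get(
      "https://cloudflare-dns.com/dns-query",
      params={"name": "example.com", "type": "A"},
      headers={"accept": "application/dns-json"},
      timeout=5,
  )
  resp.raise_for_status()
  for answer in resp.json().get("Answer", []):
      print(answer["name"], answer["data"])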

replies(15): >>30103088 #>>30103131 #>>30103846 #>>30104022 #>>30104121 #>>30104716 #>>30104840 #>>30105344 #>>30106941 #>>30107798 #>>30108481 #>>30108567 #>>30108916 #>>30111757 #>>30112413 #
1. c0l0 ◴[] No.30104121[source]
I think 3. is very harmful for actual, real-world use of Free Software. If only specific builds of software that are on a vendor-sanctioned allowlist, governed by the signature of a "trusted" party to grant them entry to said list, can meaningfully access networked services, all those who compile their own artifacts (even from completely identical source code) will be excluded from accessing that remote site/service.

Banks and media corporations are doing it today by requiring a vendor-sanctioned Android build/firmware image, attested and allowlisted by Google's SafetyNet (https://developers.google.com/android/reference/com/google/a...), and it will only get worse from here.

Remote attestation really is killing practical software freedom.

replies(16): >>30104148 #>>30104166 #>>30104241 #>>30104603 #>>30105136 #>>30106352 #>>30106792 #>>30107048 #>>30107250 #>>30107515 #>>30108070 #>>30108409 #>>30108716 #>>30108754 #>>30109550 #>>30123243 #
2. seibelj ◴[] No.30104148[source]
Reproducible builds are a thing, I don't know how widespread they are. I know the monero project has that built in so everyone compiles the exact same executable regardless of environment, and can verify the hash against the official version https://github.com/monero-project/monero
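The verification step is just a hash comparison once you've reproduced the build. Something like this (file names and hash are placeholders):

  import hashlib

  def sha256_of(path):
      h = hashlib.sha256()
      with open(path, "rb") as f:
          for chunk in iter(lambda: f.read(1 << 20), b""):
              h.update(chunk)
      return h.hexdigest()

  official_hash = "..."  # published alongside the official release
  if sha256_of("monerod-built-from-source") == official_hash:
      print("reproduced: local build matches the official binary")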
replies(3): >>30104553 #>>30104740 #>>30107844 #
3. Signez ◴[] No.30104166[source]
Let's note that this very concerning problem is only one if organizations take an allowlist approach to this "context aware authorization" requirement.

Detecting changes — and enforcing escalation in that case — can be enough, e.g. "You always use Safari on macOS to connect to this restricted service, but now you are using Edge on Windows? Weird. Let's send an email to a relevant person / ask for an MFA confirmation or whatever."
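In rough Python, a detect-and-escalate check could look like this (a sketch; names are made up, not any real product's API):

  KNOWN_CONTEXTS = {"alice": {("Safari", "macOS")}}

  def on_login(user, browser, os_name):
      if (browser, os_name) in KNOWN_CONTEXTS.get(user, set()):
          return "allow"
      # Unfamiliar context: escalate instead of denying outright.
      print(f"alert: {user} seen on {browser}/{os_name}")
      return "require_mfa"

  # on_login("alice", "Edge", "Windows") -> "require_mfa"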

replies(4): >>30104681 #>>30107064 #>>30107180 #>>30108251 #
4. reginaldo ◴[] No.30104241[source]
It depends on the level of attestation required. A simple client certificate should suffice for the majority of the non-DoD applications.
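For illustration, requiring and verifying a client cert is a few lines with Python's standard library (paths are placeholders):

  import socket
  import ssl

  ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
  ctx.load_cert_chain("server.crt", "server.key")
  ctx.load_verify_locations("client-ca.crt")  # CA that issued client certs
  ctx.verify_mode = ssl.CERT_REQUIRED         # no client cert, no connection

  with socket.create_server(("0.0.0.0", 8443)) as srv:
      conn, _ = srv.accept()
      with ctx.wrap_socket(conn, server_side=True) as tls:
          print("client cert subject:", tls.getpeercert()["subject"])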
replies(1): >>30105519 #
5. nybble41 ◴[] No.30104553[source]
Reproducible builds allow the user of the software to verify the version that they are using or installing. They do not, by themselves, allow the sort of remote attestation which would permit a service to verify the context for authentication—the user, or a malicious actor, could simply modify the device to lie about the software being run.

Secure attestation about device state requires something akin to Secure Boot (with a TPM), and in the context of a BYOD environment precludes the device owner having full control of their own hardware. Obviously this is not an issue if the organization only permits access to its services from devices it owns, but no organization should have that level of control over devices owned by employees, vendors, customers, or anyone else who requires access to the organization's services.
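A toy sketch of why self-reported measurements aren't attestation:

  import hashlib

  EXPECTED = hashlib.sha256(b"official build").hexdigest()

  def honest_client():
      return hashlib.sha256(b"official build").hexdigest()

  def lying_client():
      # Runs modified code, but reports the expected hash anyway.
      return EXPECTED

  # The server sees identical reports; without a hardware root of trust,
  # nothing binds the report to the software actually running.
  assert honest_client() == lying_client()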

replies(1): >>30105074 #
6. shadowgovt ◴[] No.30104603[source]
> If only specific builds of software that are on a vendor-sanctioned allowlist

Yes, but for government software this is a bog-standard approach. Not even "the source code is publicly viewable to everyone" is sufficient scrutiny to pass government security muster; specific code is what gets cleared, and modifications to that code must also be cleared.

7. ◴[] No.30104681[source]
8. c0l0 ◴[] No.30104740[source]
Let me elaborate on the problem I have with remote attestation, even when I can verify that the signed binary is identical to something I can build on my own.

I use LineageOS on my phone, and do not have Google Play Services installed. The phone only meaningfully interacts with a very few and most basic Google services, like an HTTP server for captive portal detection on Wifi networks, an NTP server for setting the clock, etc. All other "high-level" services that I am aware of, like Mail, Calendaring, Contacts, Phone, Instant Messaging, etc., are either provided by other parties that I feel more comfortable with, or that I actually host myself.

Now let's assume that I would want or have to do online/mobile banking on my phone - that will generally only work with the proprietary app my bank provides me with. Even if I choose to install their unmodified APK, SafetyNet will not attest my LineageOS-powered phone as "kosher" (or "safe and secure", or "healthy", or whatever Google prefers calling it these days), and the app might refuse to work. As a consequence, I'm effectively unable to interact with the remote service provided by my bank, because they believe they've got to protect me from the OS/firmware build that I personally chose to use.

Sure, "just access their website via the browser, and do your banking on their website instead!", you might say, and you'd be right for now. But with remote attestation broadly available, what prevents anyone from also using that for the browser app on my phone, esp. since browser security is deemed so critical these days? I happen to use Firefox from F-Droid, and I doubt any hypothetical future SafetyNet attestation routine will have it pass with the same flying colors that Google's own Chrome from the Play Store would. I'm also certain that "Honest c0l0's Own Build of Firefox for Android" wouldn't get the SafetyNet seal of approval either, and with that I'd be effectively shut off from interacting with my bank account from my mobile phone altogether. The only option I'd have is to revert back to a "trusted", "healthy" phone with a manufacturer-provided bootloader, firmware image, and the mandatory selection of factory-installed, non-removable crapware that I am never going to use and/or (personally) trust that's probably exfiltrating my personal data to some unknown third parties, sanctified by some few hundreds of pages of EULA and "Privacy" Policy.

With app stores on all mainstream and commercially successful desktop OSes, the recent Windows 11 "security and safety"-related "advances" Microsoft introduced by (as of today, apparently still mildly) requiring TPM support, and supplying manufacturers with "secure enclave"-style add-on chips of their own design ("Pluton", see https://www.techradar.com/news/microsofts-new-security-chip-...), I can see this happening to desktop computing as well. Then I can probably still compile all the software I want on my admittedly fringe GNU/Linux system (or let the Debian project compile it for me), but it won't matter much - because any interaction with the "real" part of the world online that isn't made by and for software freedom enthusiasts/zealots will refuse to interact with the non-allowlisted software builds on my machine.

It's going to be the future that NoTCPA et al. fought against in the early 00s, and I really do dread it.

replies(1): >>30105566 #
9. InitialLastName ◴[] No.30105074{3}[source]
> no organization should have that level of control over devices owned by employees, vendors, customers, or anyone else who requires access to the organization's services.

It seems like the sensible rule of thumb is: If your organization needs that level of control, it's on your organization to provide the device.

replies(1): >>30106523 #
10. tablespoon ◴[] No.30105136[source]
>> 3. Context aware authorization. So not just "can this user access this?" but attestation about device state! That's extremely cutting edge - almost no one does that today.

> I think 3. is very harmful for actual, real-world use of Free Software. If only specific builds of software that are on a vendor-sanctioned allowlist, governed by the signature of a "trusted" party to grant them entry to said list, can meaningfully access networked services, all those who compile their own artifacts (even from completely identical source code) will be excluded from accessing that remote side/service.

Is that really a problem? In practice wouldn't it just mean you can only use employer-provided and certified devices? If they want to provide their employees some Free Software-based client system, that configuration would be on the whitelist.

replies(3): >>30106237 #>>30107608 #>>30113041 #
11. kelnos ◴[] No.30105519[source]
It "should" suffice, but entities like banks and media companies are already going beyond this. As the parent points out, many financial and media apps on Android will just simply not work if the OS build is not signed by a manufacturer on Google's list. Build your own Android ROM (or even use a build of one of the popular alternative ROMs) and you lose access to all those apps.
replies(3): >>30105961 #>>30107238 #>>30110457 #
12. kelnos ◴[] No.30105566{3}[source]
I hadn't thought about extending this attestation to the browser build as a way to lock down web banking access. That's truly scary, as my desktop Linux build of Firefox might not qualify, if this sort of thing would come to pass.
13. ethbr0 ◴[] No.30105961{3}[source]
The clearer way to put this is: when faced with a regulatory requirement, most of the market will choose whatever pre-packaged solution most easily satisfies the requirement.

In the case of client attestation, this is how we get "Let Google/Apple/Microsoft handle that, and use what they produce."

And as an end state, this leads to a world where large, for-profit companies provide the only whitelisted solutions, because they have the largest user bases and offer a turn-key feature, and the market doesn't want to do additional custom work to support alternatives.

replies(1): >>30108879 #
14. shbooms ◴[] No.30106237[source]
I think from the viewpoint of a business/enterprise environment, yes you're right, context-aware authorization is a good thing.

But I think the point of your parent comment's reply was that the inevitable adoption of this same technology in the consumer-level environment is a bad thing. Among other things, it will allow big tech companies to have a stronger grip on what software/platforms are OK to use/not use.

If your employer forces you to, say, only use a certain version of Windows as your OS in order to do your job, that's generally acceptable to most people.

But if your TV streaming provider tells you that you have to use a certain version of Windows to consume their product, that's not considered acceptable by a good deal of people.

replies(5): >>30106924 #>>30109468 #>>30109782 #>>30109940 #>>30116202 #
15. wyldfire ◴[] No.30106352[source]
> I think 3. is very harmful for actual, real-world use of Free Software.

It has been a long, slow but steady march in this direction for a while [1]. Eventually we will also bind all network traffic to the individual human(s) responsible. 'Unlicensed' computers will be relics of the past.

[1] https://boingboing.net/2012/01/10/lockdown.html

replies(2): >>30107818 #>>30112604 #
16. jacobr1 ◴[] No.30106523{4}[source]
Or we could better adopt secure/confidential computing enclaves. This would allow the organization to have control over the silo'd apps and validate some degree of security (code tampering, memory encryption, etc) but not need to trust that other apps on the device or even the OS weren't compromised.
replies(2): >>30107746 #>>30108200 #
17. nonameiguess ◴[] No.30106792[source]
In practice, the DoD right now uses something called AppGate, which downloads a script on-demand to check for device compliance. It supports free software distributions, but the script isn't super sophisticated: it relies heavily on being able to detect the OS flavor and assumes you're using the blessed package manager, so right now it only works for Debian- and RedHat-descended Linux flavors. It basically just goes down a checklist of STIG guidelines where they are practical to actually check, and doesn't go anywhere near the level of expecting you to have a signed bootloader and a TPM, or checking that all of the binaries on your device have been signed.
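The OS-flavor detection in that kind of checklist script is roughly this (my sketch, not AppGate's actual code):

  def os_release():
      info = {}
      with open("/etc/os-release") as f:
          for line in f:
              if "=" in line:
                  key, value = line.rstrip().split("=", 1)
                  info[key] = value.strip('"')
      return info

  info = os_release()
  family = info.get("ID_LIKE", info.get("ID", ""))
  if "debian" in family:
      pkg_query = ["dpkg", "-l"]   # blessed package manager: dpkg/apt
  elif "rhel" in family or "fedora" in family:
      pkg_query = ["rpm", "-qa"]   # blessed package manager: rpm/yum
  else:
      raise SystemExit("no checklist for this OS flavor")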
18. btbuilder ◴[] No.30106924{3}[source]
I think browser-based streaming is the only scenario impacted. Apps can already interrogate their platform and make play/no play decisions.

They are also already limiting (weakly) the max number of devices that can play back, which requires some level of device identification, just not at the confidence required for authentication.

replies(2): >>30107126 #>>30109211 #
19. alksjdalkj ◴[] No.30107048[source]
Totally locking down a computer to just a pre-approved set of software is a huge step towards securing it from the kind of attackers most individuals, companies, and governments are concerned with. Sacrificing "software freedom" for that kind of security is a trade off that the vast majority of users will be willing to make - and I think the free software community will need to come to terms with that fact at some point and figure out what they want to do about it.
replies(3): >>30107533 #>>30107566 #>>30109316 #
20. dathinab ◴[] No.30107064[source]
If something like that is good enough to fulfill the requirements, that would be good.

Some services already do something like that (Discord, I think).

21. dathinab ◴[] No.30107126{4}[source]
Well, the fact that I can't do credit card payments for some banks if I don't have an iPhone or a non-rooted Google Android phone is a problem which already exists.

Worse, this is supposedly for security, but attackers who have pulled off a privilege escalation tend to have enough ways to make sure that none of this detection finds them.

In the end it just makes sure you can't mess with your own credit card 2FA process, by not allowing you to control the device you own.

replies(3): >>30107589 #>>30108288 #>>30109271 #
22. hansvm ◴[] No.30107180[source]
Somebody made the front page here a few days ago because they were locked out of Google with no recourse from precisely that kind of check.
replies(3): >>30107937 #>>30108092 #>>30108533 #
23. bigiain ◴[] No.30107238{3}[source]
I’m not even so sure I’m totally against banks doing that either.

From where I sit right now, I have within arm's reach my MacBook, a Win11 Thinkpad, half a dozen Raspberry Pis (including a 400), 2 iPhones (only one of which is rooted), an iPad (unrooted), a Pinebook, a Pine Phone, and 4 Samsung phones: one with its stock Android 7 EOLed final update and three rooted/jailbroken with various Lineage versions. I have way way more devices running open source OSen than unmolested Apple/Microsoft/Google(+Samsung) provided software.

My unrooted iPhone is the only one of them I trust to have my banking app/creds on.

I'd be a bit pissed if Netflix took my money but didn't run where I wanted it, but they might be doing that already; I only ever really use it on my AppleTV and my iPad. I expect I'd be able to use it on my MacBook and Thinkpad, but could be disappointed, and I'd be a bit surprised if it ran on any of my other devices listed…

replies(3): >>30108219 #>>30109286 #>>30156528 #
24. reilly3000 ◴[] No.30107250[source]
How could a build be verified to be the same code without some kind of signature? You can't just validate a SHA; that could be faked from a client.

If you want to get a package that is in the Arch core/ repo, doesn't that require a form of attestation?

I just don't see a slippery slope towards dropping support for unofficial clients; we're already at the bottom, where they are generally and actively rejected for various reasons.

Still, the Android case is admittedly disturbing, it feels a lot more personal to be forced to use certain OS builds; that goes beyond the scope of how I would define a client.

replies(2): >>30108324 #>>30111229 #
25. lupire ◴[] No.30107515[source]
"Software freedom" doesn't really make sense when the software's function is "using someone else's software". You're stil at the mercy of the server (which is why remote attestation is even interesting in the first place).

If you want to use free software, only connect to Affero GPL services, don't use nonfree services, and don't consume nonfree content.

replies(1): >>30107575 #
26. lupire ◴[] No.30107533[source]
Free software doesn't really work in a networked untrusted world.
replies(2): >>30109008 #>>30110476 #
27. pdonis ◴[] No.30107566[source]
> Totally locking down a computer to just a pre-approved set of software is a huge step towards securing it from the kind of attackers most individuals, companies, and governments are concerned with.

No, it isn't. It's a way for corporations and governments to restrict what people can do with their devices. That makes sense if you're an employee of the corporation or the government, since organizations can reasonably expect to restrict what their employees can do with devices they use for work, and I would be fine with using a separate device for my work than for my personal computing (in fact that's what I do now). But many scenarios are not like that: for example, me connecting with my bank's website. It's not reasonable or realistic to expect that to be limited to a small set of pre-approved software.

The correct way to deal with untrusted software on the client is to just...not trust the software on the client. Which means you need to verify the user by some means that does not require trusting the software on the client. That is perfectly in line with the "zero trust" model advocated by this memo.
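A sketch of that idea: have the user prove possession of a key over a fresh challenge, so no claim made by the client software needs to be believed. (Illustrative, using the third-party `cryptography` package; FIDO2 wraps the same principle in more protocol.)

  import os
  from cryptography.exceptions import InvalidSignature
  from cryptography.hazmat.primitives.asymmetric.ed25519 import (
      Ed25519PrivateKey,
  )

  device_key = Ed25519PrivateKey.generate()    # lives in the user's token
  registered_pubkey = device_key.public_key()  # stored at enrollment

  challenge = os.urandom(32)                   # fresh nonce per login
  signature = device_key.sign(challenge)       # computed on the token

  try:
      registered_pubkey.verify(signature, challenge)
      print("login ok: user proved possession of the key")
  except InvalidSignature:
      print("login rejected")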

28. wizzwizz4 ◴[] No.30107575[source]
You're usually only at the mercy of the server because you don't control the client; a free YouTube client like VLC remains useful. A free Microsoft Teams client would be useful. I allege that free VNC clients are also useful, even if there's non-free software on the other end.
29. ryukafalz ◴[] No.30107589{5}[source]
This should be obvious from your comment but I think it's worth calling something out explicitly here: a bank that does that is mandating that you accept either Apple's or Google's terms of service. That's a lot of power to give to two huge companies.

I think we'd do well to provide the option to use open protocols when possible, to avoid further entrenching the Apple/Google duopoly.

replies(3): >>30110127 #>>30113234 #>>30115822 #
30. pdonis ◴[] No.30107608[source]
> Is that really a problem? In practice wouldn't it just mean you can only use employer-provided and certified devices?

That's fine for employees doing work for their employers. It's not fine for personal computing on personal devices that have to be able to communicate with a wide variety of other computers belonging to a wide variety of others, ranging from organizations like banks to other individuals.

31. wizzwizz4 ◴[] No.30107746{5}[source]
I'm uncomfortable letting organisations have control over the software that runs on my hardware. (Or, really, any hardware I'm compelled to use.)

Suppose the course I've been studying for the past three years now uses $VideoService, but $VideoService uses remote attestation and gates the videos behind a retinal scan, ten distinct fingerprints, the last year's GPS history and the entire contents of my hard drive?¹ If I could spoof the traffic to $VideoService, I could get the video anyway, but every request is signed by the secure enclave. (I can't get the video off somebody else, because it uses the webcam to identify when a camera-like object is pointed at the screen. They can't bypass that, because of the remote attestation.)

If I don't have ten fingers, and I'm required to scan ten fingerprints to continue, and I can't send fake data because my computer has betrayed me, what recourse is there?

¹: exaggeration; no real-world company has quite these requirements, to my knowledge

replies(1): >>30118137 #
32. enriquto ◴[] No.30107818[source]
The dystopia described in Stallman's "The right to read" is almost here... and we don't even get to colonize the solar system.
33. chaxor ◴[] No.30107844[source]
Wow, the monero project looks like they have some great ideas. I like this reproducible build idea - I may try to get my team to work towards that. It seems like monero has more of a focus on use as a real currency, so hopefully it isn't drawing in the speculative people and maintains its real use.
34. freedomben ◴[] No.30107937{3}[source]
It wasn't I, but this has been an absolute plague on an organization I work with. There are only 3 people, and we all need to access some accounts, but they are personal accounts. Also, the boss travels a lot, often to international destinations. Every time he flies I can almost guarantee we'll face some new nightmare. The worst is "we noticed something is a tiny bit different with you but we won't tell you what it is. We've emailed you a code to the email account that you are also locked out of because something is a tiny bit different with you. Also we're putting a flag on your account so it raises holy hell the next 36 times you log in."
replies(2): >>30109128 #>>30110642 #
35. no_time ◴[] No.30108070[source]
>Remote attestation really is killing practical software freedom.

Which will continue marching forward without pro-user legislation. Which is extraordinarily unlikely to happen, since the government has a vested interest in this development.

replies(1): >>30109391 #
36. dpatterbee ◴[] No.30108092{3}[source]
I feel like the issue with the post you mention was the absence of recourse rather than the locking out itself.
replies(2): >>30109070 #>>30129759 #
37. nybble41 ◴[] No.30108200{5}[source]
Secure enclaves are still dependent on someone other than the owner (usually the manufacturer) having ultimate control over the device. Otherwise the relying party has no reason to believe that the enclave is secure.
38. mindslight ◴[] No.30108219{4}[source]
Putting a banking app on your pocket surveillance device is one of the least secure things you can do. What happens if you're mugged, forced to login to your account, and then based on your balance it escalates to a kidnapping or class resentment beatdown? Furthermore, what happens if the muggers force you to transfer money and your bank refuses to roll back as unauthorized because their snake oil systems show that everything was "secure" ?
39. pishpash ◴[] No.30108251[source]
Who gets to decide what changes are kosher? Sounds like bureaucratic behavior modeling.
replies(1): >>30113265 #
40. pishpash ◴[] No.30108288{5}[source]
It does seem like a privilege escalation in the reverse direction, allowing banks to escalate a decision about security into one about devices. They should not have that power, and it's far from the only solution.
41. pishpash ◴[] No.30108324[source]
Where we "already" are is not a state to be anchored on.
replies(1): >>30110262 #
42. lmeyerov ◴[] No.30108409[source]
Yep

I'd feel 100% differently about this stuff if the NSA or some other cybersecurity gov arm making these rules used their massive cybersecurity budgets to provide free MFA, TLS, encrypted DNS, etc., whether US gov hosted or via non-profit (?) partners like LetsEncrypt.

OSS & free software otherwise has a huge vendor tax to actually get used. As is, this feels like economic insecurity & anti-competition via continued centralization to a small number of megavendors. Rules like this should come with money, and not to primes & incumbents, but utility providers.

Sure, our team is internally investing in building out a lot of this stuff, but we have security devs & experience, while the long tail of software folks use doesn't. The gov sets aside so much $$$$ for the perpetual cyber war going on, but not for simple universal basics here :(

43. tata71 ◴[] No.30108533{3}[source]
This is the difference between

"Log in by tapping Yes on my phone"

and

actually using a FIDO2 USB key.

44. notatoad ◴[] No.30108716[source]
If you can argue that remote attestation doesn't provide additional security, then I'd love to hear that argument. But it seems like a fairly clear-cut case that it does provide additional security, and I don't think it's reasonable to accept a lower level of security for the sake of allowing unverified builds of open-source software.

There are specific contexts where you want to distribute information as widely as possible, and in those contexts it makes sense to allow any software version to access the information. But for contexts where security is important, that means verifying the client software isn't compromised.

replies(2): >>30108867 #>>30110431 #
45. pabs3 ◴[] No.30108754[source]
Identical source code when recompiled should produce identical binaries:

https://reproducible-builds.org/

Agreed that people should have the freedom to modify their software though.

replies(1): >>30111538 #
46. curmudgeon81 ◴[] No.30108867[source]
A server rate limiting login attempts is additional security.

A remote system asked to promise it's what it says it is: the illusion of security.

Jailbreaking, DRM, etc are all evidence of this illusion.
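For contrast, the rate-limiting kind of control lives entirely server-side, where the client can't tamper with it. A minimal fixed-window sketch:

  import time
  from collections import defaultdict

  WINDOW, LIMIT = 300, 5  # at most 5 attempts per 5 minutes
  attempts = defaultdict(list)

  def allow_attempt(username):
      now = time.monotonic()
      attempts[username] = [t for t in attempts[username]
                            if now - t < WINDOW]
      if len(attempts[username]) >= LIMIT:
          return False  # throttle, and optionally alert
      attempts[username].append(now)
      return True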

47. withinboredom ◴[] No.30108879{4}[source]
In my experience, it isn’t that companies don’t want to put in the work, it’s that some middle manager made a decision. I’ve been told to “implement login with X” more than once in my career and when asked what about Y or Z, they say, “we only want X” with no further explanation.
48. tomrod ◴[] No.30109008{3}[source]
Why?
49. joe-collins ◴[] No.30109070{4}[source]
Absolutely. The escalation chain is its own attack vector, but it should exist.
50. sciurus ◴[] No.30109128{4}[source]
Three people sharing a personal account, with one of them frequently traveling internationally, is such an unusual usage pattern that I'd be really disappointed with a service provider if they _didn't_ flag it for extra verification.
replies(2): >>30109250 #>>30110474 #
51. 0xedd ◴[] No.30109211{4}[source]
Cars come with Android Auto (and whatever the equivalent is for iOS). Only apps signed by Google can communicate with Android Auto. I don't want to use a Google phone or app to display OSM on my car's media screen. Why is this legal?
replies(1): >>30109637 #
52. freedomben ◴[] No.30109250{5}[source]
The frustrating thing to me is that as a user they don't give us any tools to help ourselves. I would gladly make it a "team" account and login individually if we could. I would gladly do a shared TOTP, or whitelist login locations, or anything like that. Or at least give us the option to accept the risk and disable whatever anomaly detection they are applying. But no, that's not how the software world works anymore. Extreme paternalism mode is the only option as a user.
replies(2): >>30110477 #>>30110907 #
53. themacguffinman ◴[] No.30109271{5}[source]
> but attackers which pulled of a privilege escalation tend to have enough ways to make sure that non of this detection finds them

The point of these restrictions is to ensure that your device isn't unusually vulnerable to privilege escalation in the first place. If you let them, some users will root their phone, disable all protections, install a malware-filled Fortnite apk from a random website, then stick their credit card company with the bill for fraud when their user-mangled system fails to secure their secrets.

You want to mod the shit out of your Android phone? Go ahead. Just don't expect other companies to deal with your shit; they're not obligated to deal with whatever insecure garbage you turn your phone into.

replies(3): >>30112794 #>>30118449 #>>30118786 #
54. 0xedd ◴[] No.30109316[source]
Wrong. 80% of attacks are social engineering ones, in which an employee is convinced to make a bank transfer, open some document, or install some program. From there, it's often a matter of exploiting widespread software commonly found in large organizations.

Everything you said couldn't be further from the truth.

replies(1): >>30109423 #
55. 0xedd ◴[] No.30109391[source]
*in the US.

Luckily, it's a dwindling power and Europe fights and penalizes large organizations breaching market "morals".

56. themacguffinman ◴[] No.30109423{3}[source]
Hence the pre-approved software restrictions. In a locked down system, even the most gullible employee won't have the authorization to "install some program".

I'd also hope that businesses care about more than 80% of attacks, preferably they should care about 100% of attacks. Hence, pre-approved software restrictions.

replies(1): >>30110215 #
57. mschuster91 ◴[] No.30109468{3}[source]
> But I think the point of your parent comment's reply was that the inevitable adoption of this same techonology in the consumer-level environment is a bad thing.

And this has happened before, with Intel ME, which was and still is useful if you have a fleet of servers to manage but a hell of a security hole outside the corporate world.

And now that Windows 11 all but requires a working TPM to install (although there are ways to bypass it for now), I would not be surprised if Netflix and the rest of the content MAFIAA were to follow their Android approach and demand that the user have Secure Boot enabled, only Microsoft-certified kernel drivers loaded, and the decryption running in an OS-secured sandbox that even a Local Administrator-level account can't access.

58. staticassertion ◴[] No.30109550[source]
I wish I had responded earlier, because now this entire thread is full of nonsense and I can't really respond to everything.

But attestation can mean a lot of things and isn't inherently in conflict with free software. For example, at my company we validate that laptops follow our corporate policy, which includes a default-deny app installation policy. Free software would only, in theory, need a digital signature so that we could add that to our allowlist.

replies(1): >>30119995 #
59. chipotle_coyote ◴[] No.30109637{5}[source]
Once you're talking about interactive information displays in cars that can be accessed while the vehicle is in motion, traffic and highway safety regulations start cropping up. When you ask "Why is this legal," try rephrasing it to, "Why is it legal for companies to make it so difficult to play Doom on my BMW's touch screen," and you will probably arrive at the answer.
replies(1): >>30112597 #
60. FredPret ◴[] No.30109782{3}[source]
Your employer pays you, but you pay the TV provider. Hence, what is acceptable is very different.
61. missblit ◴[] No.30109940{3}[source]
TV streaming services already do as much of this nonsense as they can get away with. The more "secure" of an environment their DRM runs in, the higher resolution the image they'll let you see.

It's just subtle enough (e.g. lower definition but will still play), and most people use "secure" enough setups, that only techies, media gurus, or that one guy who's still using a VGA monitor connection end up noticing.

62. lotsofpulp ◴[] No.30110127{6}[source]
That is a job for the government. They should have made electronic payments and electronic accounts for everyone a utility many years ago.
replies(1): >>30112781 #
63. ensan ◴[] No.30110215{4}[source]
Wrong again.

The computers in any sizable business already have the pre-approved restrictions set at the OS level. Employees can't just install any software.

replies(1): >>30110843 #
64. reilly3000 ◴[] No.30110262{3}[source]
That's fair. It's just to say there is a lot of context for client verification in software. Competitive multiplayer gaming has become an arms race of exploits and invasive anti-cheat measures; there is no concept of bring-your-own-client when there is money on the line.

Valve has taken a less heavy-handed approach and let users have more freedom over their client and UI, but they also have a massive bot problem in titles like TF2.

I can’t connect to my work network from a random client, and it will throw flags and eventually block me if I connect with an out-of-date OS version.

I can’t present any piece of paper with my banking data and a signature on it and expect other parties to accept it. I have to present it on an authorized document.

I guess money may be the common denominator here.

65. nijave ◴[] No.30110431[source]
It can go pretty terribly sideways just like antivirus with poorly coded, proprietary, privileged agents running on end user devices collecting data.

I worked at a place that only allowed "verified" software before and it's an ongoing battle to keep that list updated. Things like digital signatures can be pretty reliable but if you're version pinning you can make it extremely difficult to quickly adopt patched versions when a vulnerability comes out.

66. nijave ◴[] No.30110457{3}[source]
For something like LineageOS, ironically, the solution is to root your device to adjust build properties so it looks signed.

My vanilla LineageOS install fails but I can root with Magisk, enable Zygisk to inject code into Android, edit build properties, add SafetyNet fix and now my device is good to go?

It's crazy to think the workaround is "enable arbitrary code injection" (Zygisk)

replies(2): >>30113995 #>>30156501 #
67. kortilla ◴[] No.30110474{5}[source]
This is the problem with this kind of thing. It just perfectly captures the privilege/shelter of the programmers who come up with these heuristics of “obviously unusual”.

You just described the usage pattern of a pilot with a family, a truck driver, a seaman, etc.

It’s only unusual if your definition of usual is “relatively rich, computer power user”.

replies(2): >>30110788 #>>30112951 #
68. nijave ◴[] No.30110476{3}[source]
Linux and BSD beg to differ
69. throwaway48375 ◴[] No.30110477{6}[source]
Why do you need to all access the same account though? Can't you grant access to whatever resource you need to multiple accounts?
replies(1): >>30117791 #
70. pagnol ◴[] No.30110642{4}[source]
Have you considered using the same Proxy or VPN? I work remotely and sometimes access services through a VPN based in the country my coworkers are at specifically to avoid this kind of annoyance.
replies(1): >>30117761 #
71. seized ◴[] No.30110788{6}[source]
Not really. What's the use case there, everyone sharing a Google account?

I travelled a lot for work, and never had issues with account access. Nor did my wife ever have issues related to accounts. We don't share Google accounts though. It sounds like that user has personal accounts being used by three people for business use... Which isn't "A seaman and his family".

replies(2): >>30111432 #>>30112126 #
72. xxpor ◴[] No.30110843{5}[source]
That's not true at any big dev shop
73. blackrobot ◴[] No.30110907{6}[source]
Why don't you share a TOTP between all of you? Just take a screenshot of the authenticator QR code, or save it to a shared 1password secret.

Google's login protection mechanisms seem to be satisfied by TOTP usage, and you won't be locked out anymore (or at least much less likely to be).
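TOTP is just an HMAC over a shared secret and the current time, so any device holding the same secret derives the same codes. E.g. with the third-party pyotp package:

  import pyotp

  secret = pyotp.random_base32()  # this is what the QR code encodes
  boss_phone = pyotp.TOTP(secret)
  coworker_phone = pyotp.TOTP(secret)

  # Same secret, same time step -> same 6-digit code
  # (barring a 30-second window boundary between the calls).
  assert boss_phone.now() == coworker_phone.now()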

replies(1): >>30122262 #
74. worthless-trash ◴[] No.30111229[source]
> How could a build be verified to be the same code without some kind of signature? You cant just validate a SHA, that could be faked from a client.

This depends on how far down the rabbit hole you want to go. If it were Secure Boot, where only signed processes can run, would that make you feel better? If it doesn't... what would?

75. iszomer ◴[] No.30111432{7}[source]
That may have been true about 10 years ago, but I imagine the heuristics have improved since; I remember being locked out of my Google account while abroad for a month, and all it took was to log back in within my "country of origin".

Or a more recent example: my father forgot to bring his Android phone when he went back abroad, which subsequently locked him out of his account/services; I had to wipe it for him to get his access back.

76. thinkmassive ◴[] No.30111538[source]
A remote cryptographically-signed attestation is not reproducible

https://developer.android.com/training/safetynet/attestation

replies(3): >>30112204 #>>30123320 #>>30123508 #
77. kortilla ◴[] No.30112126{7}[source]
> What's the use case there, everyone sharing a Google account?

Yes. Everyone having their own distinct accounts is a property of high computer literacy in the family.

Many of my older extended family members have a single email account shared by a husband and wife. Or in one case the way to email my aunt is to send an email to an account operated by a daughter in a different town. Aunt and daughter are both signed in so the daughter can help with attachments or “emails that go missing”, etc.

> Which isn't "A seaman and his family".

The seaman in this scenario has a smartphone with the email signed in. It's also signed in on the family computer at home. Both the wife and him send email from it. Maybe a kid does too, from a tablet. This isn't that difficult.

replies(1): >>30113039 #
78. ◴[] No.30112204{3}[source]
79. HPsquared ◴[] No.30112597{6}[source]
Also, "why is it illegal to sell cars that can play Doom while driving"
80. ◴[] No.30112604[source]
81. franga2000 ◴[] No.30112781{7}[source]
This 10000x! A bank account is sure as hell more of a utility than a landline!

You need a bank account to do basically anything and yet consumer banking is largely unregulated (in the consumer-relations sense; they are regulated on the economic side of course). Payments take upwards of 24h and only during work hours (?!?), there are no "easy switch" requirements, mobile apps use shit like SafetyNet and I've had banks legit tell me "just buy a phone from this list of manufacturers"... PSD2 is trash that only covers B2B interoperability and mandates a security method that has been known as broken since its invention (SMS 2FA).

82. Dylan16807 ◴[] No.30112794{6}[source]
That might at least be half a reasonable argument if they didn't all allow desktop logins that could be stuffed with malware.

> they're not obligated to deal with whatever insecure garbage you turn your phone into

Banks probably should be obligated to let you connect over standard protocols.

replies(1): >>30119819 #
83. forgotmypw17 ◴[] No.30112951{6}[source]
Most services assume your own device.

If you don't own your own device and rely on third-party devices to access the service, good luck to you...

84. darkwater ◴[] No.30113039{8}[source]
> Many of my older extended family members have a single email account shared by a husband and wife. Or in one case the way to email my aunt is to send an email to an account operated by a daughter in a different town. Aunt and daughter are both signed in so the daughter can help with attachments or “emails that go missing”, etc.

As usual with the "personas" scenarios, people create their own unrealistic scenario (just like when talking about UX or design). These personas you are describing will probably fall back to low-tech methods in most cases; they won't fail to take a plane because GMail locked them out due to unusual activity when they are trying to show the ticket QR in the airport. They will just print it (or have someone print it for them) beforehand.

> The seaman in this scenario has a smartphone with the email signed in. It’s also signed in on the family computer at home. Both the wife and him send email from it. Maybe a kid does to from a tablet. This isn’t that difficult.

You just forgot to add that they use their shared email to communicate between them by using the "Sent" folder. To be more realistic, the seaman, right after buying his Android phone, will create a new Google account without realizing it, because he probably doesn't know that he could use the email account he is already using at home. But, enough with made-up examples to prove our own points.

replies(1): >>30113389 #
85. michaelt ◴[] No.30113041[source]
> Is that really a problem? In practice wouldn't it just mean you can only use employer-provided and certified devices?

Depends what you think big corporations' centrally managed IT equipment is like.

Theoretically, it could mean you get precisely the right tools to do your job, with the ideal maintenance and configuration provided effortlessly.

But for some organisations, it means mandatory Internet Explorer and Flash for compatibility with some decrepit intranet, crapware like McAfee that slows the system to a crawl, baffling policies like not letting you use an adblocker, and regular slow-to-install but unavoidable updates that always happen just as you're giving that big presentation.

86. floatboth ◴[] No.30113234{6}[source]
What bank doesn't have a regular web app?
replies(2): >>30113527 #>>30114794 #
87. pas ◴[] No.30113265{3}[source]
The cheapest vendor that was selected, obviously.
88. d110af5ccf ◴[] No.30113389{9}[source]
This is amazing. He just spelled out for you in great detail the sort of problems that arise in practice in the real world every day and you dismissed them out of hand as being unrealistic. I think you are far more sheltered and far less experienced than you realize. This sort of attitude is exactly what leads to these sorts of things becoming problems in the first place!

> They will just print it (or have someone print it for them) beforehand.

Yes, they will do that precisely because they do not trust technology to work for them, because it frequently does not! I have family members like this. I log in to their accounts on my devices for various reasons. Even worse, I run Linux. We run into these problems frequently. Spend time helping technically illiterate people with things. While doing so, make a concerted effort to understand why they say or think some of the things that they do.

Edit to add, I find it amusing that you make fun of his seaman example. Almost that exact scenario (in terms of number of devices, shared devices, and locations) is currently the case for two of my relatives. Two! And yet you ridicule it.

89. duckmysick ◴[] No.30113527{7}[source]
In the future banks may start accepting connections only from a handful of approved browsers, similarly to how 4k streaming on Netflix is not available in all browsers.
90. ece ◴[] No.30113995{4}[source]
This, or we could have dual booting that's about as easy to do on mobile as it is on PCs.

Currently, you'd have to find an unlocked phone, hope there is a downloadable factory image, re-flash, re-lock, and re-install to run whatever needs attestation. Potentially, using something like Android's DSU feature, this could all be a click or two, and you could be back running Lineage with a restart.

replies(1): >>30156509 #
91. dathinab ◴[] No.30114794{7}[source]
It's about the EU-mandated 2FA for online shopping, e.g. with a credit card.

Due to the way it integrates into websites (or, more specifically, doesn't), classical approaches like SMS 2FA (insecure anyway) but also TOTP or FIDO2 do not work.

Instead, a notification is sent to a preconfigured app where you then confirm the payment.

Furthermore, as the app and the payment might be on the same device, the app uses the fingerprint reader (probably via some Google TPM/secrets API, idk).

Theoretically other approaches should work, but in practice they tend to not work reliably, or at all, in most situations.

Technically web-based solutions could be possible by combining a FIDO stick with browser-based push notifications; in practice the banks don't bother, or there are legal annoyances.

92. jgerrish ◴[] No.30115822{6}[source]

  I think we'd do well to provide the option to use open protocols when possible.
Of course, the PR copy just writes itself, doesn't it? AD administrators, Apple and Google, banks and everyone else can benefit from context aware authorization.

If your phone is stolen or its state "compromised", you want immediate Peace of Mind.

Even if it's just misplaced, having that kind of flexibility is just great.

93. idkyall ◴[] No.30116202{3}[source]
>But if your TV streaming provider tells you have to use a certain version of Windows to consume their product, that's not considered acceptable to a good deal of people.

This is already the case with Netflix -- 4k video content cannot be played on Linux.

94. freedomben ◴[] No.30117761{5}[source]
This is a great idea, although the boss is pretty technically challenged so getting him set up on it might be interesting. It's been extremely difficult just to teach him to use LastPass.

Much appreciate the suggestion!

replies(2): >>30119808 #>>30124804 #
95. freedomben ◴[] No.30117791{7}[source]
For some of them we can, for others no. Sadly it seems as though supporting this sort of thing is not a priority for most SaaS
96. jacobr1 ◴[] No.30118137{6}[source]
So there are two levels of tradeoffs:

1) The requirements themselves. These are different for consumer vs employee type scenarios. So in general, I'd prefer we err on the side of DRM-free for things like media, but there are legitimate concerns around things like data privacy when you are an employee of an organization handling sensitive data.

2) Presuming there are legitimate reasons to have strong validation of the user and untampered software, we have the choice of A) using only organization-supplied hardware in those cases or B) using your own with some kind of restriction. I'd much prefer to use my own as much as possible ... if I can be assured that it won't spy on me, or limit what I can do, for the non-organization-specific purposes I've explicitly opted in to enable.

> I'm uncomfortable letting organisations have control over the software that runs on my hardware.

I'm not, if we can sandbox. I'm fine with organizations running javascript in my browser for instance. Or running mobile apps that can access certain data with explicit permissions (like granting access to my photos so that I can share them in-app). I think we can do better with both more granular permissions, better UX, and cryptographic guarantees to both the user and the organization that both the computation and data is operating at the agreed level.

97. dathinab ◴[] No.30118449{6}[source]
> privilege escalation in the first place.

It fails to do so in many ways, including not blocking old, no-longer-maintained, known-to-be-vulnerable Android releases.

It also has little to do with modding and more to do with having a properly working free market which allows alternatives besides Google and Apple.

replies(1): >>30119847 #
98. dathinab ◴[] No.30118786{6}[source]
As a side note, the attack scenario you describe works without needing any rooting; it already exists and isn't detected by their security mechanism.

Also, this is about the second factor in 2FA, not online banking itself.

Which you can do on a completely messed-up computer.

I'm also not asking to be able to pay contactless with a de-Googled Android phone.

Similarly, I'm not asking to do away with 2FA; you can use stuff like a FIDO stick with your phone.

Most of these "security" features are often about banks pretending to have proper 2FA without a second device... (and then applying them to the other apps they produce, too).

replies(1): >>30119921 #
99. _RafaelKr ◴[] No.30119808{6}[source]
I recently set up a WireGuard VPN and it was surprisingly easy (compared to other VPN solutions), and it works very reliably for me.
100. themacguffinman ◴[] No.30119819{7}[source]
In practice, many credit unions/banks will only support recent versions of major desktop browsers (i.e. the big three: Chrome, Firefox, Safari) which are known to mandate a good level of security. These browsers will usually have their own OS requirements; for example, Safari is tied to macOS versions directly, while Chrome will drop support for older unmaintained operating systems like Windows XP.

Any system can have malware. That's not the point. To repeat my point again: client restrictions are about making sure user devices are not unusually vulnerable to malware. For example, any Windows device may be infected with malware, but if you're still running Windows XP you're vulnerable to a much larger variety of known malware and more severe exploits. Hence why businesses will want to support only modern versions of e.g. Chrome, which itself will require modern versions of operating systems.

replies(1): >>30120681 #
101. themacguffinman ◴[] No.30119847{7}[source]
You're right, many secure apps don't go far enough in blocking Android releases that are probably too old & vulnerable. Not all apps are perfect, but blocking rooted and ancient devices is a start.
replies(1): >>30121754 #
102. themacguffinman ◴[] No.30119921{7}[source]
> As a side note the attack scenario you describe works without needing any rooting or anything it already exists and isn't detected by their security mechanism.

Android will block non-Play-Store app installations by default, and root is required for lower level access/capabilities that can bypass the normal sandbox.

I'm honestly not sure what you're saying about 2FA in the rest of your comment, it's kind of vague and there are some possible typos/grammar issues that confuse me. What exactly are you referring to when you say "pretending to have proper 2FA"?

replies(1): >>30121694 #
103. nybble41 ◴[] No.30119995[source]
> For example, at my company we validate that laptops follow our corporate policy, which includes a default-deny app installation policy.

Presumably (hopefully) these are corporate-owned devices, with a policy like that. Remote attestation is fine if it's controlled by the device's owner, and you can certainly run free software on such a device, if that particular build of the software has been "blessed" by the corporation. However, the user doesn't get the freedoms which are supposed to come with free software; in particular, they can't build and run a modified version without first obtaining someone else's approval. At the very least it suggests a certain lack of respect for your employees to lock down the tools they are required to use for their job to this extent.

104. Dylan16807 ◴[] No.30120681{8}[source]
So require I have an up to date browser on my phone. Don't require that I haven't rooted it when every desktop is in an equivalent security state. That's not enough to be "unusually vulnerable".

I'm not asking to use a 10 year old version of android that no modern browsers support any more and is missing many security features.

replies(1): >>30124187 #
105. dathinab ◴[] No.30121694{8}[source]
> installations by default

No, you basically have to click OK once (or change a setting, depending on the phone); either way it doesn't require root, and it doesn't really change the attack scenario, as it's based on someone intentionally installing an app from an arbitrary untrusted source.

> root is required

Yeah, like privilege escalation attacks. As you will likely find in many compromised apps. And which on many Android phones work due to vendors not providing updates after some time. And many other reasons.

> What exactly are you referring to when you say "pretending to have proper 2FA"?

EU law says they need to provide 2FA for online banking.

Banks often don't do that for banking apps, as it's inconvenient. Instead they "split the banking app in two parts", maybe throw some fingerprint-based auth mechanism in, and claim they have proper 2FA (because it's two app processes running and requires the fingerprint). Though security researchers have repeatedly shown that it's not a good idea.

Additionally, they then require you to use only your fingerprint, not an additional password....

Either way, the point is that secure online banking doesn't require locked-down devices in general.

replies(1): >>30124291 #
106. dathinab ◴[] No.30121754{8}[source]
No, that's starting at the wrong end and doesn't in any relevant way provide an improvement.

Checking for a too-old & vulnerable release is where you start.

And then you can consider maybe also blocking other stuff.

There is nothing inherently less secure about a rooted device.

Sure, you can make it less secure if you install bad software, but you can also make it more secure.

Or you just need to lower the minimum screen brightness for accessibility reasons.

You're claiming it's OK to take away people's agency to decide over a major part of their lives (which phones sadly are today) because maybe they could act irresponsibly and do something stupid.

But if we say that is OK, then we first need to ban cars, because you could drive one into a wall, and knives, and bathtubs you could drown in.

And yes, that is sarcastic, but there is a big difference between something being "inherently insecure" (driving without a seatbelt) and something that by default is in no way less secure, as long as you don't actively go out of your way to make it less secure (e.g. by disabling security protections).

replies(1): >>30124441 #
107. freedomben ◴[] No.30122262{7}[source]
You're right that would totally work with Google. In our case the boss is quite computer illiterate and trying to get him to use LastPass was hard enough. He will tolerate a lot of pain from getting locked out before he'll be willing to learn TOTP :-(

And for many of the SaaS that we use, TOTP doesn't help you avoid the security lock outs.

108. rstuart4133 ◴[] No.30123243[source]
> I think 3. is very harmful for actual, real-world use of Free Software.

I hold the reverse view. The only security token I'd trust is one where the only thing that isn't open is the private keys the device generates when you press the reset button. The rest, meaning everything from the CPU up (say RISC-V) and the firmware, must be open to inspection by anybody. In fact, it should also be easy to peel away the silicon protection so you can see everything bar the cells storing the private keys. The other non-negotiable is that the thing that computes and transmits the "measures" of the system being attested to (including its own firmware) cannot be changed, meaning no stinking "security" patches are allowed at that level. If it's found broken, throw it away, as the attestation is useless.

The attestation then becomes: the device you hold is a faithful rendering/compilation of open source design document X by open source compiler Y. And I can prove that myself, by building X using Y and verifying the end result looks like the device I hold. This process is also known as a reproducible build.

What we have now (e.g., YubiKeys) is not that. Therefore I have to trust Yubi Corp. To see why that's a problem, see the title of this story. It has the words "Zero-Trust" in it.

In reality of course there is no such thing as "Zero-Trust". I will never be able to verify everything myself, ergo I have to trust something. The point is there is a world of difference between trusting an opaque black box like Yubi Corp, and trusting an open source reproducible build, where a cast of random thousands can crawl over it and say, "it seems OK to me". In reality it's not the ones that say "it seems OK" you are trusting. You are trusting the mass media (places like this in other words), to pick up and amplify the one voice among millions that says "I've found a bug - and because it's open I can prove it" so everyone hears it.

So to me it looks to be the reverse of what you say. Remote attestation won't kill software freedom. Remote attestation, done in a way that we can trust, must be built using open source. Anything less simply won’t work.

109. rstuart4133 ◴[] No.30123320{3}[source]
> A remote cryptographically-signed attestation is not reproducible

No one wants to reproduce an attestation. If you could, it could be copied, and if you can copy an attestation any hardware could send it to prove it was something else - something the other end trusts - rendering it useless for its intended purpose.

However, the attestation is attesting the hardware you are running on is indeed "reproduced", as in it is a reliable copy of something the other end trusts. It could be a device from Yubi Key and in effect you are trusting Yubi Corp's word on the matter. Or, it could be an open source design everybody can inspect, reproducibly rendered in hardware and firmware. Personally, I think trusting the former is madness, as is trusting the latter without a reproducible build.

110. pabs3 ◴[] No.30123508{3}[source]
I don't know much about attestation, but the repro builds folks have an approach for dealing with signatures; you build once, then copy the signature into the source, so that as long as the unsigned build result is bit-identical, the signatures still match and anyone can reproduce the signed build result.

https://reproducible-builds.org/docs/embedded-signatures/
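
A rough sketch of the comparison that approach enables, assuming for illustration a format where a fixed-length signature is simply appended to the image (real formats place the signature differently, and the file names are made up):

    import hashlib

    SIG_LEN = 64  # hypothetical fixed-length signature appended to the image

    def unsigned_hash(image_bytes, signed=False):
        # Drop the embedded signature so a signed official build can be
        # compared bit-for-bit against an independent, unsigned rebuild.
        payload = image_bytes[:-SIG_LEN] if signed else image_bytes
        return hashlib.sha256(payload).hexdigest()

    official = open("official-signed.bin", "rb").read()  # vendor's signed build
    mine = open("my-rebuild.bin", "rb").read()           # my unsigned rebuild

    if unsigned_hash(official, signed=True) == unsigned_hash(mine):
        print("signed build is reproducible")
    else:
        print("MISMATCH")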

111. themacguffinman ◴[] No.30124187{9}[source]
So what if the desktop is in a worse state? Mobile is still a common threat surface that supports stronger security measures. "Unusual" is relative; mobile is much more secure by default. It makes no sense to weaken the security posture for mobile users just because the desktop/web doesn't allow a stronger one.

I guess you also think Android/iOS should just get rid of app permissions because users could just use similar software on their desktops without any permissions gating?

Edit: Android/iOS are increasingly popular platforms; the security they pioneer far exceeds that of their desktop predecessors and has improved the average security posture of millions of mobile-focused users.

replies(1): >>30124364 #
112. themacguffinman ◴[] No.30124291{9}[source]
Only on Android is it so simple to sideload, and even then there are lower level app capabilities that require root even for sideloaded apps.

Good security is layered. Just because privilege escalation attacks are sometimes possible without root doesn't mean you throw open the floodgates and ignore the threat of root. The point of banning rooted devices is that privilege escalation attacks are much easier on rooted devices.

Of course online banking doesn't require locked-down devices, but online banking is more secure on locked-down devices. I don't see why banks should weaken their security posture on root just because they aren't perfect in other areas.

113. Dylan16807 ◴[] No.30124364{10}[source]
> It makes no sense to weaken the security posture for mobile users just because the desktop/web doesn't allow a stronger one.

The motivation is not "just" that, or for fun; the motivation is that users should be allowed to control their own devices - and have them keep working.

> I guess you also think Android/iOS should just get rid of app permissions because users could just use similar software on their desktops without any permissions gating?

I want it to work... exactly like app permissions. Where if I root it, I can override things.

> Android/iOS are increasingly popular platforms, the security they pioneer far exceeds their desktop predecessors and has improved the average security posture of millions of mobile-focused users

Having that kind of sysadmin lockdown is useful, but if I want to be my own sysadmin I shouldn't be blacklisted by banks.

114. themacguffinman ◴[] No.30124441{9}[source]
> There is nothing inherently less secure about an rooted device.

This is clearly wrong: rooted devices are much less secure because they enable low-level access to maliciously alter the system. Malware often requires root and will first attempt to attain it, which of course isn't necessary if a user has manually unlocked root themselves.

> Your claiming it's ok to take the agency from people away to decide over a major part of their live (which sadly phones are today) because maybe they could act irresponsible and do something stupid.

No one is taking away any user's agency. Users are free to root their phones if they wish (many Android phones at least will allow it), but companies are also free to deny these users service. Users are free to avail themselves of any company's service on a non-rooted phone. "Not using rooted phones to access anything you like" is hardly a major loss of agency.

Phone insecurity is very dangerous IMO, much more dangerous really than bathtubs or perhaps knives. You could argue that vehicles are similarly very dangerous, and I'd agree. I don't think we're very far off from locked-down self-driving cars. Unfortunately we're not there yet with self-driving tech, and the current utility of vehicles still outweighs their immense safety risks. You can't really say that about rooted phones. The legitimate benefits of a rooted phone are largely relevant to developers, not the average user, and most users never attempt to tinker with their phone.

replies(1): >>30125896 #
115. pagnol ◴[] No.30124804{6}[source]
There are also a number of browser extensions which may be easier to set up and use for non-technical folks, for example FoxyProxy seems to offer one. I've never tried any myself, though.
116. dathinab ◴[] No.30125896{10}[source]
You having root access doesn't mean any arbitrary application on your phone has root access. So no, it is not inherently less secure.

If you can't proceed with a normal life after you root your phone, you are NOT free to do so but instead get punished for doing so.

replies(1): >>30142684 #
117. hansvm ◴[] No.30129759{4}[source]
That definitely exacerbates the issue, but I don't think it's fair to claim that the absence of recourse is the _only_ problem. If you have limited cell service, limited connectivity, or limited time, then the account being locked can be a significant burden that completely blocks whatever opportunity you were trying to take advantage of. Note that the response time even for newsworthy account locking events is still on the order of hours to days.
118. themacguffinman ◴[] No.30142684{11}[source]
For the last time, yes, it is inherently less secure. You gain root access by disabling/weakening the OS's built-in protections against root access.

> If you can't proceed with a normal life after you root you phone you are NOT free to do so but instead get punished when doing so.

Freedom to root doesn't mean freedom from the consequences of rooting. Banking apps are hardly necessary for a normal life, and neither is rooting.

119. kelnos ◴[] No.30156501{4}[source]
Yeah, that's the crazy thing: that this entire "verification" house of cards can be so easily defeated by just faking the response to an API call from code that you can control (after unlocking your bootloader and installing your own code). I guess this is why there is a push to stop allowing bootloaders to be unlocked.
replies(1): >>30176675 #
120. kelnos ◴[] No.30156509{5}[source]
I mean... no thanks? I remember dual-booting Windows and Linux (and macOS and Linux) for years back in the 00s, and it was inconvenient and annoying. I don't want to go back to that, even (especially?) on a phone.
replies(1): >>30189186 #
121. kelnos ◴[] No.30156528{4}[source]
> I’m not even so sure I’m totally against banks doing that either.

The hole in this reasoning is that you don't need the app; you can just sign into the bank's website from the mobile browser, and get all the same functionality you'd get from the app. (Maybe you don't get a few things, like mobile check deposits, since they just don't build features like that into websites for the most part.) The experience will sometimes be worse than that of the app, but you can still do all the potentially-dangerous things without it. So why bother locking down the app when the web browser can do all the same things?

> I’d be a bit pissed if Netflix took my money but didn’t run where I wanted it

I actually canceled my HBO Max account when, during the HBO Now -> HBO Max transition, they somehow broke playback on Linux desktop browsers. When I wrote in to support, they claimed it was never supported, so they weren't obligated to care. I canceled on the spot.

122. nijave ◴[] No.30176675{5}[source]
Even locked bootloaders only help a little. Afaik all iOS devices have locked bootloaders, but that doesn't stop jailbreaking. I imagine Android, with its spotty vendor support track record, would be even easier.
replies(1): >>30189262 #
123. ece ◴[] No.30189186{6}[source]
Dual booting isn't so bad; I've almost always had a gaming partition somewhere, while my current install doesn't even run 32-bit binaries. That said, attestation should be possible with user-locked bootloaders, not just vendor-locked bootloaders. I suppose Magisk provides something close to this currently with bootloaders that can't be re-locked for custom ROMs, so more power to it.
124. ◴[] No.30189262{6}[source]