
Lindroid

(twitter.com)
262 points LorenDB | 6 comments
fock ◴[] No.40714592[source]
"needs root and patches to AOSP". So there go the banking apps mentioned elsewhere and you can just use postmarketOS.

Still cool though!

replies(4): >>40714796 #>>40715232 #>>40715542 #>>40718579 #
1. calgoo ◴[] No.40715542[source]
As it’s Linux, could we run Android in a VM and simulate a safe device? That’s my hope for the future of mobile devices: safe VMs that we can run on top of the spyware-infested phones (government-enforced stuff included).
replies(1): >>40715901 #
2. franga2000 ◴[] No.40715901[source]
Unfortunately, most of those misguided "device integrity" checkers detect VMs, and the best of them (luckily still not used very often) are essentially unbeatable (unless there's a critical bug) due to hardware-backed attestation.
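
To make "hardware-backed" concrete, here is a rough Kotlin sketch (the alias and challenge plumbing are hypothetical, assuming API 24+) of how a checker can ask the Android keystore for an attested key. A plain VM has no secure hardware that can mint a certificate chain verifying back to a hardware root, which is what makes this class of check so hard to beat:

    import android.security.keystore.KeyGenParameterSpec
    import android.security.keystore.KeyProperties
    import java.security.KeyPairGenerator
    import java.security.KeyStore
    import java.security.cert.Certificate

    // Sketch only: the server sends a nonce, the app asks the secure hardware to
    // mint a key with that nonce baked into an attestation certificate chain,
    // and the server checks the chain against the hardware attestation root.
    fun attestedCertChain(serverChallenge: ByteArray): Array<Certificate> {
        val alias = "integrity_check_key"  // hypothetical alias
        val kpg = KeyPairGenerator.getInstance(
            KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore")
        kpg.initialize(
            KeyGenParameterSpec.Builder(alias, KeyProperties.PURPOSE_SIGN)
                .setDigests(KeyProperties.DIGEST_SHA256)
                .setAttestationChallenge(serverChallenge)  // binds the chain to this request
                .build())
        kpg.generateKeyPair()
        val ks = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
        return ks.getCertificateChain(alias) ?: emptyArray()
    }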
replies(2): >>40718084 #>>40719471 #
3. gigel82 ◴[] No.40718084[source]
*worst
4. phh ◴[] No.40719471[source]
> essentially unbeatable (unless there's a critical bug) due to hardware-backed attestation.

FWIW Google started enforcing those attestations like a month or two ago, and there are many critical bugs. I haven't kept score, but some other people did: https://x.com/wanghan1995315/status/1803063996204912873

And please note that they only list big-brand leaks. Since you can use any OEM's attestation key, /any/ OEM leak can break those so-called "security protections". Even with every security flaw fixed, there is still social engineering: I guesstimate that you could ask an ODM's engineer for an attestation key for around $1k and share it with maybe 20 people. (200 would probably still stay under the radar, but you'd need to be able to keep a secret among 200 people.)
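
To illustrate why a leak from any single OEM is enough, here is a hypothetical server-side sketch (the names and structure are mine, not any real verification library): a typical verifier only checks that the chain is internally consistent and ends at some accepted root, so a chain built from a leaked key under any listed OEM passes, regardless of which brand of phone (or VM) actually sent it.

    import java.security.cert.X509Certificate

    // Hypothetical verifier: accept any chain that is self-consistent and
    // terminates at *some* trusted OEM/Google root.
    fun chainLooksGenuine(
        chain: List<X509Certificate>,        // leaf (attested key) first, root last
        trustedRoots: Set<X509Certificate>   // every root the verifier accepts
    ): Boolean {
        // each certificate must be signed by the next one up the chain
        for (i in 0 until chain.size - 1) {
            try {
                chain[i].verify(chain[i + 1].publicKey)
            } catch (e: Exception) {
                return false
            }
        }
        // ...and the chain must end at any one of the accepted roots;
        // a leaked key signed under any of them produces a passing chain
        val root = chain.last()
        return trustedRoots.any { it.publicKey.encoded.contentEquals(root.publicKey.encoded) }
    }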

Though the conclusion shouldn't be that attestation keys are insecure and we need a secure variant (because a secure variant is indeed coming). The conclusion must be that users own the device they bought. Not Google, not Apple.

replies(1): >>40720284 #
5. mschuster91 ◴[] No.40720284{3}[source]
> And please note that they only list big-brand leaks. Since you can use any OEM's attestation key, /any/ OEM leak can break those so-called "security protections".

Inevitably though, the price of these keys will rise, the most capable eyes on the planet will take a few very thorough looks at all the TPM-chip firmware they can get their hands on, and eventually the platforms will be so secure and the price so high that the only ones left holding them will be three-letter agencies (if even them).

Anti-tamper measures have their place - I'd really love to have a device that cannot have a persistent backdoor implanted - but the very second the state of the anti-tamper measure becomes visible to user-level applications, it becomes an arms race between Big Money (=DRM rightsholders and big game studios) and my freedom.

replies(1): >>40720489 #
6. mindslight ◴[] No.40720489{4}[source]
> I'd really love to have a device that cannot have a persistent backdoor implanted - but the very second the state of the anti-tamper measure becomes visible to user-level applications, it becomes an arms race between Big Money (=DRM rightsholders and big game studios) and my freedom.

The two can be reconciled by not having any privileged keys baked in by the manufacturer. It's only the manufacturers keeping records of the baked-in attestation/signing key(s) that allows remote attestation to be scaled up into treacherous computing. Otherwise, if device owners could generate/load new attestation/signing keys and have them be indistinguishable from any original ones, then that same process could be emulated. This would likely require legislation to rein in manufacturers' desires to retain backdoors, but the point is that it is possible from a technical perspective.
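
A toy sketch of that point (plain JCA, my own example, not tied to any real attestation format): the signature math itself carries no trace of whether the key came from the factory or from the owner; only the manufacturer's record of which public keys it provisioned lets a remote verifier tell them apart.

    import java.security.KeyPairGenerator
    import java.security.Signature

    fun main() {
        val challenge = "server nonce".toByteArray()

        // An "owner" key pair, generated locally instead of provisioned at the factory.
        val ownerKey = KeyPairGenerator.getInstance("EC")
            .apply { initialize(256) }
            .generateKeyPair()

        // Sign the challenge exactly the way a factory-provisioned key would.
        val sig = Signature.getInstance("SHA256withECDSA").run {
            initSign(ownerKey.private)
            update(challenge)
            sign()
        }

        // A verifier holding only the public key accepts it; rejection can come
        // only from checking the key against the vendor's provisioning records.
        val ok = Signature.getInstance("SHA256withECDSA").run {
            initVerify(ownerKey.public)
            update(challenge)
            verify(sig)
        }
        println("signature verifies: $ok")
    }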