
432 points by nobody9999 | 12 comments
codedokode ◴[] No.46246465[source]
In my opinion, no manufacturer of a programmable device should be allowed to prevent the buyer from reprogramming it.
replies(8): >>46247960 #>>46248388 #>>46250102 #>>46250233 #>>46251819 #>>46252140 #>>46252929 #>>46280460 #
rstuart4133 ◴[] No.46251819[source]
I would not buy a FIDO2 token if it allowed anybody to reprogram it, including me. If you managed to make selling me such a device illegal, then may a pox descend on your house.
replies(3): >>46252137 #>>46252249 #>>46256587 #
octoberfranklin ◴[] No.46252249[source]
You're free to choose not to reprogram it, so the pox is actually upon your house.

Also, you should probably spend more time reading about cryptography and less time reading FIDO Alliance propaganda.

replies(1): >>46252426 #
1. rstuart4133 ◴[] No.46252426[source]
I'm guessing you don't understand the reason I don't want it to be reprogrammable. Yes, there are some advantages to me being able to reprogram it. But it comes with two big downsides.

The first is that if I can reprogram it, then so can anyone else. I don't know what the situation is where you live, but my government has passed laws allowing it to compel any manufacturer of a reprogrammable device to reprogram it with their spyware.

The second is places I interact with, like banks, insist on having guarantees on the devices I use to authenticate myself. Devices like a credit card. "I promise to never reprogram this card so it debits someone else's account" simply won't fly with them.

The easy way out of that is to ensure the entity who can reprogram it has a lot of skin in the game and deep pockets. This is why they trust a locked Pixel running Google-signed Android to store your cards. But take the same phone running a near-identical OS, but on unlocked hardware so you can reprogram it, and they won't let you store cards.

But that's the easy way out. It still lets a government force Google to install spyware, so it's not the most secure way. One way to make it secure is to insist no one can reprogram it. That's what a credit card does.

In any case, if someone successfully got the law changed in the way the OP suggested, so people could not use their devices as a digital passport, it wouldn't only be me wishing a pox on their house.

replies(4): >>46255232 #>>46256599 #>>46257484 #>>46271997 #
2. greensh ◴[] No.46255232[source]
1. if your government decides Google has to put spyware on your phone, you won't be able to remove it unless your device is reprogrammable.

It's actually the other way around: the only way to guarantee that your device is free of spyware is to reprogram it yourself. You shouldn't have to trust the potentially compromised manufacturer.

replies(1): >>46258892 #
3. codedokode ◴[] No.46256599[source]
> but government has passed laws allowing them to compel all manufacturers of reprogrammable devices to all them to reprogram is with their spyware.

In this case the government may mandate that spyware be pre-installed at the factory, which is already the case for phones and laptops in some countries.

> I promise to never reprogram this card so it debits someone else's account

When reprogramming, the card should wipe its private keys so it becomes just a "blank" without any useful information.
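The wipe-before-reflash behaviour described above can be sketched as follows. This is a toy model with hypothetical names; a real card would enforce the ordering in mask ROM or a secure element, not in application code:

```python
class SecureToken:
    """Toy model of a token whose reflash path always destroys keys first."""

    def __init__(self) -> None:
        self.private_key = b"\x13" * 32   # stand-in for a stored secret
        self.firmware = "vendor-v1"

    def reprogram(self, new_firmware: str) -> None:
        # Wipe secrets BEFORE accepting new code, so whatever is
        # flashed next starts from a blank device and can never
        # read the previous owner's keys.
        self.private_key = None
        self.firmware = new_firmware


token = SecureToken()
token.reprogram("custom-build")
assert token.private_key is None          # secrets are gone
assert token.firmware == "custom-build"   # but the device is reusable
```

The point of the ordering is that the new firmware, whoever wrote it, never coexists with the old secrets.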

replies(1): >>46258308 #
4. thesnide ◴[] No.46257484[source]
for such security devices, there is OTP.

I prefer to have my auth device bricked than compromised.

for anything else, i want to be able to reprogram.

so for vendors, a simple choice:

* be OTP, but no "patching"

* be R/W, but also by its owner

replies(1): >>46258978 #
5. rstuart4133 ◴[] No.46258308[source]
That doesn't work, for two reasons. Firstly, the law in my country specifically forbids introducing what they call a "systemic weakness". Among other things, that bans them from demanding every device is bugged. Instead they must get a judge to authorise targeting an individual, then get the manufacturer to replace the firmware in that device.

Secondly, they have no control over companies not based where I live. So I could just import one myself, unless you successfully get every country to pass a law that denies me the right to do this the way I want to.

6. rstuart4133 ◴[] No.46258892[source]
True, but it's turtles all the way down. There is lots of non-reprogrammable firmware in what you call "hardware". A recent article here pointed out the 8087 (an old floating point co-processor) had so much firmware (for the time) that Intel had to use a special type of transistor to make it fit. Modern CPUs have many such tiny CPUs doing little jobs here and there. I'm betting you didn't even know they exist. They not only exist, they also have firmware programmed into ROMs you can never change. The bottom line is you have to trust the manufacturer of the silicon, and that isn't much different to trusting someone else who loaded firmware into the device.

The fact that there is always something you must trust in a device, as opposed to being able to prove it's trustworthy yourself just by looking at it, is so well known it has a name: it's called the root of trust.

The interesting thing is you can ensure the root of trust is the only thing you need to trust. The ability to do that makes your statement factually wrong. In fact it's drop dead simple: the root of trust only needs to let you read back all the firmware loaded onto the device, so you can verify it is what you would have loaded yourself. TPMs and secure boot are built around doing just that. Secure boot is how the banks and whoever else know you are running a copy of Android produced by Google.
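The read-back check described above reduces to a hash comparison. A minimal sketch in Python, assuming you can obtain both the image read back from the device and a known-good reference image (e.g. one you built yourself from source); the byte strings below are stand-ins:

```python
import hashlib


def firmware_matches(device_firmware: bytes, reference_image: bytes) -> bool:
    """Compare firmware read back from a device against a known-good image.

    Hashing both sides gives a short value you can log, publish, or
    compare against a vendor- or distro-published digest.
    """
    device_digest = hashlib.sha256(device_firmware).hexdigest()
    reference_digest = hashlib.sha256(reference_image).hexdigest()
    return device_digest == reference_digest


# Stand-in data; a real check would read the image back over the
# device's debug or attestation interface.
image = b"\x7fELF...firmware..."
assert firmware_matches(image, image)             # untampered image verifies
assert not firmware_matches(image, image + b"\x00")  # any change is detected
```

Secure boot inverts the direction (the device checks a signature on the firmware before running it), but the trust argument is the same: one small fixed component lets you verify everything above it.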

replies(1): >>46268486 #
7. rstuart4133 ◴[] No.46258978[source]
Fair enough. Sort of. You can get the same assurances OTP gives you using secure boot + open source + reproducible builds.

Regardless, the rest of us who don't want to go through the extra work OTP creates still want to put our credit cards, FIDO2 keys, government licences, concert tickets and whatever else in one general purpose computing device, so we don't have to carry lots of little auth devices. To pull that off securely, this device must have firmware I can not change.

The OP wants to make it illegal to sell a device with firmware I can not change.

In asking for that, they've demonstrated they don't have a clue how secure and open computing works. If they somehow got it implemented it would be a security disaster for them and everybody else.

8. pabs3 ◴[] No.46268486{3}[source]
A compromise: if the manufacturer has a way to reprogram them, then the users should be able to as well.
replies(1): >>46271109 #
9. rstuart4133 ◴[] No.46271109{4}[source]
Hey pabs, think about it. You know this doesn't work.

It doesn't work for the same reason the electricity company doesn't let you reprogram your electricity meter. Unlike the raucous response here, as far as I can tell no one complains about that arrangement, despite the fact the meter is on your property, on land you own, and you effectively pay for it. They put up with it because they want the electricity, they know the electricity company can't trust all its customers with metering it, and when it's all said and done, putting a small box on their property that the electricity company has absolute control over is hardly a big deal.

It's exactly the same deal with your computer, or should be. There is a little area on a device you own that you have no control over. Ideally it is visible and running open source software with reproducible builds, so you can verify it does what it says on the box, and yes, neither you nor anyone else can change it, so it meets your condition.

But its purpose doesn't. Its purpose is to load the equivalent of electricity meters: software other people can change and you can't. Thus this area on your device carves out other areas it can give ironclad guarantees about to a third party: areas the third party solely controls, that you can not reprogram, and whose secrets (like encryption keys) you can't even see. These areas don't meet your definition. The third party can reprogram them, but you can't; you can't even see into them.

These areas can do things like behave like a credit card, be a phone's eSIM, or house a FIDO2 key that some third party attests is only ever stored securely.

Currently we depend on the likes of Google and Apple to provide us with this. I'm not sure Apple can be said to provide it, as they insist on vetting everything you can run that doesn't live in a browser. Google does better because you can side load, if you are willing to jump through hoops most people won't. Wouldn't it be great if Debian could do it too? But to pull that off, Debian developers would have to believe that allowing users to hand over control of a space on their computer, one they can't see or alter, to a third party Debian didn't trust, somehow works with open source. It's not a big jump from the current firmware policy.

replies(2): >>46272017 #>>46285859 #
10. account42 ◴[] No.46271997[source]
> The second is places I interact with, like banks, insist on having guarantees on the devices I use to authenticate myself. Devices like a credit card. "I promise to never reprogram this card so it debits someone else's account" simply won't fly with them.

If that's the only option they have, it will fly. Just like you used to be able to use banking apps on any Android before they had the option to restrict that to only Google-controlled ones.

11. account42 ◴[] No.46272017{5}[source]
> It doesn't work for the same reason the electricity company doesn't let you reprogram your electricity meter

It's not your electricity meter, it belongs to the electricity company. There is no pretense that you own it.

> It's exactly the same deal with your computer, or should be. There is a little area on a device you own that you have no control over.

No thanks. Society has functioned thousands of years without something like that.

12. pabs3 ◴[] No.46285859{5}[source]
Strongly disagree with all of that.