
160 points sjuut | 10 comments
1. mjg59 ◴[] No.44611125[source]
Having spent a while working in embedded and learned that this is not a lesson that's been internalised: this is why you never sign any executable that can boot on shipped hardware unless you'd be ok with everyone running it on shipped hardware. You cannot promise it will not leak. You cannot promise all copies will be destroyed. If it needs to run on production hardware then you should have some per-device mechanism for one-off signatures; if it doesn't, it should either be unsigned (if fusing secure boot happens late) or have its signature invalidated as the very last thing that happens before the device goes in the box.

A lot of companies do not appear to understand this. A lot of devices with silicon-level secure boot can be circumvented with signed images that have just never (officially) been distributed to the public, and anyone relying on their security is actually relying on vendors never accidentally trashing a drive containing one. In this case Nintendo (or a contractor) utterly failed to destroy media in the way they were presumably supposed to, but it would have been better had the image never existed in this form in the first place.
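
Roughly the shape I mean by a per-device mechanism (entirely illustrative; the layout and helper names are made up, not anything Nintendo-specific): the signature covers the unique ID of the one unit the image is allowed to boot on, so a leaked one-off image is useless everywhere else.

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Hypothetical layout, not any vendor's real format: the signature
       covers the device ID, so a leaked image only authorises one unit. */
    struct oneoff_hdr {
        uint8_t  device_id[16];   /* the single unit this image may boot on */
        uint8_t  code_hash[32];   /* hash of the payload that follows */
        uint32_t code_size;
        uint8_t  signature[64];   /* over everything above */
    };

    /* Assumed to be provided by the boot ROM / crypto driver. */
    extern void read_fused_device_id(uint8_t out[16]);
    extern void sha256(const uint8_t *data, size_t len, uint8_t out[32]);
    extern bool verify_signature(const uint8_t *data, size_t len,
                                 const uint8_t sig[64]);

    static bool authorize_one_off_image(const struct oneoff_hdr *hdr,
                                        const uint8_t *code)
    {
        uint8_t my_id[16], h[32];

        /* Refuse images minted for any other unit, even with a valid signature. */
        read_fused_device_id(my_id);
        if (memcmp(hdr->device_id, my_id, sizeof my_id) != 0)
            return false;

        /* The signed header pins the payload by hash, so the code can't be swapped. */
        sha256(code, hdr->code_size, h);
        if (memcmp(hdr->code_hash, h, sizeof h) != 0)
            return false;

        return verify_signature((const uint8_t *)hdr,
                                offsetof(struct oneoff_hdr, signature),
                                hdr->signature);
    }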

replies(4): >>44611219 #>>44611369 #>>44614565 #>>44616825 #
2. bri3d ◴[] No.44611219[source]
I think they _might_ have thought a little further than this. As far as I can tell, this tool was _supposed_ to boot only images with the same security checks as the actual fused state of the console; the issue was that the section header parsing code was vulnerable to a trivial attack that allowed arbitrary execution, which could then of course bypass the lifecycle state checks.
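
For anyone unfamiliar with the bug class, the generic shape is something like this (purely illustrative C, not the actual parsing code): a size field taken straight from the attacker-supplied header is used to copy into a fixed buffer without a bounds check.

    #include <stdint.h>
    #include <string.h>

    /* Illustrative only: the generic shape of the bug class, not the real code. */
    struct section_hdr {
        uint32_t load_size;     /* comes straight from the (attacker-supplied) image */
        uint32_t entry_offset;
    };

    static uint8_t load_buf[0x4000];

    void parse_section(const uint8_t *img)
    {
        struct section_hdr hdr;
        memcpy(&hdr, img, sizeof hdr);

        /* BUG: load_size is never checked against sizeof load_buf, so an
           oversized value overwrites whatever lives past the buffer (return
           addresses, function pointers...) and yields code execution before
           any lifecycle state check even gets a say. */
        memcpy(load_buf, img + sizeof hdr, hdr.load_size);

        /* FIX: reject anything that doesn't fit:
           if (hdr.load_size > sizeof load_buf) return; */
    }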

I'd extend your thesis to "you need to audit your recovery tools with the _exact same_ level of scrutiny with which you audit your production secondary bootloader, because they're effectively the same thing," which is the same concept but not _quite_ as boneheaded as you suggest.

Lately I've been seeing this class of exploit more often, too: stuff like "there's a development bootloader signed with production keys" has gone away a little, replaced with "there's a recovery bootloader with signature checking that's broken in some obvious way." Baby steps, I guess...

3. josephcsible ◴[] No.44611369[source]
I don't like this advice because it seems like it's only useful to people who want to do tivoization in the first place. I hope people who try to do that keep failing at it, because "success" is bad for the rest of us.
replies(3): >>44611740 #>>44612230 #>>44612944 #
4. dlenski ◴[] No.44611740[source]
Agreed. I'm rooting for the continued failure of everyone who locks down hardware (and software) to prevent its users from modifying or fully controlling it.
5. mjg59 ◴[] No.44612230[source]
At a social level we should know how to do this well because there are cases where it needs to be done well. Some hardware is operating in incredibly safety-critical scenarios where you do want strong confidence that it's running the correct software[1].

Should this be shipped to consumers as a default? Fuck no. This technology needs to exist for safety, but that doesn't mean it should be used to prop up business models. Unfortunately there's no good technical mechanism to prevent technology being used in user-hostile ways, and we're left with social pressure. We should be organising around that social pressure rather than refusing to talk about the tech.

[1] and let's not even focus on the "Someone hacked it" situation - what if it accidentally shipped with an uncertified debug build? This seems implausible, but when Apple investigated the firmware they'd shipped on laptops they found that some machines had been pulled off the production line, had a debug build installed to validate something, and had then been put back on the production line without a legitimate build being installed - and if Apple can get this wrong, everyone can get this wrong

replies(2): >>44612681 #>>44613378 #
6. Cerium ◴[] No.44612681{3}[source]
Great point. In general I find that the story told about security is always hackers, but in practice you far more commonly hack yourself via manufacturing process variation.
7. RainyDayTmrw ◴[] No.44612944[source]
I think Apple is proof, time and again, that Tivoization is highly effective, and that if we want to fight it, the fight needs to be legal, not technical, as much as that may dismay the technically inclined.
8. c0l0 ◴[] No.44613378{3}[source]
Alas, it will virtually exclusively "be shipped to consumers as a default".
9. Nursie ◴[] No.44614565[source]
> never sign any executable that can boot on shipped hardware unless you'd be ok with everyone running it on shipped hardware.

How about if, when the lead engineers are on holiday, you ship the first batch of production units with a root key that's on everyone's laptop, has been pushed to bitbucket, and has been used to sign all sorts of things for dev units? Then, when confronted with that, you say "oh right, well… can we delete it from those places and import the key to the HSM? We'll use it as the prod key going forwards?"

I was sad when that payment terminal never made it to market, but in the end perhaps it was for the best.

10. BobbyTables2 ◴[] No.44616825[source]
Indeed, having done the whole development/test/manufacturing workflow using hardware-based secure boot, I realized that likely very few people ever do it properly.

We had developer keys and production keys. Burning one-time fuses with the production key meant developer code would be rejected.
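
For context, the usual mechanism (sketched generically here, not our actual implementation, and SoCs differ in the details) is that the boot code only accepts a public key whose hash matches the one-time fuses, so burning the production key hash is exactly what makes developer-signed images stop verifying:

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Generic sketch of fuse-enforced key selection; real SoCs differ. */
    extern void read_key_hash_fuses(uint8_t out[32]);   /* one-time programmable */
    extern void sha256(const uint8_t *data, size_t len, uint8_t out[32]);
    extern bool rsa_verify(const uint8_t pubkey[256], const uint8_t *img,
                           size_t len, const uint8_t sig[256]);

    bool image_is_trusted(const uint8_t pubkey[256], const uint8_t *img,
                          size_t len, const uint8_t sig[256])
    {
        uint8_t fused[32], actual[32];

        /* The image carries its own public key, but it is only accepted if its
           hash matches the fuses. Burning the production key hash is the step
           that makes developer-signed images stop verifying on shipped units. */
        read_key_hash_fuses(fused);
        sha256(pubkey, 256, actual);
        if (memcmp(fused, actual, sizeof fused) != 0)
            return false;

        return rsa_verify(pubkey, img, len, sig);
    }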

It took a great deal of discipline and a lot of work in the build process (separate developer/production builds of components, with corresponding signing).

Very few people had access to the production signing mechanism, and I avoided signing root-enabled builds, even though that would have been extremely convenient. Other teams… freely published production-signed, internal-use developer firmware to the whole company.

Sadly, nobody gets an award for doing it right, and people rarely face consequences for doing it wrong.