vessenes No.36863209
This feels like such a juicy and divisive area to me. There are an immense number of use cases where we'd like to know we're talking to a 'trusted' hardware and software stack on the web. For many years now, we have just assumed there is little to no trust in the stack, and architected and built accordingly. It adds an amazing amount of complexity and cost, limits features, and makes everything way, way harder than if you could assume a trusted stack.

At the same time, as is being pointed out quite vocally right now, 'trusted' is a very, very difficult concept when large tech monopolies are involved.

On the one hand, it's difficult because there are only a few companies in the world that can field large tech teams to deal with persistent threat actors, and therefore it would be very nice to be able to trust the security promises made. And, if those promises are trustworthy, they are better promises than any individual can make for their own software and platforms.

On the other hand, if you're a hacker (in the platonic sense), 'trusted' immediately codes to 'monopoly-backed', along with 'probably back-doored by a local government agency' and we head one more step down the primrose path of control, lack of innovation and finally perhaps a fascistic technology future controlled by a few players.

Ultimately, I think the solution here can only be successful if it involves a trustable, open hardware certification technology that's not registry-based, i.e. one that can create strong local proofs that are independently verifiable. There are a few tech companies I know of working on this on the silicon side, but it's a very difficult problem, and I'm not clear whether there's really enough demand to make them viable right now.
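
To make that concrete, here's a minimal sketch (Python, using the cryptography package) of what a registry-free check could look like: the verifier holds only a manufacturer root public key that ships with it, and validates a device's endorsement plus a fresh challenge signature entirely locally, with no registry lookup. Every name here is hypothetical, and a real scheme would also need revocation, privacy (e.g. group signatures), and actual hardware key protection.

    import os
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey, Ed25519PublicKey)

    # Manufacturer root key: only its public half needs to ship with verifiers.
    root_key = Ed25519PrivateKey.generate()
    root_pub = root_key.public_key()

    # Device key, endorsed once by the manufacturer at production time.
    device_key = Ed25519PrivateKey.generate()
    device_pub_bytes = device_key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)
    endorsement = root_key.sign(device_pub_bytes)  # acts as the "certificate"

    def attest(challenge: bytes) -> dict:
        # Device signs a verifier-chosen challenge with its endorsed key.
        return {
            "device_pub": device_pub_bytes,
            "endorsement": endorsement,
            "signature": device_key.sign(challenge),
        }

    def verify(proof: dict, challenge: bytes) -> bool:
        # Verifier checks the whole chain offline against the root public key.
        try:
            root_pub.verify(proof["endorsement"], proof["device_pub"])
            device_pub = Ed25519PublicKey.from_public_bytes(proof["device_pub"])
            device_pub.verify(proof["signature"], challenge)
            return True
        except InvalidSignature:
            return False

    challenge = os.urandom(32)
    print(verify(attest(challenge), challenge))  # True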

I guess I personally come down to leaving this turned on in Safari for now, and seeing what happens over the next year or two.

saurik No.36863342
For me it is about for whom the supposed "trust" or "security" is offered: DRM-tech is discussed using these terms, but the goal is to afford trust to the developer or content owner, not the user.
charcircuit [dead post] No.36866372
[flagged]
howinteresting No.36866803
I've been following your posts over the past few days, and your philosophy has never been clearer than it is here. You just straight up hate the idea of users having the upper hand over corporations.
charcircuit No.36869174
>You just straight up hate the idea of users having the upper hand over corporations.

Corporations are not the only ones who want their content protected or their services secured against cheaters or spammers; they are just the ones most capable of investing in improving security. For example, look at VRChat avatar artists vs avatar stealers. My philosophy is that if an artist wants their avatar secured from being stolen, then it is right for the avatar to be protected from users being able to copy it. It is less that I hate users having the upper hand against these artists than that I would like the artists' desires to be respected. From the trusted computing standpoint, I want indie developers to have a chance at creating things without having to invest as much time and money into the experience. If you can protect your scoreboard's integrity with trusted computing, then the time you would otherwise spend removing hacked scores, detecting people with cheats, or licensing an anticheat goes away. It shifts the responsibility from the developer to the platform.
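
As a purely hypothetical sketch of that shift: the game server runs no anticheat of its own and only checks a verdict token that the platform's attestation service is assumed to issue. The token format, function names, and shared key below are made up for illustration.

    import base64, hashlib, hmac, json

    PLATFORM_SECRET = b"demo-key-shared-with-platform"  # stand-in for real verification keys

    def platform_issue_verdict(session_id: str, trusted: bool) -> str:
        # Pretend platform service: signs its verdict about the client environment.
        payload = json.dumps({"session": session_id, "trusted": trusted}).encode()
        tag = hmac.new(PLATFORM_SECRET, payload, hashlib.sha256).hexdigest()
        return base64.b64encode(payload).decode() + "." + tag

    def accept_score_for_leaderboard(session_id: str, score: int, verdict_token: str) -> bool:
        # Game server: no anticheat of its own, just checks the platform's verdict
        # before letting the score count toward the global leaderboard.
        payload_b64, tag = verdict_token.rsplit(".", 1)
        payload = base64.b64decode(payload_b64)
        expected = hmac.new(PLATFORM_SECRET, payload, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(tag, expected):
            return False  # token forged or tampered with
        verdict = json.loads(payload)
        return verdict["session"] == session_id and verdict["trusted"]

    token = platform_issue_verdict("abc123", trusted=True)
    print(accept_score_for_leaderboard("abc123", 9001, token))  # True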

If you've never had something you've made violated by users and been unable to stop them, you may not be able to empathize with people in those situations wishing for a solution to exist.

realusername No.36873369
As an indie dev, I see the so-called "trusted" environment as an obstacle to product distribution: it increases friction and actively reduces revenue. I don't want any of that.
charcircuit No.36873981
I would recommend not using it at first. If things start to get problematic, you can start requiring a trusted environment for some things. For example, you could still save a user's high score even if they are on an untrusted device; it just won't show up on a global leaderboard.
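
A rough sketch of that gating in Python, with is_attested standing in for whatever trusted-environment signal the platform would actually expose:

    from dataclasses import dataclass

    @dataclass
    class ScoreEntry:
        player: str
        score: int
        leaderboard_eligible: bool

    scores: list[ScoreEntry] = []

    def submit_score(player: str, score: int, is_attested: bool) -> None:
        # Every player keeps their score, trusted device or not.
        scores.append(ScoreEntry(player, score, leaderboard_eligible=is_attested))

    def global_leaderboard(top: int = 10) -> list[ScoreEntry]:
        # Only attested submissions compete globally.
        eligible = [s for s in scores if s.leaderboard_eligible]
        return sorted(eligible, key=lambda s: s.score, reverse=True)[:top]

    submit_score("alice", 1200, is_attested=True)
    submit_score("bob", 99999, is_attested=False)  # saved, but kept off the global board
    print([s.player for s in global_leaderboard()])  # ['alice']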