
224 points azhenley | 7 comments
nhod No.45075076
A million years ago in AI time, AKA yesterday, there was an HN post from John Carmack about how Meta wasted a ton of time and money making XROS, and how nowadays it doesn’t make any sense to write a new OS [1].

And then this post today, which makes a very strong case for it. (Yes, a VM isn’t an entire OS. Yes, it would be lighter weight than a complete OS. Yes, it would be industry-wide. Yes, we’d likely use an existing OS or codebase to start. Yes, nuance.)

[1] https://news.ycombinator.com/item?id=45066395

replies(4): >>45075289 >>45075402 >>45075695 >>45078397
1. 7373737373 No.45075695
WebAssembly, with its sandboxing-by-default paradigm, is pretty much halfway there; it just needs a well-defined interface for transferring data and access rights between instances, and for creating new instances from others.
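A minimal sketch of what that looks like today, assuming the wasmtime crate's classic Rust embedding API (roughly the wasmtime 1.x era; the newer component-model API differs) and a hypothetical plugin.wasm guest and ./sandbox host directory. The host mints the guest's entire world, so the guest holds only the access rights the host explicitly passes in:

```rust
// Sketch only. Assumed Cargo deps (versions approximate):
// wasmtime = "13", wasmtime-wasi = "13", cap-std = "2", anyhow = "1"
use anyhow::Result;
use wasmtime::{Engine, Linker, Module, Store};
use wasmtime_wasi::sync::WasiCtxBuilder;
use wasmtime_wasi::WasiCtx;

fn main() -> Result<()> {
    let engine = Engine::default();
    let mut linker: Linker<WasiCtx> = Linker::new(&engine);
    // Wire the WASI host functions into the linker.
    wasmtime_wasi::add_to_linker(&mut linker, |ctx| ctx)?;

    // Grant capabilities explicitly: stdout, plus ONE preopened
    // directory mapped to "/data" inside the guest. No network,
    // no ambient filesystem access, nothing else.
    let dir = cap_std::fs::Dir::open_ambient_dir("./sandbox", cap_std::ambient_authority())?;
    let wasi = WasiCtxBuilder::new()
        .inherit_stdout()
        .preopened_dir(dir, "/data")?
        .build();

    let mut store = Store::new(&engine, wasi);
    let module = Module::from_file(&engine, "plugin.wasm")?; // hypothetical guest
    linker.module(&mut store, "", &module)?;

    // Run the guest's default export; any attempt to open a path
    // outside /data fails inside the sandbox.
    linker
        .get_default(&mut store, "")?
        .typed::<(), ()>(&store)?
        .call(&mut store, ())?;
    Ok(())
}
```

The cap_std::ambient_authority() call is telling: the host must invoke its ambient authority explicitly to carve out the one directory the guest will ever see.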
replies(3): >>45075718 >>45075919 >>45081985
2. spankalee No.45075718
That's what WASI components already are.
3. cpuguy83 No.45075919
https://microsoft.github.io/wassette/ does just this, using WASI components.
4. saagarjha No.45081985
This is a technical solution to a social problem.
replies(1): >>45082543
5. 7373737373 No.45082543
This is NOT (just) a social problem! See: supply chain attacks and the confused deputy problem (https://en.wikipedia.org/wiki/Confused_deputy_problem).

No number of signature schemes and trust networks will be able to prevent the effects of actual security breaches, or of problems arising from programming errors; only a technical solution can!

It's stupid to rely on trust when one doesn't have to, and to grant programs, imported modules or even individual functions more permissions than they need. Technical systems should give the best guarantees they can, and not risk the security of the entire system by default just because something failed at the social layer, or because some component somewhere in the system misused (perhaps even by accident!) its ambient authority: https://en.wikipedia.org/wiki/Ambient_authority

The attack surface of even individual programs can be minimized, and therefore SHOULD be. It's just that contemporary popular programming languages do not give programmers any methods (high-level or even primitive) to achieve interior compartmentalization and utilize capability-based security (https://en.wikipedia.org/wiki/Capability-based_security) in order to implement the principle of least privilege (https://en.wikipedia.org/wiki/Principle_of_least_privilege).
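To make the capability style concrete, here is a minimal sketch in plain Rust (all names hypothetical): the right to read under one directory is reified as a value, so each function's signature states exactly which rights it holds.

```rust
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

/// A capability granting read access under one directory only.
/// Holding a value of this type IS the permission; there is no
/// ambient authority for a component to reach for instead.
struct DirReadCap {
    root: PathBuf,
}

impl DirReadCap {
    /// Only trusted setup code (e.g. main) mints capabilities.
    fn new(root: impl Into<PathBuf>) -> Self {
        Self { root: root.into() }
    }

    /// Read a file, refusing paths that escape the granted root.
    fn read(&self, rel: &Path) -> io::Result<Vec<u8>> {
        let root = self.root.canonicalize()?;
        let full = self.root.join(rel).canonicalize()?;
        if !full.starts_with(&root) {
            return Err(io::Error::new(
                io::ErrorKind::PermissionDenied,
                "path escapes the granted root",
            ));
        }
        fs::read(full)
    }
}

/// An "untrusted" component: its signature reifies exactly what it
/// may touch. Without a capability argument it could touch nothing.
fn word_count(files: &DirReadCap, name: &Path) -> io::Result<usize> {
    let bytes = files.read(name)?;
    Ok(String::from_utf8_lossy(&bytes).split_whitespace().count())
}

fn main() -> io::Result<()> {
    // main holds ambient authority and narrows it before delegating.
    let docs = DirReadCap::new("./docs");
    println!("{} words", word_count(&docs, Path::new("readme.txt"))?);
    Ok(())
}
```

Note that in an ordinary language this is only a convention: nothing stops word_count from calling std::fs directly, which is exactly the missing interior compartmentalization; a sandbox like WebAssembly's turns the convention into a guarantee.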

A program does what it does, and it always potentially could do everything it is allowed to. When you use code from thousands of developers, across the depth and breadth of your tech stack, social trust doesn't scale. Reifying the access rights the components of a program hold, and making them explicit, does scale. Ill effects are then limited to the rights that have been explicitly granted, and to the results that are further processed by other components of the program.

Social assurances are practically worthless because they may be misinterpreted, bypassed, subverted or coerced. Technical guarantees, by contrast, can be formalized and verified.

replies(2): >>45090871 >>45092119
6. lucketone No.45090871
> Technical systems should give the best guarantees they can, and not risk the security of the entire system by default

True, and at the same time this has a social aspect: somebody needs to list all the required capabilities/accesses, a developer might opt to request too many permissions, and a casual user might allow that (out of a mix of incompetence and lack of interest).

7. saagarjha No.45092119
I view a breach as a socially determined outcome, though. Yes, your library might be sandboxed, but for it, accessing the internet might be OK, while for you that means you are leaking PII. This is a difficult problem to solve.