
2071 points K0nserv | 1 comment | source
idle_zealot ◴[] No.45088298[source]
This makes the point that the real battle we should be fighting is not for control of Android/iOS, but the ability to run other operating systems on phones. That would be great, but as the author acknowledges, building those alternatives is basically impossible. Even assuming that building a solid alternative is feasible, though, I don't think their point stands. Generally I'm not keen on legislatively forcing a developer to alter their software, but let's be real: Google and Apple have more power than most nations. I'm all for mandating that they change their code to be less user-hostile, for the same reason I prefer democracy to autocracy. Any party with power enough to impact millions of lives needs to be accountable to those it affects. I don't see the point of distinguishing between government and private corporation when that corporation is on the same scale of power and influence.
replies(14): >>45088317 #>>45088413 #>>45088437 #>>45088617 #>>45088634 #>>45088767 #>>45088805 #>>45088812 #>>45089073 #>>45089349 #>>45089473 #>>45089554 #>>45089569 #>>45091038 #
AtlasBarfed ◴[] No.45088617[source]
This is one of the real canaries I watch for "real AI" in programming.

It should be able to make an OS. It should be able to write drivers. It should be able to port code to new platforms. It should be able to transpile compiled binaries (which are, after all, just programs in another language) across architectures.

Sure seems we are very far from that, but really these are breadth-based problems with extensive examples and training sources. This SHOULD be something LLMs are good at, as opposed to new/novel/deep/difficult problems. What I described is labor-intensive and complicated, but not "difficult".
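
To make the transpilation point concrete, here's a toy sketch in C (everything here is my own illustration: real translators like QEMU's TCG or Rosetta 2 handle thousands of instruction forms plus flags, memory models, and self-modifying code, which is where the labor lives). It statically rewrites one AArch64 instruction as x86-64 machine code:

    #include <stdint.h>
    #include <stdio.h>

    /* Toy static translator: turn AArch64 "add x0, x0, #imm12" into
       x86-64 "add rax, imm32" (REX.W + 05 id). Returns bytes emitted,
       or 0 if the instruction isn't one we recognize. */
    static size_t translate_add_imm(uint32_t insn, uint8_t *out) {
        /* ADD (immediate), 64-bit, shift=0: bits 31..22 == 0b1001000100 */
        if ((insn & 0xFFC00000u) != 0x91000000u) return 0;
        if ((insn & 0x1Fu) != 0 || ((insn >> 5) & 0x1Fu) != 0)
            return 0;                              /* only handle Rd = Rn = x0 */
        uint32_t imm = (insn >> 10) & 0xFFFu;
        out[0] = 0x48; out[1] = 0x05;              /* REX.W; ADD RAX, imm32 */
        out[2] = imm & 0xFF; out[3] = (imm >> 8) & 0xFF;
        out[4] = 0; out[5] = 0;                    /* imm12 always fits in imm32 */
        return 6;
    }

    int main(void) {
        uint32_t insn = 0x91000000u | (42u << 10); /* add x0, x0, #42 */
        uint8_t buf[8];
        size_t n = translate_add_imm(insn, buf);
        for (size_t i = 0; i < n; i++) printf("%02x ", buf[i]);
        printf("\n");                              /* prints: 48 05 2a 00 00 00 */
        return 0;
    }

The mechanical part really is mechanical; the breadth (every opcode, every edge case) is the mountain of work.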

And would any corporate AI allow that?

We should be pretty paranoid about centralized control attempts, especially in tech. This is a ... fragile ... time.

replies(3): >>45088673 #>>45088679 #>>45089838 #
beeflet ◴[] No.45089838[source]
>It should be able to make an OS. It should be able to write drivers.

How is it going to do that without testing (and potentially bricking) hardware in real life?
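
Driver bring-up is mostly raw register pokes against a datasheet, and the failure mode isn't a stack trace. A hypothetical sketch (device, offset, and semantics all invented for illustration):

    #include <stdint.h>

    #define PMIC_VCORE 0x10u  /* hypothetical core-voltage register offset */

    static inline void mmio_write32(volatile uint32_t *base, uint32_t off,
                                    uint32_t val) {
        base[off / 4] = val;  /* volatile: the store must reach the bus */
    }

    /* Nothing here is wrong in a way a compiler, test suite, or LLM can
       see -- but an out-of-range value can overvolt the SoC or corrupt
       boot state. Only the datasheet and real hardware can tell you. */
    void set_core_voltage(volatile uint32_t *pmic, uint32_t millivolts) {
        mmio_write32(pmic, PMIC_VCORE, millivolts);
    }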

>It should be able to transpile compiled binaries (which are just languages of a different language) across architectures

I don't know why you would use an LLM for that. Couldn't you just distribute the binaries in some intermediate format, or decompile them to a comprehensible source format first?
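
The intermediate-format route is already standard practice: compile once to LLVM bitcode and lower it per target, no LLM involved. A sketch, with the usual clang/llc invocations in the comment (exact flags can vary by toolchain version):

    /* portable.c
     *
     *   clang -O2 -c -emit-llvm portable.c -o portable.bc   # LLVM bitcode
     *   llc -march=aarch64 portable.bc -o portable-arm64.s  # lower to ARM
     *   llc -march=x86-64  portable.bc -o portable-x86.s    # lower to x86
     *
     * The cross-architecture translation happens in llc and is entirely
     * mechanical once the program exists in a form above machine code. */
    #include <stdio.h>

    int main(void) {
        printf("same bitcode, any target\n");
        return 0;
    }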

replies(1): >>45094202 #
AtlasBarfed ◴[] No.45094202[source]
I agree that it's a challenging problem.

My line of thinking is that AI is essentially really good at breadth-based problems that draw on wide knowledge.

An operating system is a specific, well-known set of problems; generally there's no novel technology involved. An OS is a massive amount of work, but it's technical drudgery.

If there's a large amount of source code, a great deal of discussion of that source code, and lots of other working examples, and you're really just doing a derivative n + 1 design or adaptation of an existing product, that sounds like something an LLM can do.

Obviously I'm not talking about vibe-coding an OS. But could an LLM do 99% of that and vastly reduce the amount of work needed to get an OS working with your hardware, with the big assumption that you have access to specs or some way of obtaining them?
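
To put a face on "work you could derive from specs": a polled-output driver for the classic, well-documented 16550 UART is basically a transcription of its register map. A minimal sketch (the MMIO base address is board-specific; 0x10000000 here is just an example):

    #include <stdint.h>

    #define UART_THR 0x00  /* transmit holding register (write) */
    #define UART_LSR 0x05  /* line status register */
    #define LSR_THRE 0x20  /* bit 5: transmitter can accept a byte */

    static volatile uint8_t *const uart =
        (volatile uint8_t *)0x10000000;  /* board-specific MMIO base */

    void uart_putc(char c) {
        while (!(uart[UART_LSR] & LSR_THRE))
            ;  /* poll until the holding register is empty */
        uart[UART_THR] = (uint8_t)c;
    }

    void uart_puts(const char *s) {
        while (*s)
            uart_putc(*s++);
    }

Multiply that by every device in the machine and you have the OS-porting workload: huge, derivative, and spec-driven.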