
2071 points K0nserv | 2 comments
idle_zealot No.45088298
This makes the point that the real battle we should be fighting is not for control of Android/iOS, but the ability to run other operating systems on phones. That would be great, but as the author acknowledges, building those alternatives is basically impossible. Even assuming that building a solid alternative is feasible, though, I don't think their point stands. Generally I'm not keen on legislatively forcing a developer to alter their software, but let's be real: Google and Apple have more power than most nations. I'm all for mandating that they change their code to be less user-hostile, for the same reason I prefer democracy to autocracy. Any party with power enough to impact millions of lives needs to be accountable to those it affects. I don't see the point of distinguishing between government and private corporation when that corporation is on the same scale of power and influence.
AtlasBarfed No.45088617
This is one of the real canaries I watch for "real AI" in programming.

It should be able to make an OS. It should be able to write drivers. It should be able to port code to new platforms. It should be able to transpile compiled binaries (which are just programs in another kind of language) across architectures.

It sure seems we are very far from that, but really these are breadth-based knowledge tasks with extensive examples and training sources. This SHOULD be something LLMs are good at, as opposed to new/novel/deep/difficult problems. What I described are labor-intensive and complicated tasks, but not "difficult" ones.

And would any corporate AI allow that?

We should be pretty paranoid about centralized control attempts, especially in tech. This is a ... fragile ... time.

ACCount37 No.45088679
AI kicks ass at a lot of "routine reverse engineering" tasks already.

You can feed it assembly listings, or bytecode that the decompiler couldn't handle, and get back solid results.

And corporate AIs don't really have a fuck to give, at least not yet. You can sic Claude on obvious decompiler outputs, or a repo of questionable sources with a "VERY BIG CORPO - PROPRIETARY AND CONFIDENTIAL" in every single file, and it'll sift through it - no complaints, no questions asked. And if that data somehow circles back into the training eventually, then all the funnier.
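A minimal sketch of that "feed the model a listing" workflow, using Python bytecode as the target since the stdlib `dis` module makes it self-contained. The `mystery` function and the prompt text are illustrative stand-ins; the actual model call is deliberately left out:

```python
import dis
import io

def mystery(a, b):
    # stand-in for a function recovered from a .pyc you couldn't decompile
    return a * a + b

# capture the bytecode disassembly as text instead of printing it
buf = io.StringIO()
dis.dis(mystery, file=buf)
listing = buf.getvalue()

# wrap the listing in a prompt; sending it to an LLM is elided here
prompt = (
    "Here is a Python bytecode disassembly. "
    "Reconstruct equivalent source code:\n\n" + listing
)
print(prompt)
```

The same shape works for native code: swap the `dis` step for `objdump -d` output and paste the listing in.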

AtlasBarfed No.45089113
That's one of the things that keeps boiling up for me. Why would lack of Linux support for hardware still be a thing? If AI can write the drivers in 1/10th the effort/time, it should be a game changer for open source.

I haven't heard much from the major projects yet, but I don't have my ear to the ground.

I guess that is what is disappointing. So far it's all (to quote n-gage) webshit that you see it being used for, and, to your point, corpo-code.

ACCount37 No.45089171
AI can't write full drivers, and certainly not to mainline Linux quality. But it does make "take apart a proprietary driver to figure out how it works" much easier.