btown No.39345221
Why would this not be AMD’s top priority among priorities? Someone recently likened the situation to an Iron Age where NVIDIA owns all the iron. And this sounds like AMD knowing about a new source of ore and not even being willing to sink a single engineer’s salary into exploration.

My only guess is that they have a parallel skunkworks working on the same thing, but in a way they can keep closed-source: that this was a hedge they think they no longer need, and they are missing the forest for the trees on the benefits of cross-pollination and an open-source ethos to their business.

hjabird No.39345853
The problem with effectively supporting CUDA is that it encourages CUDA adoption all the more strongly. Meanwhile, AMD will always be playing catch-up, forever having to patch issues, work around Nvidia/AMD differences, and accept the performance penalty that comes from running code optimised for another vendor's hardware. AMD needs to encourage developers to use their own ecosystem or an open standard.
slashdev No.39345944
With Nvidia controlling 90%+ of the market, this is not a viable option. They'd better lean hard into CUDA support if they want to be relevant.
cduzz No.39346142
A bit of story telling here:

IBM and Microsoft made OS/2. The first version worked on 286s and was stable but useless.

The second version worked only on 386s and was quite good; it even had wonderful Windows 3.x compatibility. "Better Windows than Windows!"

At that point Microsoft wanted out of the deal so they could build their new version of Windows, NT, which they did.

IBM now had a competitor to "new" Windows and a very compatible version of "old" Windows. Microsoft killed OS/2 in a variety of ways (including just letting IBM be IBM), but also by making it very difficult for last month's version of OS/2 to run next month's bunch of Windows programs.

To bring this back to the point: IBM vs Microsoft is akin to AMD vs Nvidia, where Nvidia owns the standard that AMD is implementing. If you play in the backward-compatibility realm, you're always going to be playing catch-up, and likely always in a position where winning is exceedingly hard.

As WOPR once said "interesting game; the only way to win is to not play."

incrudible No.39346399
Windows before NT was crap, so users had an incentive to upgrade. If a Windows 7 alternative had existed that was near-fully compatible and FOSS, I would wager Microsoft would have lost to it with Windows 8 and even 10. For most people, the only reason to upgrade was Microsoft dropping support.

For CUDA, it is not just AMD who would need to catch up. Developers also are not necessarily going to target the latest feature set immediately, especially if it only benefits (or requires) new hardware.

I accept the final statement, but that also means AMD for compute is gonna be dead like OS/2. Their stack just will not reach critical mass.

BizarroLand No.39347325
Today's Linux OSes would have competed incredibly strongly against Vista and probably would have gone blow for blow with 7.

Proton, Wine, and all of the compatibility fixes and driver improvements that the community has made in the last 16 years have been amazing, and every day you can say it has never been easier to switch away from Windows.

However, Microsoft has definitely been drinking the IBM Kool-Aid a little too long and has lost the mandate of heaven. I think in the next 7-10 years we will reach a point where there is nothing Windows can do that Linux cannot do better and easier, without spying on you. We may be 3-5 years from a "killer app" that is specifically built to be incompatible with Windows just as a big FU to them, possibly in VR, possibly in AR, and once that happens maybe, maybe, maybe it will finally actually be the year of the Linux desktop.

paulmd No.39348271
> However, Microsoft has definitely been drinking the IBM Kool-Aid a little too long and has lost the mandate of heaven. I think in the next 7-10 years we will reach a point where there is nothing Windows can do that Linux cannot do better and easier, without spying on you

That's a fascinating statement given the clear ascendancy of neural-assisted algorithms. Things like DLSS are the future: small models that quietly optimize some part of a workload that was once considered impossible, to the point that nobody even thinks about it anymore.

My prediction is that in 10 years we will be looking at the rise of tag/collection-based filesystems and operating-system paradigms. All of us constantly generate a huge amount of "digital garbage", and you either sort it into the important stuff, the keep-temporarily stuff, and the toss pile, or you accumulate a giant digital garbage heap. AI systems are going to automate that process. It will start on traditional tree-based systems, but eventually you don't need the tree at all; AI is what's going to make the pivot to true tag/collection systems possible.

Tags mostly haven't worked because of a handful of individual issues that AI pretty much solves. Tags aren't specific enough? AI can give you good guesses at relevance. Tagging files and maintaining collections is a pain? AI can generate tags and assign collections for you. Tags really require an ontology for "fuzzy" matching (a search for "food" should return the tag "hot dog")? LLMs understand ontologies fine. And if you do it right, the AI can generate an "inbox/outbox" for you, deduplicate files, handle versioning, and so on, all relatively seamlessly. A rough sketch of the fuzzy-matching idea is below.
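
A minimal sketch of what that fuzzy matching could look like. To be clear, this is an illustration, not anything shipping today: the toy hand-written vectors and the names TAG_VECTORS/match_tags are made up, standing in for whatever a real embedding model would provide.

    # Fuzzy tag matching via embedding similarity: a query like "food"
    # matches the tag "hot dog" even though the strings share nothing.
    # Toy hand-written 3-d vectors stand in for a real model's output.
    import math

    TAG_VECTORS = {              # hypothetical tag -> embedding table
        "hot dog":    [0.9, 0.8, 0.1],
        "sunset":     [0.1, 0.2, 0.9],
        "tax return": [0.2, 0.9, 0.3],
    }

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.hypot(*a) * math.hypot(*b))

    def match_tags(query_vec, threshold=0.9):
        # Return every tag whose embedding sits close to the query's.
        return [tag for tag, vec in TAG_VECTORS.items()
                if cosine(query_vec, vec) >= threshold]

    # A real model would embed "food" near "hot dog":
    print(match_tags([0.85, 0.75, 0.15]))  # -> ['hot dog']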

Microsoft and Apple are both clearly racing for this with the "AI OS" concept. It's not just better relevance searches. And the "generate a whole paragraph before you even know what I'm trying to type" stuff is not how it's going to work either. That stuff is like specular highlights in video games around 2007: once developers had the tool, for a few years everything was w e t until they learned some restraint with it. But there are very, very good applications coming in that 10-year window that are going to reduce operator cognitive load by a lot; that is the "AI OS" concept. What would the OS look like if you truly had the "computer is my secretary" idea? Not just dictating memorandums, but assistance in keeping your life in order and keeping you on-task.

I simply cannot see Linux being able to keep up with this change, in the same way the kernel can't just switch to Rust: at some point you are too calcified to ever do the big-bang rewrite if there is no BDFL telling you it's got to happen.

The downside of being "the bazaar" is that you are standards-driven and have to corral a million whiny nerds constantly complaining about "spying on me just like Microsoft" and pushing in their own directions (the sysvinit/upstart/systemd factions, etc.), on top of all the other technical issues of a big-bang rewrite. Linux is too calcified to ever pivot away from being a tree-based OS, and it's going to be another 2-3 decades before it catches up with "proper support for new file-organization paradigms" even in the smaller sense.

That's really just the tip of the iceberg of the things AI is going to change, and Linux is probably going to be left out of most of those commercial applications despite being where the research is done. It's just too much of a mess, with too many nerdlingers pushing back, to ever get anything done. Unix will be represented in this new paradigm, but not Linux: the commercial operators who have the centralization and fortitude to build a cathedral will get there much quicker, and that looks like macOS or Solaris, not Linux.

Or at least, unless I see some big announcement from KDE or GNOME or Canonical/Red Hat about a big AI-OS rewrite... I assume that's pretty much where the center of gravity is going to stay for Linux.

incrudible No.39350910
"Neural assisted algorithms" are just algorithms with large lookup tables. Another magnitude of binary bloat, but that's nothing we haven't experienced before. There's no need to fundamentally change the OS paradigm for it.
paulmd No.39351693
I think we're well past the "DLSS is just FSR2 with lookup tables; you can ALWAYS replicate the outcomes of neural algorithms with deterministic ones" phase, imo.

If that's the case, there are billion-dollar opportunities waiting for you to prove it!

incrudible No.39361900
Floating point inaccuracies and random seeds aside, something like DLSS is entirely deterministic. It is just a bunch of matrix multiplications.
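
A toy illustration of that point (hand-written weights standing in for a real upscaler's trained weights, nothing more): fix the weights and the input, and the output is identical on every run.

    # A trained network is a fixed function: same weights, same input,
    # same output, run after run. Toy weights stand in for a real model.
    W = [[0.5, -1.0, 0.25],
         [1.5,  0.0, -0.75]]
    B = [0.1, -0.2, 0.3]

    def relu(v):
        return v if v > 0.0 else 0.0

    def forward(x):
        # y_j = relu(sum_i x[i] * W[i][j] + B[j]): plain matrix math.
        return [relu(sum(x[i] * W[i][j] for i in range(len(x))) + B[j])
                for j in range(len(B))]

    # Bit-identical output on every call:
    assert forward([1.0, 2.0]) == forward([1.0, 2.0])
    print(forward([1.0, 2.0]))  # -> [3.6, 0.0, 0.0] every time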
paulmd No.39407619
You can’t possibly expect me to take your post seriously when there’s not even any true evidence of cognition involved in its writing. Just some meat flopping around spastically from some chemicals pumped up from the gut, and electrical zaps from the nervous system.

We can see that it's not magic; the neuron either activates or it doesn't, so why should I pay attention to some probabilistic stream of gibberish it spewed out? There is nothing meaningful that can be inferred from such systems, right?