1045 points mfiguiere | 26 comments
btown ◴[] No.39345221[source]
Why would this not be AMD’s top priority among priorities? Someone recently likened the situation to an Iron Age where NVIDIA owns all the iron. And this sounds like AMD knowing about a new source of ore and not even being willing to sink a single engineer’s salary into exploration.

My only guess is that they have a parallel skunkworks working on the same thing, but in a way they can keep closed-source: that this was a hedge they think they no longer need, and that they're missing the forest for the trees on the benefits of cross-pollination and an open-source ethos for their business.

replies(14): >>39345241 #>>39345302 #>>39345393 #>>39345400 #>>39345458 #>>39345853 #>>39345857 #>>39345893 #>>39346210 #>>39346792 #>>39346857 #>>39347433 #>>39347900 #>>39347927 #
1. izacus ◴[] No.39345400[source]
Why do you think running after nVidia for this submarket is a good idea for them? The AMD GPU team isn't especially big and the development investment is massive. Moreover, they'd incur an opportunity cost on the projects they're now dominating in (all game consoles, for example).

Do you expect them to be able to capitalize on the AI fad so much (and quickly enough!) that it's worth dropping the ball on projects they're now doing well in? Or should they instead keep investing in the part of the market where they're doing much better than nVidia?

replies(9): >>39345487 #>>39345509 #>>39345522 #>>39345581 #>>39345589 #>>39345620 #>>39345680 #>>39345715 #>>39346064 #
2. jandrese ◴[] No.39345487[source]
If the alternative is to ignore one of the biggest developing markets, then yeah, maybe they should start trying to catch up. Unless you think GPU compute is a fad that's going to fizzle out?
replies(1): >>39345547 #
3. nindalf ◴[] No.39345509[source]
AMD is betting big on GPUs. They recently released the MI300, which has "2x transistors, 2.4x memory and 1.6x memory bandwidth more than the H100, the top-of-the-line artificial-intelligence chip made by Nvidia" (https://www.economist.com/business/2024/01/31/could-amd-brea...).

They very much plan to compete in this space and hope to ship $3.5B of these chips in the next year. That's small compared to Nvidia's revenues of $59B (which include both consumer and data centre), but AMD hopes to match them. It's too big a market to ignore, and they have the hardware chops to do it. What they lack is software, and it's unclear if they'll ever figure that out.

replies(1): >>39346576 #
4. throwawaymaths ◴[] No.39345522[source]
IIRC (this could be old news) AMD GPUs are preferred in the supercomputer segment because they offer better FLOPS per unit of energy. However, without a CUDA-like stack you're missing out on the AI part of supercompute, which is an increasingly large proportion of it.

The margins on supercompute-related sales are very high. Simplifying, but you can basically take a consumer chip, unlock a few things, add more memory capacity, relicense, and your margin goes up by a huge factor.

replies(2): >>39345747 #>>39346067 #
5. izacus ◴[] No.39345547[source]
One of the most important decisions a company can make is which markets to focus on and which to stay out of. This is true even for megacorps (see: Google and their parade of mess-ups). There's just not enough time to be in all markets at once.

So, again, it's not at all clear that AMD being in the compute GPU game is the automatic win for them in the future. There's plenty of companies that killed themselves chasing big, profitable new fad markets (see: Nokia and Windows Phone, among many other cases).

So let's examine that: does AMD actually have a good shot at taking a chunk of this market significant enough to offset not investing in some other market?

replies(4): >>39345850 #>>39345929 #>>39346121 #>>39346190 #
6. hnlmorg ◴[] No.39345581[source]
Using GPUs for compute has been a thing since the 00s. Regardless of whether AI is a fad (it isn't, but we can agree to disagree on this one), not investing more in GPU compute is a weird decision.
7. FPGAhacker ◴[] No.39345589[source]
It was Microsoft’s strategy for several decades (outsiders called it embrace, extend, extinguish, only partially in jest). It can work for some companies.
8. currymj ◴[] No.39345620[source]
Everyone buying GPUs for AI and scientific workloads wishes AMD were a viable option, and this has been true for almost a decade now.

The hardware is already good enough; people would be happy to use it and accept that it's not quite as optimized for DL as Nvidia's.

People would even accept that the software is not as optimized as CUDA, I think, as long as it is correct and reasonably fast.

The problem is just that every time I've tried it, it's been a pain in the ass to install, and there are always weird bugs and crashes. I don't think it's hubris to say that they could fix these sorts of problems if they had the will.

9. bonton89 ◴[] No.39345680[source]
AMD also has the problem that they make much better margins on their CPUs than on their GPUs and there are only so many TSMC wafers. So in a way making more GPUs is like burning up free money.
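
A toy Python sketch of that wafer opportunity-cost argument (every figure below is an illustrative assumption, not a real AMD or TSMC number):

    # Toy wafer-allocation comparison. All figures are illustrative
    # assumptions, not real AMD/TSMC numbers.
    good_cpu_chiplets_per_wafer = 600   # assumed yield of small CPU chiplets
    margin_per_cpu_chiplet = 150        # USD, assumed
    good_gpu_dies_per_wafer = 60        # assumed yield of large GPU dies
    margin_per_gpu_die = 300            # USD, assumed

    cpu_margin_per_wafer = good_cpu_chiplets_per_wafer * margin_per_cpu_chiplet
    gpu_margin_per_wafer = good_gpu_dies_per_wafer * margin_per_gpu_die

    # With these made-up numbers a CPU wafer earns ~5x the margin of a GPU
    # wafer, which is the opportunity cost of spending scarce wafers on GPUs.
    print(cpu_margin_per_wafer, gpu_margin_per_wafer)  # 90000 18000
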
10. carlossouza ◴[] No.39345715[source]
Because the supply for this market is constrained.

It's a pure business decision based on simple math.

If the estimated revenues from selling to the underserved market are higher than the cost of funding the project (they probably are, considering the obscene margins from NVIDIA), then it's a no-brainer.

11. Symmetry ◴[] No.39345747[source]
It's more that the resource balance in AMD's compute line of GPUs (the CDNA ones) is tilted toward the double-precision operations that most supercomputer code makes heavy use of.
replies(1): >>39351166 #
12. jandrese ◴[] No.39345850{3}[source]
AMD is literally the only company on the market poised to exploit the explosion in demand for GPU compute after nVidia (sorry, Intel). To not even really try to break in is insanity. nVidia didn't grow their market cap by 5x over the course of a year because people really got into 3D gaming. Even if AMD is just an also-ran riding nVidia's coattails with a compatibility glue library, the market is clearly demanding more product.
replies(2): >>39345979 #>>39348734 #
13. thfuran ◴[] No.39345929{3}[source]
Investing in what other market?
14. justinclift ◴[] No.39345979{4}[source]
Isn't Intel's next gen GPU supposed to be pretty strong on compute?

Read an article about it recently, but when trying to remember the details / find it again just now I'm not seeing it. :(

replies(3): >>39346397 #>>39346939 #>>39348206 #
15. yywwbbn ◴[] No.39346064[source]
Because their current market valuation was massively inflated by the AI/GPU boom and/or bubble?

In a rational world their stock price would collapse if they didn't focus on it and couldn't deliver anything competitive in the upcoming year or two.

> of the market where they're doing much better than nVidia?

So the market that's hardly growing, that Nvidia isn't competing in, and where Intel still has a bigger market share and is catching up performance-wise? AMD's valuation is this high only because they are seen as the only company that could directly compete with Nvidia in the data center GPU market.

16. anonylizard ◴[] No.39346067[source]
They are preferred not because of any inherent superiority of AMD GPUs, but simply because AMD has to price lower and accept lower margins.

Nvidia could always just halve their prices one day and wipe out every non-state-funded competitor. But Nvidia prefers to collect their extreme margins and funnel them into even more R&D in AI.

17. yywwbbn ◴[] No.39346121{3}[source]
> So, again, it's not at all clear that AMD being in the compute GPU game is the automatic win for them in the future. There's

You're right about that, but it seems pretty clear that not being in the compute GPU game is an automatic loss for them (look at their revenue growth by sector over the past quarter or two).

18. imtringued ◴[] No.39346190{3}[source]
Are you seriously telling me they shouldn't invest in one of their core markets? The necessary investments are probably insignificant. Let's say you need a budget of 10 million dollars (50 developers) to assemble a dev team to fix ROCm. How many 7900 XTX cards would they have to sell to break even on revenue? Roughly 9,000. How many did they sell? I'm too lazy to count, but Mindfactory, a German online shop, alone sold around 6k units.
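
A quick back-of-the-envelope version of that break-even math (the per-developer cost and the per-card street price below are my assumptions, not figures from the comment):

    # Rough break-even sketch for the argument above. The $200k/yr loaded cost
    # per developer and the ~$1,100 street price per card are assumptions.
    developers = 50
    cost_per_dev = 200_000                    # USD per year, assumed
    budget = developers * cost_per_dev        # $10,000,000, matches the comment
    price_per_card = 1_100                    # USD per 7900 XTX, assumed
    break_even_cards = budget / price_per_card
    print(f"{break_even_cards:,.0f} cards")   # ~9,091, i.e. roughly 9,000
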
19. jandrese ◴[] No.39346397{5}[source]
Intel is trying, but all of their efforts thus far have been pretty sad and abortive. I don't think anybody is taking them seriously at this point.
20. incrudible ◴[] No.39346576[source]
They are trying to compete in the segment of the data center market where the shots are called by bean counters calculating FLOPS per dollar.
replies(2): >>39347383 #>>39347841 #
21. spookie ◴[] No.39346939{5}[source]
Their oneAPI is really interesting!
22. BearOso ◴[] No.39347383{3}[source]
A market where Nvidia chips are all bought out, so what's left?
23. latchkey ◴[] No.39347841{3}[source]
That's why I'm going to democratize that business and make it available to anyone who wants access. How do bare-metal rentals of MI300X and top-end Epyc CPUs sound? We take on the capex/opex/risk and give people what they want, which is access to HPC clusters.
24. 7speter ◴[] No.39348206{5}[source]
I'm not an expert like you would find here on HN; I'm only really a tinkerer and learner, an amateur at best, but I think Intel's compute is very promising on Alchemist. The A770 beats out the 4060 Ti 16GB in video rendering in DaVinci Resolve and Adobe, and it has AV1 support in the free version of DaVinci Resolve while Lovelace only has AV1 support in Studio. Then for AI, the A770 has had a good showing in Stable Diffusion against Nvidia's midrange Lovelace since the summer: https://www.tomshardware.com/news/stable-diffusion-for-intel...

The big issue for Intel is pretty similar to AMD's: everything is made for CUDA, and Intel has to either build their own solutions or convince people to build support for Intel. While I'm working on learning AI and plan to use an Nvidia card, the progress Intel has made in the last couple of years since introducing their first GPU to market has been pretty wild, and I think it should really give AMD pause.

25. atq2119 ◴[] No.39348734{4}[source]
They are breaking in, though. By all accounts, MI300s are being sold as fast as they can make them.
26. throwawaymaths ◴[] No.39351166{3}[source]
Thanks for clarifying! I had a feeling I had my story slightly wrong.