1045 points mfiguiere | 11 comments
btown ◴[] No.39345221[source]
Why would this not be AMD’s top priority among priorities? Someone recently likened the situation to an Iron Age where NVIDIA owns all the iron. And this sounds like AMD knowing about a new source of ore and not even being willing to sink a single engineer’s salary into exploration.

My only guess is that they have a parallel skunkworks working on the same thing, but in a way that they can keep closed-source - that this was a hedge they think they no longer need, and that they're missing the forest for the trees on the benefits of cross-pollination and an open-source ethos to their business.

replies(14): >>39345241 #>>39345302 #>>39345393 #>>39345400 #>>39345458 #>>39345853 #>>39345857 #>>39345893 #>>39346210 #>>39346792 #>>39346857 #>>39347433 #>>39347900 #>>39347927 #
izacus ◴[] No.39345400[source]
Why do you think running after nVidia in this submarket is a good idea for them? The AMD GPU team isn't especially big, and the development investment is massive. Moreover, they'd incur an opportunity cost on the projects they're now dominating (all game consoles, for example).

Do you expect them to be able to capitalize on the AI fad so much (and quickly enough!) that it's worth dropping the ball on projects they're now doing well in? Or should they instead keep investing in the parts of the market where they're doing much better than nVidia?

replies(9): >>39345487 #>>39345509 #>>39345522 #>>39345581 #>>39345589 #>>39345620 #>>39345680 #>>39345715 #>>39346064 #
1. jandrese ◴[] No.39345487[source]
If the alternative is to ignore one of the biggest developing markets, then yeah, maybe they should start trying to catch up. Unless you think GPU compute is a fad that's going to fizzle out?
replies(1): >>39345547 #
2. izacus ◴[] No.39345547[source]
One of the most important decisions a company can make is which markets to focus on and which to stay out of. This is true even for megacorps (see: Google and their parade of mess-ups). There's just not enough time to be in all markets at once.

So, again, it's not at all clear that being in the compute GPU game is an automatic win for AMD in the future. Plenty of companies have killed themselves chasing big, profitable new fad markets (see: Nokia and Windows Phone, among many other cases).

So let's examine that - does AMD actually have a good shot at taking a chunk of this market significant enough to offset not investing in some other one?

replies(4): >>39345850 #>>39345929 #>>39346121 #>>39346190 #
3. jandrese ◴[] No.39345850[source]
AMD is literally the only company on the market, after nVidia, poised to exploit the explosion in demand for GPU compute (sorry, Intel). Not even really trying to break in is insanity. nVidia didn't grow their market cap 5x over the course of a year because people really got into 3D gaming. Even for an also-ran riding nVidia's coattails with a compatibility glue library, the market is clearly demanding more product.
replies(2): >>39345979 #>>39348734 #
4. thfuran ◴[] No.39345929[source]
Investing in what other market?
5. justinclift ◴[] No.39345979{3}[source]
Isn't Intel's next gen GPU supposed to be pretty strong on compute?

Read an article about it recently, but when trying to remember the details / find it again just now I'm not seeing it. :(

replies(3): >>39346397 #>>39346939 #>>39348206 #
6. yywwbbn ◴[] No.39346121[source]
> So, again, it's not at all clear that AMD being in the compute GPU game is the automatic win for them in the future. There's

You're right about that, but it seems pretty clear that not being in the compute GPU game is an automatic loss for them (look at their recent revenue growth by segment over the past quarter or two).

7. imtringued ◴[] No.39346190[source]
Are you seriously telling me they shouldn't invest in one of their core markets? The necessary investment is probably insignificant. Let's say you need a budget of 10 million dollars (50 developers) to assemble a dev team to fix ROCm. How many 7900 XTXs would they need to sell to break even on that in revenue? Roughly 9000. How many did they sell? I'm too lazy to count, but Mindfactory, a German online shop, alone sold around 6k units.
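
A quick back-of-the-envelope check of that claim (a minimal sketch; the $10M budget and the roughly $1,100 of revenue per card it implies are the comment's own assumptions, not official AMD figures):

    # Back-of-the-envelope check of the break-even claim above.
    # Assumed figures, taken from the comment rather than from AMD:
    dev_budget = 10_000_000   # ~50 developers for a year, in USD
    card_revenue = 1_100      # rough 7900 XTX street price, in USD

    cards_to_break_even = dev_budget / card_revenue
    print(f"{cards_to_break_even:.0f} cards")  # ~9091, i.e. roughly 9000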
8. jandrese ◴[] No.39346397{4}[source]
Intel is trying, but all of their efforts thus far have been pretty sad and abortive. I don't think anybody is taking them seriously at this point.
9. spookie ◴[] No.39346939{4}[source]
Their oneAPI is really interesting!
10. 7speter ◴[] No.39348206{4}[source]
I'm not an expert like you'd find here on HN; I'm really only a tinkerer and learner, an amateur at best. But I think Intel's compute is very promising on Alchemist. The A770 beats out the 4060 Ti 16GB in video rendering in DaVinci Resolve and Adobe, and it has AV1 support in the free version of DaVinci Resolve, while Lovelace only has AV1 support in Studio. Then for AI, the A770 has had a good showing in Stable Diffusion against Nvidia's midrange Lovelace cards since the summer: https://www.tomshardware.com/news/stable-diffusion-for-intel...

The big issue for Intel is pretty similar to AMD's: everything is made for CUDA, and Intel has to either build their own solutions or convince people to build support for Intel. While I'm working on learning AI and plan to use an Nvidia card, the progress Intel has made in the couple of years since bringing their first GPU to market has been pretty wild, and I think it should really give AMD pause.

11. atq2119 ◴[] No.39348734{3}[source]
They are breaking in, though. By all accounts, MI300s are being sold as fast as they can make them.