
1045 points mfiguiere | 1 comment | source
btown ◴[] No.39345221[source]
Why would this not be AMD’s top priority among priorities? Someone recently likened the situation to an Iron Age where NVIDIA owns all the iron. And this sounds like AMD knowing about a new source of ore and not even being willing to sink a single engineer’s salary into exploration.

My only guess is that they have a parallel skunkworks working on the same thing in a way they can keep closed-source, that this was a hedge they think they no longer need, and that they're missing the forest for the trees on the benefits that cross-pollination and an open-source ethos would bring to their business.

replies(14): >>39345241 #>>39345302 #>>39345393 #>>39345400 #>>39345458 #>>39345853 #>>39345857 #>>39345893 #>>39346210 #>>39346792 #>>39346857 #>>39347433 #>>39347900 #>>39347927 #
fariszr ◴[] No.39345241[source]
According to the article, AMD seems to have pulled the plug on this because they think it will hinder ROCm 6 adoption, which, btw, still only supports two consumer cards out of their entire lineup [1].

1. https://www.phoronix.com/news/AMD-ROCm-6.0-Released
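
For what it's worth, checking which gfx target ROCm actually sees on a given card is a one-liner against the HIP runtime. A minimal sketch, assuming a working ROCm/HIP install (compile with hipcc):

    // Query the gfx architecture name ROCm reports for device 0.
    // "Supported" means this string (e.g. "gfx1100") appears on the
    // official ROCm support matrix; most consumer cards' targets don't.
    #include <hip/hip_runtime.h>
    #include <cstdio>

    int main() {
        hipDeviceProp_t props;
        if (hipGetDeviceProperties(&props, 0) != hipSuccess) {
            std::fprintf(stderr, "no HIP device found\n");
            return 1;
        }
        std::printf("device 0: %s (%s)\n", props.name, props.gcnArchName);
        return 0;
    }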

replies(4): >>39345503 #>>39345558 #>>39346200 #>>39346480 #
kkielhofner ◴[] No.39345558[source]
With the most recent supported card being their one-year-old flagship ($1k) consumer GPU...

Meanwhile, CUDA supports anything with Nvidia stamped on it before it's even released. They'll even go as far as adding support for new GPU/compute families to older CUDA versions (see Hopper/Ada support landing in CUDA 11.8).
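
Concretely, that backported support keys off the compute capability number that nvcc's -arch=sm_XY flags target. A minimal sketch of querying it, assuming an installed CUDA toolkit (compile with nvcc):

    // Print the compute capability the driver reports for device 0.
    // CUDA 11.8 added the sm_89 (Ada) and sm_90 (Hopper) targets to an
    // existing 11.x release line, so brand-new cards worked without
    // waiting for a major-version upgrade.
    #include <cuda_runtime.h>
    #include <cstdio>

    int main() {
        cudaDeviceProp props;
        if (cudaGetDeviceProperties(&props, 0) != cudaSuccess) {
            std::fprintf(stderr, "no CUDA device found\n");
            return 1;
        }
        std::printf("device 0: %s, compute capability %d.%d\n",
                    props.name, props.major, props.minor);
        return 0;
    }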

You can go out and buy any Nvidia GPU the day of release, take it home, plug it in, and everything just works. This is what people expect.

AMD seems to have no clue that this level of usability is what it will take to actually compete with Nvidia, and that's a real shame, because their hardware is great.

replies(5): >>39345774 #>>39345894 #>>39346438 #>>39346550 #>>39346788 #
KingOfCoders ◴[] No.39345774[source]
AMD thinks the reason Nvidia is ahead of them is bad marketing on their part and good marketing ("everything is AI") on Nvidia's. They don't see the difference in the software stacks.

For years I've wanted to get off the Nvidia train for AI, but I'm forced to buy another Nvidia card because AMD's stuff just doesn't work, while all the examples work with Nvidia cards as they should.

replies(1): >>39346187 #
fortran77 ◴[] No.39346187[source]
At the risk of sounding like Steve Ballmer, the reason I only use NVIDIA for GPGPU work (our company does a lot of it!) is the developer support. They have compilers, tools, documentation, and tech support for developers who want to do any type of GPGPU computing on their hardware, at a level that just isn't matched on any other platform.