
195 points | rbanffy | 1 comment
pie420 No.42176400
Layperson with no industry knowledge here, but it seems like Nvidia's CUDA moat will fall in the next 2-5 years. Those margins seem impossible to sustain without competition coming in and taking a decent slice of the pie.
YetAnotherNick No.42177944
The CUDA moat is highly overrated for AI in the first place, and it gets sold as the reason for AMD's failure. Almost no one in AI writes CUDA directly; they use PyTorch or Triton. TPUs didn't face much of a hurdle from CUDA because they were initially better in price-to-performance and supported PyTorch, TensorFlow, and JAX.

The reason AMD is behind is that it is behind in hardware. The MI300X is pricier per hour than the H100 on every cloud I can find, and its MFU (model FLOPs utilization) is an order of magnitude lower than NVIDIA's for transformers, even though transformers are fully supported. And I get the same 40-50% MFU on TPU for the same code. If anyone is investing >$10 million in hardware, they can surely invest a million dollars to rewrite everything in whatever language AMD asks, if it works out cheaper.
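For readers unfamiliar with the term: MFU is achieved FLOPs per second divided by the accelerator's theoretical peak. A common back-of-the-envelope for transformer training uses roughly 6 FLOPs per parameter per token (forward plus backward). A minimal sketch, where the parameter count, throughput, and peak-FLOPs figures are illustrative assumptions rather than measurements from the thread:

```python
# Rough MFU (model FLOPs utilization) estimate for transformer training.
# Approximation: ~6 FLOPs per parameter per token (forward + backward),
# ignoring the attention term, which is small at typical context lengths.

def transformer_mfu(n_params: float, tokens_per_sec: float, peak_flops: float) -> float:
    """Achieved FLOPs/s as a fraction of the accelerator's peak FLOPs/s."""
    achieved = 6.0 * n_params * tokens_per_sec
    return achieved / peak_flops

# Illustrative numbers (assumptions, not benchmarks): a 70B-parameter
# model at 1,000 tokens/s per chip, on hardware with ~989 TFLOP/s
# of dense BF16 peak.
print(f"MFU: {transformer_mfu(70e9, 1_000, 989e12):.1%}")  # → MFU: 42.5%
```

Plugging in measured tokens/s for two accelerators with their respective peaks is how per-chip comparisons like the one above are usually made.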

saagarjha No.42190848
People most certainly do use CUDA.