
195 points by rbanffy | 5 comments
pie420
Layperson with no industry knowledge here, but it seems like Nvidia's CUDA moat will fall in the next 2-5 years. It seems impossible to sustain those margins without competition coming in and taking a decent slice of the pie.
metadat
But how will AMD or anyone else push in? CUDA is actually a whole virtualization layer on top of the hardware and isn't easily replicable; Nvidia has been at it for 17 years.

You are right, eventually something's gotta give. The path for this next leg isn't yet apparent to me.

P.S. How much is an exaflop or a petaflop, and how significant is it? The numbers thrown around in this article don't mean anything to me. Is this new cluster way more powerful than the previous top system?

bryanlarsen
Anybody spending tens of billions annually on Nvidia hardware is going to be willing to spend millions to port their software away from CUDA.
talldayo
To slower hardware? What are they supposed to port to, ASICs?
1. adgjlsfhk1
If the hardware is 30% slower and 2x cheaper, that's a pretty great deal.
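
A quick back-of-the-envelope sketch of that tradeoff (the 30%-slower and 2x-cheaper figures are the hypothetical from the comment above, not benchmarks):

    # Hypothetical, normalized numbers from the comment above, not measurements.
    nvidia_perf, nvidia_price = 1.0, 1.0
    alt_perf, alt_price = 0.7, 0.5        # "30% slower", "2x cheaper"

    nvidia_perf_per_dollar = nvidia_perf / nvidia_price   # 1.0
    alt_perf_per_dollar = alt_perf / alt_price            # 1.4
    print(alt_perf_per_dollar / nvidia_perf_per_dollar)   # ~1.4x more throughput per hardware dollar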
2. selectodude
Power density tends to be the limiting factor for this stuff, not money. If it's 30 percent slower per watt, it's useless.
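
A rough illustration of why performance per watt dominates once the facility's power budget, rather than the hardware budget, is the binding constraint (the 20 MW budget and 700 W per card are assumed, illustrative numbers):

    # Assumed, illustrative numbers: a fixed 20 MW facility power budget.
    facility_budget_w = 20e6

    gpu_a_watts, gpu_a_perf = 700, 1.0    # baseline accelerator
    gpu_b_watts, gpu_b_perf = 700, 0.7    # same draw, 30% slower => 30% worse perf/W

    total_a = (facility_budget_w / gpu_a_watts) * gpu_a_perf
    total_b = (facility_budget_w / gpu_b_watts) * gpu_b_perf
    print(total_b / total_a)   # 0.7: the site does 30% less work no matter how cheap the chips are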
3. Wytwwww
The ratio between power usage and GPU cost is very, very different from what it is with CPUs, though. If you could save, say, 20-30% of the purchase price, that might make it worth it.

For example, you could run an H100 at 100% utilization 24/7 for a year at $0.40 per kWh (a deliberately high rate, to account for significant infrastructure overhead) and that would only cost ~10% of the purchase price of the GPU itself.
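
A minimal sketch of that arithmetic, assuming roughly 700 W for an H100 and a purchase price in the $25-30k range (both assumptions, not figures from the article):

    power_kw = 0.7                  # assumed ~700 W draw per H100
    hours = 24 * 365                # one year at 100% utilization
    price_per_kwh = 0.40            # deliberately high rate to cover infrastructure overhead

    energy_cost = power_kw * hours * price_per_kwh   # ~$2,450 per GPU-year
    gpu_price = 27_500                               # assumed H100 purchase price
    print(energy_cost / gpu_price)                   # ~0.09, i.e. roughly 10% of the purchase price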

4. wbl
The cost of power usage isn't the electricity itself but the capacity and cooling.
5. Wytwwww
Yes, I know that; hence I quadrupled the price of electricity. Or are you saying that the cost of capacity and cooling doesn't scale directly with power usage?

We could increase that by another 2x and the cost would still be relatively low compared to the price/depreciation of the GPU itself.
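
Continuing the same assumed numbers from above (700 W, ~$27.5k per H100), doubling the effective electricity rate again:

    energy_cost_2x = 0.7 * 24 * 365 * 0.80   # ~$4,900 per GPU-year at an effective $0.80/kWh
    print(energy_cost_2x / 27_500)           # ~0.18: still a modest fraction of the GPU's price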