
Basic Facts about GPUs (damek.github.io)
338 points by ibobev
elashri ◴[] No.44366911[source]
Good article summarizing a good chunk of information that people should have some idea about. I just want to comment that the title is a little misleading, because the article is really describing the particular choices NVIDIA makes in its GPU architectures, which are not always the choices other vendors make.

For example, the arithmetic-intensity break-even point (the ridge point) is very different once you leave NVIDIA-land. Take the AMD Instinct MI300: up to 160 TFLOPS of FP32 paired with ~6 TB/s of HBM3/3E bandwidth gives a ridge point near 27 FLOPs/byte, roughly double the A100's ~13 FLOPs/byte. The larger on-package HBM (128–256 GB) also shifts the practical trade-offs between tiling depth and occupancy. On the other hand, it is very expensive and does not have CUDA (which can be good and bad at the same time).
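
To make the ridge-point arithmetic concrete, here is a minimal sketch; the MI300 figures are the ones above, while the A100 numbers (~19.5 TFLOPS FP32, ~1.56 TB/s HBM2e) are commonly cited specs I'm assuming, not taken from the article:

    # Roofline ridge point: the arithmetic intensity (FLOPs/byte) at which a kernel
    # stops being memory-bound and becomes compute-bound.
    def ridge_point(peak_tflops: float, bandwidth_tb_per_s: float) -> float:
        # TFLOPS and TB/s share the same 1e12 factor, so the ratio is FLOPs per byte.
        return peak_tflops / bandwidth_tb_per_s

    print(f"MI300: {ridge_point(160.0, 6.0):.1f} FLOPs/byte")   # ~26.7
    print(f"A100:  {ridge_point(19.5, 1.56):.1f} FLOPs/byte")   # ~12.5

Any kernel whose FLOPs-per-byte ratio falls below the ridge point is bandwidth-bound on that part, which is why the same kernel can be compute-bound on an A100 yet memory-bound on an MI300.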

replies(2): >>44367014 #>>44380929 #
apitman ◴[] No.44367014[source]
Unfortunately, Nvidia GPUs are the only ones that matter until AMD starts taking their compute software seriously.
replies(2): >>44367150 #>>44368272 #
tucnak ◴[] No.44368272[source]
Unfortunately, GPUs are old news now. When it comes to perf/watt/dollar, TPUs are substantially ahead for both training and inference. There's a sparsity disadvantage with trailing-edge TPU devices such as the v4, but if you care about large-scale training of any sort, it's not even close. Additionally, Tenstorrent p300 devices are hitting the market soon, and there's a lot of promising stuff coming from the Xilinx side of the AMD shop: the recent Versal chips allow for compute-in-network AI capabilities that put NVIDIA BlueField's supposed programmability to shame. NVIDIA likes to say BlueField is a next-generation SmartNIC, but compared to the actually field-programmable Versal parts, it's more like a 100BASE-T card from the 90s.

I think it's very naive to assume that GPUs will continue to dominate the AI landscape.

replies(2): >>44369832 #>>44370305 #
almostgotcaught ◴[] No.44370305[source]
> Unfortunately, GPUs are old news now

...

> the recent Versal chips allow for AI compute-in-network capabilities that puts NVIDIA Bluefield's supposed programmability to shame

I'm always just like... who are you people? What is the profile of a person who just goes around proclaiming wild things as if they're completely established? I see this kind of comment on HN very frequently. You either work for Tenstorrent, or you're an influencer or a ZDNet presenter, or just ... because none of this is even remotely true.

Reminds me of

"My father would womanize; he would drink. He would make outrageous claims like he invented the question mark. Sometimes, he would accuse chestnuts of being lazy."

> I think it's very naive to assume that GPUs will continue to dominate the AI landscape

I'm just curious - how much of your portfolio is AMD and how much is NVDA and how much is GOOG?

replies(2): >>44370454 #>>44371227 #
timeinput ◴[] No.44370454[source]
Listen, I'm ~~not~~ all in on Ferrero Rocher, and chestnuts *are* lazy. Nowhere near as productive as hazelnuts.