their huge customers will be able to produce ASICs that will be faster and cheaper to operate than their GPUs
Jensen has to be the luckiest man in the world: first crypto, now "AI"
Are we sure this will be the case? Perhaps the GPU is already the sweet spot for hardware that trains/runs language models, especially given the years of head start Nvidia has.
BTW, stock price is not everything; Cisco survived, grew, and is the backbone of the internet today.
Why? NVIDIA is better positioned to produce faster and more efficient ML ASICs than any of their huge customers (except possibly Google). And on top of that, the huge library of CUDA code that will run out of the box on NVIDIA hardware is a big advantage.
Arguably, this shift has already happened. Modern NVIDIA datacenter GPUs, like the H100, bear only a passing resemblance to a traditional graphics GPU -- most of the silicon is dedicated to accelerating ML workloads.
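To make that concrete (a minimal sketch, not from the thread): the ML-dedicated silicon is largely the Tensor Cores, which CUDA exposes through the wmma API. The kernel below multiplies one 16x16 half-precision tile per warp with a single warp-wide matrix-multiply-accumulate; the name tile_matmul and the fixed 16x16 tile size are just illustrative choices.

    // Minimal Tensor Core example: D = A * B for one 16x16x16 tile per warp.
    #include <cuda_fp16.h>
    #include <mma.h>
    using namespace nvcuda;

    __global__ void tile_matmul(const half *A, const half *B, float *D) {
        // Per-warp register tiles ("fragments") for the operands and accumulator.
        wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
        wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> b_frag;
        wmma::fragment<wmma::accumulator, 16, 16, 16, float> acc_frag;

        wmma::fill_fragment(acc_frag, 0.0f);     // zero the accumulator tile
        wmma::load_matrix_sync(a_frag, A, 16);   // load A tile (leading dim 16)
        wmma::load_matrix_sync(b_frag, B, 16);   // load B tile
        wmma::mma_sync(acc_frag, a_frag, b_frag, acc_frag);  // Tensor Core MMA
        wmma::store_matrix_sync(D, acc_frag, 16, wmma::mem_row_major);
    }

Launched as tile_matmul<<<1, 32>>>(A, B, D) so a full warp cooperates on the tile, and compiled with nvcc for sm_70 or newer. The point is that this matrix hardware, not rasterization, is where the die area goes.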
As one possibility, I can see them transforming from a GPU-based corporation into a parent company for many fully or partially owned "subsidiaries". They would still manufacture chips to stay "vertically integrated", but that becomes the bread-and-butter enabler rather than the main story (much like Google's TPUs). As their margins go down, the value accrues to what they own (the business units/product areas).