
623 points magicalhippo | 2 comments
treprinum ◴[] No.42620789[source]
Nvidia just did what Intel/AMD should have done to threaten the CUDA ecosystem: release a "cheap" 128GB local inference appliance/GPU. Well done Nvidia, and it looks bleak for any Intel/AMD AI efforts in the future.
replies(2): >>42621142 #>>42621681 #
1. mft_ ◴[] No.42621681[source]
I think you nailed it. Any basic SWOT analysis of Nvidia’s position would surely have to consider something like this from a competitor - either Apple, who is already nibbling around the edges of this space, or AMD/Intel, who could (and arguably should) be.

It’s obviously not guaranteed to go this route, but an LLM (or similar) on every desk and in every home is a plausible vision of the future.

replies(1): >>42621833 #
2. iszomer ◴[] No.42621833[source]
Nvidia also brought MediaTek into the spotlight.