
623 points magicalhippo | 4 comments
1. treprinum No.42620789
Nvidia just did what Intel/AMD should have done to threaten the CUDA ecosystem: release a "cheap" 128 GB local inference appliance/GPU. Well done, Nvidia, and it looks bleak for any Intel/AMD AI efforts in the future.
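To see why 128 GB of unified memory is the headline number here, a rough back-of-the-envelope sizing helps: model weights take roughly (parameter count × bytes per parameter), before KV cache and runtime overhead. The sketch below is a hedged approximation, not a statement about any specific product; the model sizes and bit widths are illustrative assumptions.

```python
# Rough sketch: which model sizes fit in a 128 GB local inference box?
# Rule of thumb: weight memory ≈ params × bits / 8. KV cache and runtime
# overhead add more on top, so treat these numbers as lower bounds.

def weight_gb(params_billions: float, bits: int) -> float:
    """Approximate weight memory in GB for a model with the given
    number of parameters (in billions) at the given bit width."""
    return params_billions * 1e9 * bits / 8 / 1e9

CAPACITY_GB = 128  # assumed appliance memory

for params_b, bits in [(70, 16), (70, 8), (70, 4), (200, 4)]:
    gb = weight_gb(params_b, bits)
    verdict = "fits" if gb <= CAPACITY_GB else "does not fit"
    print(f"{params_b}B params @ {bits}-bit ~= {gb:.0f} GB ({verdict})")
```

By this estimate, a 70B model does not fit at FP16 (~140 GB) but fits comfortably at 8-bit (~70 GB) or 4-bit (~35 GB), which is why this memory class matters for local inference.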
replies(2): >>42621142 #>>42621681 #
2. No.42621142 [deleted]
3. mft_ No.42621681
I think you nailed it. Any basic SWOT analysis of Nvidia's position would surely have to consider a move like this from a competitor: either Apple, who is already nibbling around the edges of this space, or AMD/Intel, who could (should?) be.

It’s obviously not guaranteed to go this route, but an LLM (or similar) on every desk and in every home is a plausible vision of the future.

replies(1): >>42621833 #
4. iszomer No.42621833
Nvidia also brought MediaTek into the spotlight.