
623 points by magicalhippo | 1 comment
Abishek_Muthian:
I'm looking at my Jetson Nano in the corner, which is fulfilling its post-retirement role as a paperweight because Nvidia abandoned it after 4 years.

The Nvidia Jetson Nano, an SBC for "AI", debuted with an already-aging custom Ubuntu 18.04. When 18.04 went EOL, Nvidia abandoned it completely, with no further updates to its proprietary JetPack or drivers; without them, the entire machine-learning stack (CUDA, PyTorch, etc.) became useless.
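To make that concrete, a sanity check like the following (a sketch, assuming a PyTorch wheel built against the board's installed CUDA runtime) is exactly what starts failing once the vendor stops shipping driver updates:

    import torch  # assumes a build matching the board's CUDA runtime

    # Once the driver/JetPack stack is frozen, newer PyTorch builds no
    # longer match the installed CUDA runtime and this check fails.
    print(torch.__version__)
    if torch.cuda.is_available():
        print("CUDA OK:", torch.cuda.get_device_name(0))
    else:
        print("CUDA unavailable: the GPU is effectively a paperweight")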

I'll never buy an SBC from Nvidia unless all the software support is upstreamed to the Linux kernel.

tcdent:
If you're expecting this device to stay relevant for 4 years, you are not the target demographic.

Compute is evolving way too rapidly to be setting and forgetting anything at the moment.

mrybczyn:
Eh? By all indications, compute is now evolving SLOWER than ever. Moore's Law is dead, Dennard scaling is over, and the latest fab nodes are evolutionary rather than revolutionary.

This isn't the '80s, when compute doubled every 9 months, mostly on clock scaling.

sliken:
Indeed, generational improvements are at an all-time low. Most of the "revolutionary" AI and/or GPU improvements come from lower precision (fp32 -> fp16 -> fp8 -> fp4) or from adding ever more fake pixels and fake frames, and now, in the most recent iteration, multiple fake frames per computed frame.
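To sketch what that precision drop costs in accuracy (illustrative only: numpy has no fp8/fp4 dtypes, so a crude 16-level uniform quantizer stands in for fp4):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(10_000).astype(np.float32)

    # fp16 is a real numpy dtype; the round-trip shows the lost precision.
    err_fp16 = np.abs(x - x.astype(np.float16).astype(np.float32)).max()

    # Stand-in for fp4: a 16-level uniform quantizer over the data range
    # (an approximation, not the actual fp4 format).
    lo, hi = x.min(), x.max()
    q = np.round((x - lo) / (hi - lo) * 15)
    x4 = q / 15 * (hi - lo) + lo
    err_fp4 = np.abs(x - x4).max()

    print(f"max fp16 round-trip error: {err_fp16:.1e}")
    print(f"max 4-bit quantizer error: {err_fp4:.1e}")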

I believe Nvidia initially published some numbers for the 5000 series showing DLSS-off performance, which allowed a fair comparison to the previous generation (an uplift on the order of 25%), and then removed them.
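Back-of-the-envelope arithmetic shows how frame generation inflates the headline multiplier; only the ~25% raw uplift comes from above, the rest are assumed numbers:

    old_fps = 100                    # previous generation, DLSS off
    new_fps = old_fps * 1.25         # ~25% uplift in computed frames
    generated_per_computed = 3       # assumed multi-frame-gen setting

    displayed = new_fps * (1 + generated_per_computed)
    print(f"raw uplift:      {new_fps / old_fps:.2f}x")    # 1.25x
    print(f"marketing-style: {displayed / old_fps:.2f}x")  # 5.00x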

Thankfully, third-party benchmarks that use the same settings on old and new hardware should be out soon.