623 points magicalhippo | 15 comments
1. ryao ◴[] No.42619962[source]
This looks like a successor to the Nvidia Jetson AGX Orin 64GB Developer Kit:

https://www.okdo.com/wp-content/uploads/2023/03/jetson-agx-o...

I wonder what the specifications are in terms of memory bandwidth and computational capability.

replies(3): >>42619979 #>>42620718 #>>42622001 #
2. kcb ◴[] No.42619979[source]
Hopefully the OS support isn't as awful as it usually is on the Jetson platforms. Unless things change, you'll get one or two major kernel updates ever, and you have to do bizarre stuff like install a six-year-old Ubuntu on your x86 PC just to run the utility that flashes the OS.
replies(1): >>42620030 #
3. ryao ◴[] No.42620030[source]
The community will likely put together instructions for installing mainstream Linux distributions on it.
replies(1): >>42620055 #
4. kcb ◴[] No.42620055{3}[source]
Doesn't really help, though, if it requires an Nvidia kernel.
replies(2): >>42620082 #>>42620257 #
5. ryao ◴[] No.42620082{4}[source]
The Linux kernel's license requires Nvidia to disclose their Linux kernel sources, and Nvidia has open sourced their kernel driver.

That said, you can probably boot a Debian or Gentoo system using the Nvidia-provided kernel if need be.

replies(1): >>42620677 #
6. snerbles ◴[] No.42620257{4}[source]
The official Linux kernel driver for Blackwell is GPL/MIT licensed: https://developer.nvidia.com/blog/nvidia-transitions-fully-t...
replies(1): >>42620462 #
7. sliken ◴[] No.42620462{5}[source]
Keep in mind that a kernel module != driver. The module just does initialization and passes data to/from the driver proper, which is closed source and lives in user space.
8. bionade24 ◴[] No.42620677{5}[source]
On the Jetsons it has always been the userspace that was closed source and tied to Nvidia's custom kernel. I haven't heard of people running JetPack on a different userland than the one provided by Nvidia. Companies/labs that update the OS don't care about CUDA: Nvidia contributes to Mesa support for the Jetsons, and some only need a bit more GPU power than a RasPi.
9. moffkalast ◴[] No.42620718[source]
The AGX Orin came with only 64GB of LPDDR5 and was priced at $5k, so this does seem like a bargain in comparison with 128GB of presumably HBM. But Nvidia never lowers their prices, so there's a caveat somewhere.
replies(1): >>42621056 #
10. fulafel ◴[] No.42621056[source]
The memory is LPDDR according to the specs graphic on the NV product page: https://www.nvidia.com/en-us/project-digits/

Anyone willing to guess how wide?

replies(1): >>42621211 #
11. moffkalast ◴[] No.42621211{3}[source]
I've seen some claims on Reddit that it can do 512 GB/s (not sure where they got that from), which would imply a bus several hundred bits wide with LPDDR5X, depending on the frequency.
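
A rough sanity check on what bus width 512 GB/s would need (just a sketch; the LPDDR5X per-pin rates below are assumptions on my part, not anything Nvidia has published for this box):

    # Bus width needed for a target bandwidth at a given per-pin data rate:
    # bandwidth (bits/s) = bus_width (bits) * per-pin rate (bits/s)
    def bus_width_bits(bandwidth_gb_s: float, data_rate_mt_s: float) -> float:
        return bandwidth_gb_s * 8e9 / (data_rate_mt_s * 1e6)

    for rate in (8533, 9600):  # assumed LPDDR5X speed grades
        print(f"LPDDR5X-{rate}: ~{bus_width_bits(512, rate):.0f} bits for 512 GB/s")
    # LPDDR5X-8533: ~480 bits for 512 GB/s
    # LPDDR5X-9600: ~427 bits for 512 GB/s

So hitting 512 GB/s at realistic LPDDR5X speeds would take something in the 430-512 bit range.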
replies(1): >>42622947 #
12. zamadatix ◴[] No.42622001[source]
The Jetson Orin Dev Kit is squarely aimed at being a dev kit for those using the Jetson module in production edge compute (robotic vision and the like). The only reason it's so well known in tech circles is "SBC syndrome": people get excited about what they think they could do with it, and then 95% end up in a drawer a year later because what it's actually good at is unrelated to why they bought it.

This is more accurately a descendant of the HPC variants, like the article talks about - intentionally meant to be a useful entry point for those wanting to do or run general AI work better than a random PC would manage anyway.

13. pella ◴[] No.42622947{4}[source]
probably:

"According to the Grace Blackwell's datasheet- Up to 480 gigabytes (GB) of LPDDR5X memory with up to 512GB/s of memory bandwidth. It also says it comes in a 120 gb config that does have the full fat 512 GB/s."

via https://www.reddit.com/r/LocalLLaMA/comments/1hvj1f4/comment...

"up to 512GB/s of memory bandwidth per Grace CPU"

https://resources.nvidia.com/en-us-data-center-overview/hpc-...

replies(2): >>42624265 #>>42625773 #
14. moffkalast ◴[] No.42624265{5}[source]
Yep, I think that's it. So that figure is referencing the GB200, which could have absolutely nothing in common with this low power version.
15. sliken ◴[] No.42625773{5}[source]
Keep in mind the "full" Grace is a completely different beast with Neoverse cores. This new GB10 uses different cores and might well have a different memory interface. I believe the "120 GB" config includes ECC overhead (which is stored inline on Nvidia GPUs), and Neoverse cores have various tweaks for larger configurations that are absent in the Cortex-X925.

I'd be happy to be wrong, but I don't see anything from Nvidia that implies a 512-bit wide memory interface on the Nvidia Project Digits.
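
FWIW the ECC arithmetic does line up if you assume the ~1/16 (6.25%) inline ECC carve-out Nvidia GPUs have used before (my assumption, not a published GB200/GB10 figure):

    # Sketch: assume inline ECC reserves 1/16 of the raw LPDDR5X capacity
    # (the ~6.25% carve-out seen on earlier Nvidia GPUs; an assumption here).
    for raw_gb in (512, 128):
        print(f"{raw_gb} GB raw -> {raw_gb * (1 - 1 / 16):.0f} GB usable")
    # 512 GB raw -> 480 GB usable
    # 128 GB raw -> 120 GB usable

Both capacities quoted above fall out of round power-of-two raw sizes under that assumption, which fits the inline-ECC reading.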