
283 points | walterbell | 1 comment
1. moffkalast | No.45770055
> Memory support is another highlight: the chip integrates a 128-bit LPDDR5X-9600 controller and will reportedly include 16 GB of onboard RAM, aligning with current trends in unified memory designs used in ARM SoCs. Additionally, the APU carries AMD’s fourth-generation AI engine, enabling on-device inference tasks

128-bit LPDDR5X-9600 works out to roughly 150 GB/s, which is about 50% more bandwidth than an Orin NX. If they can sell these things for under ~$500, that would be a pretty decent deal for edge inference. 16 GB is very tight for the use case though: in practice that's more like 15, and the OS and other overhead takes another two or three, leaving you with maybe 12 for the model. Hopefully there's a 32 GB variant eventually...
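
For reference, a quick back-of-the-envelope Python sketch of that math. The Orin NX figure of 102.4 GB/s and the ~3 GB OS overhead are assumptions (commonly cited spec and the comment's own estimate), not from the article:

    # Rough check of the bandwidth and usable-memory numbers above.
    bus_width_bits = 128          # LPDDR5X controller width
    transfer_rate_mts = 9600      # mega-transfers/s (LPDDR5X-9600)

    bandwidth_gbs = bus_width_bits / 8 * transfer_rate_mts / 1000
    print(f"Peak bandwidth: {bandwidth_gbs:.1f} GB/s")       # ~153.6 GB/s

    orin_nx_gbs = 102.4           # assumed Orin NX spec (128-bit LPDDR5-6400)
    print(f"vs Orin NX: {bandwidth_gbs / orin_nx_gbs - 1:.0%} more")  # ~50%

    total_gib = 16 * 1e9 / 2**30  # 16 GB advertised -> ~14.9 GiB
    os_overhead_gib = 3           # rough guess for OS + other processes
    print(f"Usable for inference: ~{total_gib - os_overhead_gib:.0f} GiB")  # ~12 GiB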