Reminder that Vision Pro has a dedicated R1 chip with a blistering 256GB/s of memory bandwidth (the main CPU "only" gets 153GB/s)! That's as much as quad-channel LPDDR5X Strix Halo!
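For a sanity check on that comparison, the Strix Halo figure falls out of basic bus arithmetic. A rough sketch, assuming the 256-bit LPDDR5X-8000 configuration that the high-end parts are usually quoted at:

```python
# Back-of-the-envelope memory bandwidth: bytes per transfer (bus width / 8)
# times transfer rate. Assumes 256-bit LPDDR5X-8000 for Strix Halo.
def bandwidth_gbps(bus_width_bits: int, transfer_rate_mts: int) -> float:
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * transfer_rate_mts * 1e6 / 1e9

print(bandwidth_gbps(256, 8000))  # ~256 GB/s, right in line with the R1 number
```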
It'll be interesting to see how Samsung & then others fare at this, and over time to see how much Google, Qualcomm, or other platform providers help versus leave device makers to fend for themselves on sensor fusion and other ultra-realtime tasks. Whether the Snapdragon XR2+ Gen 2 here can do enough, and whether the software can make decent use of that hardware, is so TBD for this new ecosystem. It's not at all clear who is leading the charge to make it all slick and smooth. My default assumption is that Qualcomm likely holds a big chunk of the stack, and sole proprietorship of the stack like that seems like a real threat to the long-term viability of XR as a technology: as the Valve Steam Deck so strongly exhibited, it's only through intense cross-stack ownership and close collaboration (in the Linux kernel, in that case) that we see genuinely good products emerge.
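For a sense of what "sensor fusion" means at its absolute simplest, here's a toy complementary filter blending a gyroscope with an accelerometer into one pitch estimate. This is purely illustrative and not what the XR2+ Gen 2 pipeline actually does; every name and number here is made up:

```python
import math

def complementary_filter(pitch_prev: float, gyro_rate: float,
                         accel_x: float, accel_z: float,
                         dt: float, alpha: float = 0.98) -> float:
    """Toy fusion step: blend gyro-integrated pitch (smooth but drifty)
    with accelerometer-derived pitch (noisy but drift-free)."""
    pitch_gyro = pitch_prev + gyro_rate * dt      # integrate angular rate
    pitch_accel = math.atan2(accel_x, accel_z)    # tilt from the gravity vector
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel

# One fused step: 0.1 rad/s of rotation over 5 ms, gravity mostly along z.
pitch = complementary_filter(0.0, 0.1, 0.05, 9.8, 0.005)
```

Real headset tracking fuses many more streams (multiple IMUs, world-facing cameras, depth) at much higher rates, which is exactly the ultra-realtime work in question.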
Sensors, from Samsung's specs page:
Two High-resolution Pass-through cameras
Six World-facing tracking cameras
Four Eye-tracking Cameras
Five Inertial Measurement Units (IMUs) [commentary: whoaaa, that's a lot]
One Depth sensor
One Flicker sensor
As an aside, this sort of makes me want a device that just does eye tracking. That there are four eye-tracking cameras here seems wild! I've mostly seen some pretty chill examples of webcam-based tracking; it'd be neat to see what kind of user interface we could build if we really could see where people are looking.

Also maybe worth reviewing what Android ARCore offers, as this defines so much of what we get here. I'd love to see more depth-based capture systems around in general: not just on the XR displays but on regular devices too, to build up a better library of depth-having media. Apple's had LiDAR since the iPhone 12 Pro (2020)! There's some ToF on Android phones but close to zero LiDAR. We also see tons of big fancy dual-sensor XR cameras out there, but AFAIK nothing for phones! Just adding a second stereoscopic camera on the back of phones would be so obvious, & would do so much to help the XR world. It feels like XR products are being left to stand all on their own, with no help from the rest of the mobile device ecosystem, and that feels like such an obvious & unworkable gap.
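To make "depth-having media" slightly more concrete, here's a minimal sketch of what a depth frame buys you: back-projecting a depth image into a 3D point cloud with a pinhole camera model. Plain numpy, not tied to ARCore's or anyone else's actual API, and the intrinsics are made-up numbers:

```python
import numpy as np

def depth_to_points(depth: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image (meters) into an (N, 3) point cloud
    using a simple pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

# Toy 4x4 depth frame, everything 1.5m away; intrinsics are hypothetical.
cloud = depth_to_points(np.full((4, 4), 1.5), fx=500.0, fy=500.0, cx=2.0, cy=2.0)
```

Once phones capture this kind of data routinely, the XR side gets a huge library of reconstructable scenes essentially for free.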