
51 points | walterbell | 1 comment | source
walterbell ◴[] No.43113243[source]
Meta has Project Aria for research, https://news.ycombinator.com/item?id=43066927, but no public SDK. Devices dedicated to vision-impaired users are north of $2000. Hopefully Envision Companion (https://www.letsenvision.com/companion) software can be ported to $300 Meta smart glasses or the growing list of open glass hardware that is compatible with OSS AugmentOS.

This thread led me to Mentra's MIT-licensed https://github.com/AugmentOS-Community/AugmentOS & https://augmentos.org/ for Android and (soon) iOS:

> Smart glasses OS, with dozens of built-in apps. Users get AI assistant, notifications, translation, screen mirror, captions, and more. Devs get to write 1 app that runs on any pair of smart glasses.

Where "any" means this hardware compatibility list (HCL):

  Vuzix Z100 ($500)
  Mentra.glass Mach1 ($350) or Live ($220)
  Even Realities G1 ($600)
  (future) Meizu StarV

Apple iPhones have state-of-the-art hardware (lidar, UWB precise positioning) that could help millions of visually impaired humans, but iPhone hardware has been limited by amberware (software frozen with minimal updates). Apple poured billions into now-cancelled money pits like Apple Car and VisionOS, while teams failed forward into AI, smart glasses and humanoid robots. Meanwhile, Meta smart glasses are S-curving from 2M to 10M nodes of data acquisition, https://news.ycombinator.com/item?id=43088369

On paper, Apple Magnifier with Live Descriptions audio could justify the purchase of an iPhone Pro for dedicated single-app usage, https://support.apple.com/guide/iphone/live-descriptions-vis.... But while it works for short demos, the software is not reliable for continuous use. UWB AirTags and lidar 3D imaging could enable precise indoor navigation for vision-impaired users, but 5+ years of shipping hardware have not led to usable software workflows.

The economic tragedy is that 95% of the technology for helping vision/cognition-impaired humans could be repurposed for humanoid robots, with the R&D bonus that humans can provide more real-world feedback (RLHF!) than silent lab robots. Until Apple breaks vision stasis with new technical leadership, or Apple/Meta are regulated by the EU into unlocking reluctant open-glass-platform innovation, the only hackable option is open glass hardware, at risk of future BigTech sherlocking or acquisition.

replies(1): >>43113733 #
unsupp0rted ◴[] No.43113733[source]
You're disparaging these companies for making products people want, while writing angry posts because they aren't doing what you want.

Apple and Meta should use their amazing technology for accessibility. And maybe they are or will. But they owe me nothing.

If I don't want their products I don't have to buy them. If I don't want their stock I don't have to invest in it.

replies(3): >>43113873 #>>43113896 #>>43114649 #
georgemcbay ◴[] No.43114649[source]
> Disparaging these companies for making products people want while writing angry posts that these companies aren't doing what you want.

I'm not seeing the "angry posts" you are referencing here. Neither the HN post you're replying to nor the linked post come off as angry to me. Disappointed, perhaps.

And that aside... NOT disparaging these companies certainly isn't getting them to do the right thing. Informing others who might not otherwise be in a situation where they are aware of these deficiencies seems to me the most likely way to exert subtle pressure to get these companies to do the right thing eventually.