
282 points by elsewhen | 1 comment
miki123211 ◴[] No.41912058[source]
It feels like mainstream technology is slowly replacing most if not all accessibility devices, and I think that's a really good thing for those who need them!

This has already happened for blind people. We used to have color testers, specialized audiobook / ebook players, GPS devices, text scanning / OCR machines, devices for banknote recognition, barcode readers, talking scales, thermometers, blood pressure meters and so on, all as separate devices, all extremely expensive. Nobody really had or carried all of these at once; it was just too expensive and impractical, though most people had at least some.

Nowadays, while those devices still exist, all you really need is any smartphone (even a low-end Android will do, though iOS is much better for this use case IMO), a free screen reader, which both OSes include by default, and a couple of free / cheap apps. Things like talking scales can be replaced with accessories connected over Bluetooth that don't technically talk, but that expose the measurements to your smartphone screen reader.

replies(3): >>41912290 #>>41912836 #>>41914711 #
1. jillesvangurp ◴[] No.41912290[source]
Interesting perspective. I had never thought about this properly, but it seems that machine vision apps like Google Lens, paired with a good screen reader, can help blind people "see".

I did some experiments with the OpenAI API recently to see if I could make sense of photos by classifying and describing them. That worked surprisingly well for the absolutely minimal effort I put in (<30 minutes), and I've been meaning to follow up on it and turn it into a proper product feature in our app.
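
For anyone curious, a minimal sketch of what that kind of experiment looks like with the OpenAI Python SDK (the model name, prompt and file path here are placeholder assumptions, not what I actually used):

    import base64
    from openai import OpenAI  # pip install openai

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Load a local photo and base64-encode it so it can be sent inline.
    with open("photo.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    # Ask a vision-capable model to describe the photo for a blind user.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any vision-capable model would do
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Describe this photo for a blind person, "
                         "including any readable text."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )

    print(response.choices[0].message.content)

Pipe that text into the platform screen reader or any TTS engine and you already have a rough "describe what the camera sees" feature.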

Anyway, something simple hooked up to a camera shouldn't take that long to code, and there might be good enough locally running models for machine vision as well: reading signs and menus, describing what's in front of you, etc. I bet there is some low-hanging fruit here; visually impaired people who are a bit handy with programming could build genuinely useful apps on top of this.
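
As a rough illustration of the "locally running model" idea, something like the sketch below would already caption whatever the camera sees. BLIP via Hugging Face transformers plus OpenCV is just one possible combination; the specific model is an assumption on my part, not a recommendation:

    # Local-only sketch: grab one frame from the webcam and caption it.
    # Assumes: pip install transformers torch opencv-python pillow
    import cv2
    from PIL import Image
    from transformers import pipeline

    # BLIP is one of several freely available image-captioning models.
    captioner = pipeline("image-to-text",
                         model="Salesforce/blip-image-captioning-base")

    cap = cv2.VideoCapture(0)   # default camera
    ok, frame = cap.read()      # one BGR frame as a numpy array
    cap.release()

    if ok:
        # OpenCV gives BGR; the captioning pipeline expects an RGB image.
        image = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        result = captioner(image)
        print(result[0]["generated_text"])  # one-line scene description
    else:
        print("Could not read from the camera")

Feed that string to the OS screen reader or a TTS engine, run it in a loop, and you have the skeleton of a "what's in front of me" app that works entirely offline.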

Another aspect of using consumer tech like this is that it's normal. People wearing AirPods don't stand out as hearing impaired or special. Most hearing aids, on the other hand, are clearly recognizable as such, and I imagine some people don't like wearing them for that reason. They are generally kind of ugly; unlike e.g. glasses, there's no such thing as designer hearing aids. They are a necessary evil for people. Apple is being clever here by tapping into a market of aging but wealthy people with a taste for good stuff.