
509 points by nullpxl | 1 comment | source

Hi! Recently, smart glasses with cameras like the Meta Ray-Bans seem to be getting more popular, as does some people's desire to remove or cover up the recording-indicator LED. I wanted to see if there's a way to detect when people are recording with these types of glasses, so a little while ago I started working on this project. I've hit a bit of a wall, though, so I'm very much open to ideas!

I've written a bunch more at the link (photos are there too), but essentially this uses two fingerprinting approaches:

- retro-reflectivity of the camera sensor, by looking at IR reflections (mixed results here)
- wireless traffic (primarily BLE; also looking into Bluetooth Classic and WiFi)
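The retro-reflectivity approach boils down to frame differencing: fire an IR illuminator, and a camera lens pointed at you bounces much of that light straight back, showing up as a bright glint that isn't there with the illuminator off. A minimal sketch of that comparison, assuming you can capture paired 8-bit grayscale frames (the threshold value here is an arbitrary placeholder, not something from the project):

```python
import numpy as np

def find_retroreflections(frame_ir_on: np.ndarray,
                          frame_ir_off: np.ndarray,
                          threshold: int = 60) -> np.ndarray:
    """Return (row, col) coordinates of pixels that brighten sharply
    when the IR illuminator fires -- candidate camera-lens glints.

    Both frames are 8-bit grayscale arrays of the same shape.
    """
    # Subtract in a wider dtype so negative differences don't wrap around.
    diff = frame_ir_on.astype(np.int16) - frame_ir_off.astype(np.int16)
    # Retro-reflective glints are small and far brighter than ambient change.
    mask = diff > threshold
    return np.argwhere(mask)

# Synthetic example: a dark scene with one bright glint at (5, 5).
off = np.full((10, 10), 20, dtype=np.uint8)
on = off.copy()
on[5, 5] = 200  # the "glint" appears only when the illuminator is on
print(find_retroreflections(on, off))  # -> [[5 5]]
```

In practice the hard part is the "mixed results": sunlight, glasses worn by non-recording people, and other shiny surfaces all produce IR returns, so the differencing alone isn't a reliable fingerprint.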

For the latter, I'm currently just using an ESP32, and I can consistently detect when the Meta Ray-Bans are 1) pairing, 2) first powered on, and 3) (less consistently) taken out of the charging case. When the device detects something, it plays a little jingle next to your ear.
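Detections like these typically work by matching fields in the BLE advertising payload, such as the 16-bit company ID inside a manufacturer-specific AD structure. A sketch of that parsing, assuming a raw advertising payload as input (the company ID `0x1234` below is a made-up example, not Meta's actual assigned ID):

```python
def parse_ad_structures(payload: bytes):
    """Split a raw BLE advertising payload into (ad_type, data) pairs.
    Each AD structure is: 1 length byte, 1 type byte, length-1 data bytes."""
    i = 0
    out = []
    while i < len(payload):
        length = payload[i]
        if length == 0:
            break  # zero length terminates the payload early
        ad_type = payload[i + 1]
        out.append((ad_type, payload[i + 2 : i + 1 + length]))
        i += 1 + length
    return out

def manufacturer_id(payload: bytes):
    """Return the 16-bit Bluetooth SIG company ID from a manufacturer-
    specific AD structure (type 0xFF), or None if absent."""
    for ad_type, data in parse_ad_structures(payload):
        if ad_type == 0xFF and len(data) >= 2:
            return int.from_bytes(data[:2], "little")  # little-endian per spec
    return None

# Example payload: a Flags structure, then manufacturer data with a
# hypothetical company ID of 0x1234.
adv = bytes([0x02, 0x01, 0x06,                     # Flags AD structure
             0x05, 0xFF, 0x34, 0x12, 0xAA, 0xBB])  # manufacturer data
print(hex(manufacturer_id(adv)))  # -> 0x1234
```

The catch is that advertising mostly happens at pairing/boot; once the glasses are connected to the phone, there's far less broadcast traffic to match against, which is exactly the wall described below.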

Ideally I want to be able to detect them while they're in use, not just at boot. I've come across the nRF52840, which seems like it can follow directed BLE traffic beyond the initial broadcast, but from my understanding it would still need to catch the initial CONNECT_REQ event regardless. On the Bluetooth Classic side of things, all the hardware looks really expensive! Any ideas are appreciated. Thanks!
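The reason CONNECT_REQ matters: it carries the hop increment and channel map that determine which of the 37 data channels each subsequent packet lands on. A sniffer that misses it can't reconstruct the hop sequence. A sketch of one step of BLE Channel Selection Algorithm #1 (the hop value and channel map here are illustrative):

```python
def next_channel(last_unmapped: int, hop: int, channel_map: list):
    """One step of BLE Channel Selection Algorithm #1.

    `hop` (5-16) and `channel_map` (the used data channels, 0-36) both
    come from the CONNECT_REQ PDU -- without them the hop sequence is
    unknown. Returns (new_unmapped_channel, actual_channel_used).
    """
    unmapped = (last_unmapped + hop) % 37
    if unmapped in channel_map:
        return unmapped, unmapped
    # Remap onto a used channel when the unmapped one is excluded.
    remap_index = unmapped % len(channel_map)
    return unmapped, channel_map[remap_index]

# With hop=7 and all 37 data channels in use, the sequence is simply
# multiples of 7 modulo 37.
used = list(range(37))
ch = 0
hops = []
for _ in range(5):
    ch, actual = next_channel(ch, 7, used)
    hops.append(actual)
print(hops)  # -> [7, 14, 21, 28, 35]
```

One common workaround (used by tools like Wireshark-compatible BLE sniffers) is to force a fresh connection, e.g. by jamming or waiting for a reconnect, so the CONNECT_REQ can be captured; whether that's practical for always-on detection is an open question.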

9dev ◴[] No.46076725[source]
Is anyone working on smart glasses for blind people yet? Something with blackened lenses, obviously, that uses image recognition to translate visual input into text, delivered to the wearer as (headphone) audio.

That would allow for urgent warnings (approaching a street, walking towards obstacle [say, an electric scooter or a fence]), scene descriptions on request, or help finding things in the view field. There's probably a lot more you could do with this to help improve quality of life for fully blind people.

replies(7): >>46076892 #>>46076964 #>>46077061 #>>46077350 #>>46079607 #>>46079733 #>>46081469 #
1. aprilnya ◴[] No.46077350[source]
I’ve heard stories of people using the Meta smart glasses to help with reduced vision, e.g. asking the LLM assistant what they’re looking at, asking it to read a label, etc. The LLM assistant can see the camera feed, so it is capable of doing that.

However things like the urgent warnings you mentioned don’t exist yet.

Hearing about the way people with poor vision use these glasses kind of changed my viewpoint on them, to be honest. For the average person it might seem useless to be able to ask an LLM what you’re looking at, but from an accessibility standpoint it seems like a really good idea.