
206 points arbayi | 2 comments
poly2it No.45141622
How open-source are these glasses really? Are all software components compilable from source, or do they just publish an SDK, Espressif-style?
replies(2): >>45142600, >>45143236
alex1115alex No.45143236
It's all open-source:

https://github.com/Mentra-Community/MentraOS

replies(1): >>45143371
grokx No.45143371
Not really. Despite the repo being named MentraOS, it seems to include only some mobile apps (that run either on a phone or on the glasses), some server code, and some SDKs. The Mentra glasses are likely running a fork of AOSP, which is not in this repo.
replies(1): >>45143591
SparkyMcUnicorn No.45143591
AOSP (or even a minimal fork) is way too heavy to be running on the glasses. It looks like the firmware is quite minimal and the "OS" is the app.

https://github.com/Mentra-Community/MentraOS/tree/main/mcu_c...

replies(3): >>45144063, >>45144286, >>45148079
alex1115alex No.45144063
Mentra Live runs AOSP, similar to the other AI glasses on the market (Ray-Ban, Xiaomi AI Glasses, RayNeo V3 AI Glasses, etc.). It's heavy, but it allows us to ship fast. You'll find this code in the `asg_client` folder.

We're also working on a pair of HUD glasses releasing in 2026 that use an nRF5340 MCU. The code for this is being developed in the `mcu_client` folder.

replies(2): >>45144406, >>45153358
ENadyr No.45153358
Please have an option for local processing. I would love to be able to use my locally running Gemma 3n model on my phone for low latency, and for the glasses to work without internet connectivity.
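
Something along these lines is what I'm imagining: a rough Kotlin sketch using Google's MediaPipe LLM Inference task to run a Gemma 3n bundle on the phone. The model path, token limit, and function name are placeholders of mine, not anything from MentraOS:

    import android.content.Context
    import com.google.mediapipe.tasks.genai.llminference.LlmInference

    // Rough sketch: answer a query from the glasses entirely on the phone by
    // running a locally stored Gemma 3n model through MediaPipe's LLM Inference
    // task. No network round-trip, so it keeps working offline with low latency.
    fun answerLocally(context: Context, prompt: String): String {
        val options = LlmInference.LlmInferenceOptions.builder()
            .setModelPath("/data/local/tmp/llm/gemma-3n-e2b-it.task") // placeholder path
            .setMaxTokens(256)
            .build()

        val llm = LlmInference.createFromOptions(context, options)
        try {
            return llm.generateResponse(prompt)
        } finally {
            llm.close() // release the model's memory when done
        }
    }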
replies(1): >>45154417
alex1115alex No.45154417
We're going to put out a Mentra Edge SDK in the next few months, but it comes with some downsides: using your phone as a compute device is a battery hog, and you can only connect one app to the glasses at a time.
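
To give a very rough idea of the shape of it, here's a purely illustrative Kotlin sketch; none of these names come from the actual Edge SDK, which isn't out yet:

    // Hypothetical sketch of on-phone ("edge") processing: the glasses stream
    // text context to the phone app, which answers locally instead of
    // forwarding it to the cloud.
    interface EdgeProcessor {
        // Receives text captured by the glasses, returns text to display.
        suspend fun process(transcript: String): String
    }

    // Plug in whatever local model you like (e.g. Gemma 3n via MediaPipe).
    class LocalLlmProcessor(
        private val runLocalModel: suspend (String) -> String
    ) : EdgeProcessor {
        override suspend fun process(transcript: String): String = runLocalModel(transcript)
    }

The phone has to keep the model resident and hold a live connection to the glasses the whole time, which is where the battery and single-app constraints come from.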