
397 points | Anon84 | 1 comment | source
mark_l_watson | No.45126243
I pay to use ProtonMail’s privacy-preserving Lumo LLM chat, which has good web_search tooling. Lumo is powered by Mistral models.

I use Lumo a lot, and the results are usually good enough. To be clear, though, I do fall back on gemini-cli and OpenAI’s Codex systems for coding a few times a week.

I live in the US, but if I were a European, I would be all in on supporting Mistral. Strengthen your own country and region.

replies(3): >>45127520 #>>45131854 #>>45132924 #
g-mork | No.45127520
I wonder what ProtonMail are doing internally? Mistral's public API endpoints route via Cloudflare, just like apparently every other hosted LLM out there, including the Chinese models I've checked.
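For anyone wanting to reproduce this check: Cloudflare-fronted endpoints typically return a `cf-ray` response header and `server: cloudflare`. A minimal sketch of such a header check (the header names are standard Cloudflare behavior; the helper name and sample values here are illustrative, not from the thread):

```python
def looks_like_cloudflare(headers: dict) -> bool:
    """Heuristic: Cloudflare adds a cf-ray header and sets server: cloudflare."""
    lower = {k.lower(): v.lower() for k, v in headers.items()}
    return "cf-ray" in lower or lower.get("server") == "cloudflare"

# Example: feed in headers captured with `curl -sI` against an API endpoint.
sample = {"Server": "cloudflare", "CF-RAY": "8c1a2b3c4d5e6f70-FRA"}
print(looks_like_cloudflare(sample))            # → True
print(looks_like_cloudflare({"Server": "nginx"}))  # → False
```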
replies(3): >>45129285 #>>45133152 #>>45133251 #
ac29 | No.45133152
Mistral Small and Large are open-weight, so they are likely self-hosting?