
397 points | Anon84 | 1 comments
mark_l_watson (No.45126243)
I pay to use ProtonMail’s privacy-preserving Lumo LLM chat, which has good web_search tooling. Lumo is powered by Mistral models.

I use Lumo a lot, and the results are usually good enough. To be clear, though, I do fall back on gemini-cli and OpenAI’s Codex for coding a few times a week.

I live in the US, but if I were a European, I would be all in on supporting Mistral. Strengthen your own country and region.

g-mork (No.45127520)
I wonder what ProtonMail are doing internally. Mistral's public API endpoints route via CloudFlare, just like apparently every other hosted LLM out there, even the Chinese models I've checked.
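The routing claim is easy to spot-check by hand. A minimal sketch, assuming `curl` and `grep` are available; the header heuristic and the `/v1/models` path are illustrative assumptions, not anything stated in the thread:

```shell
# Hedged sketch: Cloudflare-fronted hosts typically answer with a
# "server: cloudflare" response header. This helper is illustrative,
# not anything ProtonMail or Mistral ship.
is_cloudflare() {
  # Match a header line like "server: cloudflare", case-insensitively
  printf '%s\n' "$1" | grep -qi '^server:[[:space:]]*cloudflare'
}

# Live check (needs network access; api.mistral.ai is the kind of
# endpoint the comment refers to):
#   curl -sI https://api.mistral.ai/v1/models | grep -i '^server:'

# Offline demonstration on a captured header line:
if is_cloudflare "server: cloudflare"; then
  echo "looks Cloudflare-fronted"
fi
```

A DNS lookup (`dig +short` on the hostname) gives a second signal, since Cloudflare edge IPs fall in published ranges, but the header check is the quickest one-liner.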
fauigerzigerk (No.45129285)
>I live in the US, but if I were a European, I would be all in on supporting Mistral. Strengthen your own country and region

That's a bit of a double-edged sword. My support goes as far as giving local offerings a try when I might not have done so otherwise. But at that point they need to be able to compete on merit.