https://ollama.com/library/gemma3
> support for over 140 languages
You can use llm for this fairly easily:
uv tool install llm
# Set up your model however you like. For instance:
llm install llm-ollama
ollama pull mistral-small3.2
llm --model mistral-small3.2 --system "Translate to English, no other output" --save english
alias english="llm --template english"
english "Bonjour"
english "Hola"
english "Γειά σου"
english "你好"
cat some_file.txt | english
https://llm.datasette.io

Plus, mistral-small3.2 has a lot of parameters; not every device can run it quickly, and it probably isn't the exact translation model Chrome uses.
https://github.com/facebookresearch/fairseq/tree/nllb/
If running locally is too difficult, you can use llm to access hosted models too.
Not the easiest, but easy enough (requires building).
I used these two projects to build an on-device translator for Android.
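As a rough sketch of what calling one of those NLLB checkpoints looks like with the Hugging Face transformers library (the pipeline call and the FLORES-200 language codes follow the model card for [0]; the first run downloads the weights, and the choice of source/target languages here is just an example):

```python
# Sketch: translating with facebook/nllb-200-distilled-600M via transformers.
# Assumes `pip install transformers torch`. NLLB uses FLORES-200 language
# codes such as fra_Latn / eng_Latn rather than two-letter ISO codes.
from transformers import pipeline

translator = pipeline(
    "translation",
    model="facebook/nllb-200-distilled-600M",
    src_lang="fra_Latn",  # source: French
    tgt_lang="eng_Latn",  # target: English
)

result = translator("Bonjour tout le monde")
print(result[0]["translation_text"])
```

The same pattern works for the other checkpoints ([1], [2]) by swapping the model name and language codes.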
You could also look into Argos Translate, or just use the same models as Firefox through kotki [4].
[0] https://huggingface.co/facebook/nllb-200-distilled-600M

[1] https://huggingface.co/facebook/m2m100_418M

[2] https://huggingface.co/google/madlad400-3b-mt

[3] https://huggingface.co/models?other=base_model:quantized:goo...

[4] https://github.com/kroketio/kotki