You had me at "Browser compatibility".
replies(2):
I assume every browser will do the same as on-device models start becoming more useful.
You can use llm for this fairly easily:
uv tool install llm
# Set up your model however you like. For instance:
llm install llm-ollama
ollama pull mistral-small3.2
llm --model mistral-small3.2 --system "Translate to English, no other output" --save english
alias english="llm --template english"
english "Bonjour"
english "Hola"
english "Γειά σου"
english "你好"
cat some_file.txt | english
https://llm.datasette.io

Plus, mistral-small3.2 is a fairly large model; not every device can run it quickly, and it probably isn't the exact translation model Chrome is using.
https://github.com/facebookresearch/fairseq/tree/nllb/
If running locally is too difficult, you can use llm to access hosted models too.
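As a sketch of the hosted route (assuming you have an API key for a provider llm supports; the model name here is just illustrative):

```shell
# Store a key for a hosted provider (OpenAI shown here):
llm keys set openai
# Reuse the saved "english" template, overriding its model with a hosted one:
llm --model gpt-4o-mini --template english "Bonjour"
```

The template keeps the system prompt, so switching between local and hosted models is just a matter of the --model flag.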