
246 points | doener | 1 comment
JKolios No.43691554
More diversity in the LLM space is always good. In my experience, though, speaking as a native speaker of one of the less-used European languages, Mistral's models already handle it pretty well.
replies(3): >>43691665 >>43691771 >>43695687
1. Etheryte No.43691665
As a native speaker of another small European language, I find that no state-of-the-art model comes anywhere close to not being laughably bad, so more work in this space is definitely welcome as far as I'm concerned.