
229 points | modinfo | 2 comments
1. wonrax ◴[] No.40835203[source]
Can someone who specializes in applied machine learning explain how this is useful? In my view, general-purpose models are only worthwhile when they're large, since larger models are more capable and produce more accurate outputs for a given task. At on-device sizes, a model fine-tuned for a specific task will be more precise than a general-purpose one of the same size.
replies(1): >>40837026 #
2. qeternity ◴[] No.40837026[source]
I think you may be extrapolating a ChatGPT-esque UX onto what these on-device models will be used for. Think more along the lines of fuzzy regex, advanced autocomplete, generative UI, etc. It's unlikely anybody will be having a long-form conversation with Gemini Nano.
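
To make the "fuzzy regex" idea concrete, here is a minimal TypeScript sketch of the kind of call such a feature would make: one short prompt, one short answer, no conversation. The promptOnDeviceModel function below is a hypothetical stand-in for whatever prompt API the platform actually exposes; it is not a real Gemini Nano API.

    // "Fuzzy regex" with a small on-device model: extract structured data
    // from messy text using one short prompt instead of a long chat.
    // `promptOnDeviceModel` is a hypothetical stand-in for the platform's
    // real on-device prompt binding, which is not specified in this thread.
    async function promptOnDeviceModel(prompt: string): Promise<string> {
      // Placeholder: route the prompt to the local model here.
      throw new Error("wire this up to the device's local model API");
    }

    // Pull a due date out of free-form text that a hand-written regex
    // would struggle with ("next Tuesday", "the 3rd of March", ...).
    async function extractDueDate(note: string): Promise<string | null> {
      const prompt =
        "Extract the due date from the text below. Answer with only a " +
        "YYYY-MM-DD date, or the word NONE if there is no date.\n\n" +
        "Text: " + note;
      const answer = (await promptOnDeviceModel(prompt)).trim();
      // Treat anything that isn't a clean date as "no answer".
      return /^\d{4}-\d{2}-\d{2}$/.test(answer) ? answer : null;
    }

The point is the shape of the interaction: a single, bounded request that runs locally and returns a machine-checkable answer, rather than an open-ended chat where only a large model would hold up.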