
112 points favoboa | 1 comment | source
janpmz No.44431167
One could start with a large model for exploration during development, and then distill it down to a small model that covers the task's variety and fits on a USB drive. E.g. if I use a model for gardening, I could prune away knowledge about other topics.
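The distillation step described above is usually done by training the small "student" model to match the large "teacher" model's full output distribution, not just its top prediction. A minimal sketch of the standard distillation loss (temperature-softened KL divergence); the function names here are illustrative, not from any particular library:

```python
import math

def softmax(logits, temperature=1.0):
    # Soften the distribution with a temperature; higher T spreads
    # probability mass over more classes, exposing the teacher's
    # "dark knowledge" about near-miss answers.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions.
    # Minimizing this trains the student to reproduce the teacher's
    # behavior on the (narrower) data it is distilled on.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

The loss is zero when the student already matches the teacher, and positive otherwise; restricting the distillation data to one domain (gardening, say) is what lets the student stay small.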
1. dotancohen No.44432306
In what sense would you need an LLM while gardening? I'm imagining problem solving, like asking "what worm looks like a small horse hair?" But that would require the LLM to know what a horse hair is. In other words, not a distilled model, but rather a model that contains pretty much anything our gardener's imagination might make analogies out of.