
216 points | veggieroll | 1 comment
xnx No.41860534
Has anyone put together a good and regularly updated decision tree for what model to use in different circumstances (VRAM limitations, relative strengths, licensing, etc.)? Given the enormous zoo of models in circulation, there must be certain models that are totally obsolete.
replies(3): >>41860656 >>41860757 >>41861253
1. mark_l_watson No.41861253
I tend to choose a recent model available for Ollama and usually stick with one general-purpose local model for a month or so, then re-evaluate. An exception to sticking with a single local model at a time might be needing a larger context size.
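For the larger-context case, Ollama lets you raise the context window on an existing model rather than switching to a different one, via a Modelfile. A minimal sketch, assuming a base model such as `llama3.1` is already pulled (the model name and context size here are illustrative, not from the original comment):

```
# Modelfile — builds a variant of an installed model with a bigger context window
FROM llama3.1
# num_ctx is Ollama's context-window parameter, in tokens
PARAMETER num_ctx 8192
```

You would then create and run the variant with `ollama create llama3.1-bigctx -f Modelfile` followed by `ollama run llama3.1-bigctx`, keeping the original model untouched for everyday use.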