
216 points | veggieroll | 1 comment
xnx ◴[] No.41860534[source]
Has anyone put together a good and regularly updated decision tree for what model to use in different circumstances (VRAM limitations, relative strengths, licensing, etc.)? Given the enormous zoo of models in circulation, there must be certain models that are totally obsolete.
replies(3): >>41860656 #>>41860757 #>>41861253 #
leetharris ◴[] No.41860656[source]
People keep making these, but they go out of date so fast that nobody keeps them current. If your definition of "great" changes in six months because a new model shatters your perception of "great," it's hard to rescore legacy models.

I'd say keeping up with the Reddit LocalLLama community is the "easiest" way, and it's by no means easy.

replies(2): >>41861614 #>>41868121 #
1. potatoman22 ◴[] No.41861614[source]
Someone should use an LLM to continuously maintain this decision tree. The tree itself will decide which LLM is used for maintainence.