
684 points by prettyblocks | 1 comment

I mean anything in the 0.5B-3B range that's available on Ollama (for example). Have you built any cool tooling that uses these models as part of your workflow?
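To make "tooling" concrete, here is a minimal sketch of the kind of glue code I mean, assuming a local Ollama server on its default port (11434) and some small pulled model (the model tag below is just an example):

    # Minimal sketch: call a small local model through Ollama's REST API.
    # Assumes an Ollama server on the default port and that you've pulled
    # some small model; "qwen2.5:0.5b" is only an example tag.
    import requests

    def ask_small_model(prompt: str, model: str = "qwen2.5:0.5b") -> str:
        resp = requests.post(
            "http://localhost:11434/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["response"]

    print(ask_small_model("Write a one-line commit message for: fix off-by-one in pagination"))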
azhenley ◴[] No.42785041[source]
Microsoft published a paper on their FLAME model (60M parameters) for Excel formula repair/completion, which outperformed much larger models (>100B parameters).

https://arxiv.org/abs/2301.13779

replies(4): >>42785270 #>>42785415 #>>42785673 #>>42788633 #
3abiton ◴[] No.42785673[source]
But I feel we're coming full circle. These small models are not generalists, thus not really LLMs, at least in terms of objective. Recently there has been a rise of "specialized" models that provide lots of value, but that's not why we were sold on LLMs.
replies(3): >>42785764 #>>42786287 #>>42786397 #
colechristensen ◴[] No.42785764[source]
But that's the thing, I don't need my ML model to be able to write me a sonnet about the history of beets, especially if I want to run it at home for specific tasks, like programming assistance.

I'm fine with and prefer specialist models in most cases.

replies(1): >>42786703 #
zeroCalories ◴[] No.42786703[source]
I would love a model that knows SQL really well so I don't need to remember all the small details of the language. Beyond that, I don't see why the transformer architecture can't be applied to any problem that needs to predict sequences.
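Roughly what I have in mind, as a sketch (same caveats as above: assumes a local Ollama server, and the model tag is just a placeholder):

    # Sketch: schema + question in, SQL out, via a small local model.
    # Assumes Ollama on its default port; the model tag is a placeholder.
    import requests

    SCHEMA = "CREATE TABLE orders (id INT, user_id INT, total NUMERIC, created_at DATE);"

    def nl_to_sql(question: str, model: str = "qwen2.5-coder:1.5b") -> str:
        prompt = (
            f"Schema:\n{SCHEMA}\n"
            f"Write one SQL query that answers: {question}\n"
            "Return only the SQL."
        )
        r = requests.post(
            "http://localhost:11434/api/generate",
            json={"model": model, "prompt": prompt, "stream": False},
            timeout=60,
        )
        r.raise_for_status()
        return r.json()["response"].strip()

    print(nl_to_sql("total revenue per month in 2024"))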
replies(1): >>42787370 #
dr_kiszonka ◴[] No.42787370[source]
The trick is to find such problems with enough training data and some market potential. I am terrible at it.