
343 points by LorenDB | 1 comment
tommica ◴[] No.44002018[source]
Side tangent: why is Ollama frowned upon by some people? I've never really gotten any explanation beyond "you should run llama.cpp yourself".
replies(9): >>44002029 #>>44002150 #>>44002166 #>>44002486 #>>44002513 #>>44002621 #>>44004218 #>>44005337 #>>44006200 #
1. speedgoose ◴[] No.44002486[source]
To me, Ollama is a bit like the Docker of LLMs. The user experience is inspired by Docker, and the Modelfile syntax is likewise inspired by the Dockerfile syntax. [0]
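
For illustration, here is a minimal Modelfile sketch along the lines of the documentation linked at [0]; the base model name and parameter values are just example choices:

    # Build a custom model on top of a base model pulled from the registry
    FROM llama3.2
    # Sampling parameter, set at build time rather than per request
    PARAMETER temperature 0.7
    # System prompt baked into the resulting model
    SYSTEM "You are a concise technical assistant."

You then build and run it much like a container image, with something like `ollama create mymodel -f Modelfile` followed by `ollama run mymodel`.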

In the early days of Docker, we had the Docker vs. LXC debate. At the time, Docker was mostly a wrapper over LXC, and people dismissed its great user-experience improvements.

I agree, however, that the long-standing lack of acknowledgement of llama.cpp was a problem. They do credit the project now.

[0]: https://github.com/ollama/ollama/blob/main/docs/modelfile.md