Side tangent: why is Ollama frowned upon by some people? I've never really gotten any explanation other than "you should run llama.cpp yourself".
replies(9):
You have to support Vulkan if you care about consumer hardware, and the Ollama devs clearly don't.
Ollama makes this trivial compared to llama.cpp, which for me is where a lot of its value comes from.