
Docker Model Runner

(www.docker.com)
100 points kordlessagain | 4 comments
rockwotj No.43677187
Looks exactly like ollama but built into Docker desktop? Anyone know of any differences?
replies(4): >>43677209 #>>43677230 #>>43677457 #>>43678593 #
1. ammo1662 No.43677457
They are using OCI artifacts to package models, so you can use your own registry to host these models internally. However, I can't see any improvement compared with a simple FTP server. I don't think LLM models can adopt a hierarchical structure the way Docker images do, so they can't leverage the benefits of a layered file system, such as caching and reuse.
replies(2): >>43677720 #>>43680735 #
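For context on the caching point: an OCI artifact manifest references each blob by its content digest, so even if a model can't be split into many reusable layers, a registry or local store can still dedupe an unchanged weights blob across tags. A minimal sketch of that mechanism (the model-specific media types below are illustrative placeholders, not Docker's or ollama's actual ones):

```python
import hashlib
import json


def digest(blob: bytes) -> str:
    """Content-address a blob the way OCI registries do."""
    return "sha256:" + hashlib.sha256(blob).hexdigest()


def manifest(weights_blob: bytes, config_blob: bytes) -> dict:
    """Build a minimal OCI-style manifest referencing blobs by digest.

    The vnd.example.* media types are hypothetical; only the manifest
    mediaType itself is from the OCI image spec.
    """
    return {
        "schemaVersion": 2,
        "mediaType": "application/vnd.oci.image.manifest.v1+json",
        "config": {
            "mediaType": "application/vnd.example.model.config.v1+json",
            "digest": digest(config_blob),
            "size": len(config_blob),
        },
        "layers": [
            {
                "mediaType": "application/vnd.example.model.weights.v1+gguf",
                "digest": digest(weights_blob),
                "size": len(weights_blob),
            }
        ],
    }


weights = b"...model weights bytes..."
m1 = manifest(weights, json.dumps({"tag": "v1"}).encode())
m2 = manifest(weights, json.dumps({"tag": "v2"}).encode())

# Same weights => same layer digest in both manifests, so a store
# only needs one copy of the big blob; only the tiny config differs.
assert m1["layers"][0]["digest"] == m2["layers"][0]["digest"]
assert m1["config"]["digest"] != m2["config"]["digest"]
```

That digest-level dedup is the main win over a plain FTP server: two tags sharing identical weights cost one blob, plus integrity verification comes for free.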
2. remram No.43677720
I think ollama uses OCI too? At least it's trying to. https://github.com/ollama/ollama/issues/914#issuecomment-195...
replies(1): >>43681007 #
3. jesserwilliams No.43680735
It's not the only one using OCI to package models. There's a CNCF project called KitOps (https://kitops.org) that has been around for quite a bit longer. It solves some of the limitations of using Docker, one being that you don't have to pull the entire project when you want to work on it. Instead, you can pull just the dataset, tuning, model, etc.
4. hobofan No.43681007
Yes, ollama also uses OCI, but currently only works with unauthenticated registries.