
202 points | Jabrov | 1 comment
levifig No.44005555
FWIW, llama.cpp links to and fetches models from ollama (https://github.com/ggml-org/llama.cpp/blob/master/tools/run/...).
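
For anyone curious what "fetches models from ollama" looks like under the hood: ollama serves models from an OCI-style registry, so a client fetches a manifest and then downloads the GGUF layer it points at. Here's a rough sketch of that flow in Python, assuming the public registry.ollama.ai layout (the model name, tag, and media-type string are my assumptions, not taken from llama.cpp's code):

    # Sketch of pulling a GGUF model from the ollama registry (assumed OCI-style
    # layout of registry.ollama.ai); llama.cpp's actual code may differ in details.
    import json
    import urllib.request

    REGISTRY = "https://registry.ollama.ai/v2/library"

    def pull_ollama_model(name, tag="latest", out_path="model.gguf"):
        """Fetch the manifest for <name>:<tag> and download its model layer."""
        req = urllib.request.Request(
            f"{REGISTRY}/{name}/manifests/{tag}",
            headers={"Accept": "application/vnd.docker.distribution.manifest.v2+json"},
        )
        with urllib.request.urlopen(req) as resp:
            manifest = json.load(resp)

        # The GGUF weights are the layer carrying the ollama model media type.
        layer = next(l for l in manifest["layers"]
                     if l["mediaType"] == "application/vnd.ollama.image.model")

        blob_url = f"{REGISTRY}/{name}/blobs/{layer['digest']}"
        with urllib.request.urlopen(blob_url) as resp, open(out_path, "wb") as f:
            while chunk := resp.read(1 << 20):  # stream in 1 MiB chunks
                f.write(chunk)
        return out_path

    # pull_ollama_model("smollm", "135m")  # hypothetical model/tag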

This issue seems to be the typical case of someone being bothered on someone else's behalf: it implies there's no "recognition of source material" when there's actually quite a bit of symbiosis between the projects.

replies(5): >>44005635 #>>44006351 #>>44006452 #>>44006985 #>>44007804 #
diggan No.44006452
Well, according to that file llama.cpp supports fetching models from a bunch of different sources: Hugging Face, ModelScope, Ollama, and any HTTP or local source. Seems fair to say they've added support for whichever source you're most likely to find the model you're looking for on.
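
To make the "bunch of different sources" concrete: the dispatch boils down to matching on a scheme prefix and rewriting the reference into a download URL (or a local path). A small illustrative sketch of that idea; the prefixes and URL templates below are assumptions for illustration, not llama.cpp's exact behavior:

    # Illustrative sketch of multi-source model resolution: map a user-supplied
    # model reference to a concrete location based on its scheme prefix.
    # Prefixes and URL templates are assumptions, not llama.cpp's exact ones.
    from pathlib import Path

    def resolve_model_source(ref: str) -> str:
        if ref.startswith(("https://", "http://")):
            return ref                                # plain HTTP(S) URL
        if ref.startswith("hf://"):                   # Hugging Face repo/file
            return f"https://huggingface.co/{ref.removeprefix('hf://')}"
        if ref.startswith("ms://"):                   # ModelScope repo/file
            return f"https://modelscope.cn/models/{ref.removeprefix('ms://')}"
        if ref.startswith("ollama://"):               # ollama registry name[:tag]
            name, _, tag = ref.removeprefix("ollama://").partition(":")
            return f"https://registry.ollama.ai/v2/library/{name}/manifests/{tag or 'latest'}"
        return str(Path(ref).expanduser().resolve())  # fall back to a local file

    # resolve_model_source("hf://someuser/somemodel/model.Q4_K_M.gguf")  # hypothetical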

Not sure I'd say there is "symbiosis" between ModelScope and llama.cpp just because you can download models from there via llama.cpp, just like you wouldn't say there is symbiosis between LM Studio and Hugging Face, or, for an even more fun example, YouTube <> youtube-dl/yt-dlp.

replies(1): >>44008372 #
1. gopher_space No.44008372
Symbiosis states that a relationship exists. Subcategories of symbiosis state how useful that relationship is to either party, and they're determined by the observer rather than the organisms involved.