577 points simonw | 4 comments
1. ygritte No.44731271
Can you host that model locally with Ollama?
replies(1): >>44731333 #
2. simonw No.44731333
I haven't seen a GGUF for it yet, I imagine one will show up on Hugging Face soon which will probably work with Ollama.
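For context, the usual way to run a community GGUF with Ollama is to point a Modelfile at the downloaded file and create a local model from it. This is a sketch with a placeholder filename and model name, not a specific release:

```
# Modelfile — the .gguf filename below is a placeholder for whatever
# quantized file eventually appears on Hugging Face
FROM ./model-q4_k_m.gguf
```

Then build and run it with the standard Ollama commands:

```shell
ollama create my-local-model -f Modelfile
ollama run my-local-model
```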
replies(1): >>44731882 #
3. pyman No.44731882
Do you think local LLMs combined with P2P networks could become a thing? Imagine people adding datasets to an open model the same way they add blocks to a blockchain (which is around 500GB in size).

It could help decentralise power and reduce our dependency on the big players.

replies(1): >>44733750 #
4. simonw No.44733750
There have been ambitions to do that kind of thing with LoRA - see the leaked "no moat" Google memo from a couple of years ago for one example: https://simonwillison.net/2023/May/4/no-moat/
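The reason LoRA fits this community-sharing idea is that an adapter is just a small low-rank delta to the frozen base weights, so people can distribute megabytes instead of the full model. A minimal sketch of the arithmetic (plain Python, tiny illustrative matrices; `alpha` is the usual scaling factor):

```python
def matmul(X, Y):
    """Naive matrix multiply for small illustrative matrices."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def apply_lora(W, A, B, alpha=1.0):
    """Merge a LoRA adapter: the model effectively uses W + alpha * (B @ A).

    W is the frozen base weight; only the small factors A and B are
    trained and shared, which is what makes distribution cheap.
    """
    delta = matmul(B, A)
    return [[w + alpha * d for w, d in zip(w_row, d_row)]
            for w_row, d_row in zip(W, delta)]

# A 2x2 base weight with a rank-1 adapter: B is 2x1, A is 1x2,
# so the adapter holds 4 numbers instead of W's full 4 -- at real
# model sizes the savings are orders of magnitude.
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[1.0], [2.0]]
A = [[0.5, 0.5]]
merged = apply_lora(W, A, B)
print(merged)  # [[1.5, 0.5], [1.0, 2.0]]
```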

It hasn't really happened though. I suspect that's because it turns out techniques like RAG or tool calling are massively easier and more effective than trying to teach models new information through shared model weights.
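The contrast above can be made concrete: RAG adds information at query time by retrieving relevant text and putting it in the prompt, with no weight changes at all. A toy sketch (keyword-overlap scoring stands in for the embedding search a real system would use; the documents and query are made up):

```python
def retrieve(query, documents, k=1):
    """Rank documents by naive keyword overlap with the query."""
    q_words = set(query.lower().split())
    return sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )[:k]

def build_prompt(query, documents):
    """Stuff the retrieved context into the prompt sent to the model."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "GGUF is a file format for quantized model weights.",
    "Blockchains replicate an append-only ledger across peers.",
]
prompt = build_prompt("What format is GGUF?", docs)
print(prompt)
```

Updating the model's knowledge is just editing `docs`, which is why this beats coordinating shared weight updates across a P2P network.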