However, once that happens, you can run it (like any GGUF model) directly from Hugging Face![0]
[0] https://huggingface.co/docs/hub/en/ollama
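Per the Hugging Face docs linked above, Ollama can pull a GGUF model straight from a Hugging Face repo by its hf.co path. A minimal sketch (the repo name here is only an example; substitute the model you want):

```shell
# Run a GGUF model directly from a Hugging Face repository
# (bartowski/Llama-3.2-1B-Instruct-GGUF is an illustrative repo name)
ollama run hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF

# A specific quantization can be selected by appending it as a tag
ollama run hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF:Q8_0
```

Note this requires a local Ollama install and downloads the model on first run, so the initial invocation can take a while.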