
S1: A $6 R1 competitor?

(timkellogg.me)
851 points tkellogg (timkellogg.me)
ttyprintk No.42947690
https://huggingface.co/simplescaling
replies(1): >>42948229
anentropic No.42948229
and: https://github.com/simplescaling/s1
replies(1): >>42948419
mettamage No.42948419
When you're only used to ollama, how do you go about using this model?
replies(1): >>42949013
davely No.42949013
I think we need to wait for someone to convert it into a GGUF file format.

However, once that happens, you can run it (and any GGUF model) from Hugging Face![0]

[0] https://huggingface.co/docs/hub/en/ollama

replies(2): >>42949696 >>42950032
fl0id No.42950032
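[For reference, the Hugging Face doc linked above lets ollama pull GGUF repos directly. A minimal sketch, assuming a GGUF conversion of s1 eventually gets published — the repo name below is a placeholder, not a real upload:]

```shell
# Pull and run a GGUF model straight from Hugging Face
# (<username>/s1-GGUF is hypothetical; substitute a real GGUF repo)
ollama run hf.co/<username>/s1-GGUF

# A specific quantization can be selected with a tag, e.g. 4-bit:
ollama run hf.co/<username>/s1-GGUF:Q4_K_M
```

[This works today for any published GGUF repo on the Hub, per the doc at [0].]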
You can load the safetensors with ollama; you just have to provide a Modelfile, or wait for someone to do it. In theory it will also quantize it for you, since I guess most people cannot load a 129 GB model...
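[The Modelfile route described above looks roughly like this — a sketch based on ollama's model-import support; the local path is an assumption:]

```
# Modelfile (hypothetical): point FROM at a local safetensors checkout of s1
FROM /path/to/simplescaling/s1
```

[Then `ollama create s1 -f Modelfile` imports it, and passing `--quantize q4_K_M` to `ollama create` quantizes during import so the model fits in far less memory.]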