
S1: A $6 R1 competitor?

(timkellogg.me)
851 points by tkellogg | source
ttyprintk ◴[] No.42947690[source]
https://huggingface.co/simplescaling
replies(1): >>42948229 #
anentropic ◴[] No.42948229[source]
and: https://github.com/simplescaling/s1
replies(1): >>42948419 #
mettamage ◴[] No.42948419[source]
When you're only used to Ollama, how do you go about using this model?
replies(1): >>42949013 #
davely ◴[] No.42949013[source]
I think we need to wait for someone to convert it to GGUF format.

Once that happens, though, you can run it (and any other GGUF model hosted on Hugging Face) directly with Ollama.[0]

[0] https://huggingface.co/docs/hub/en/ollama
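For reference, the pattern from that doc is roughly this (placeholders only, since no s1 GGUF exists yet; the quantization tag is optional):

  ollama run hf.co/{username}/{repository}
  ollama run hf.co/{username}/{repository}:{quantization}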

replies(2): >>42949696 #>>42950032 #
mettamage ◴[] No.42949696[source]
So this?

https://huggingface.co/brittlewis12/s1-32B-GGUF

replies(2): >>42950875 #>>42963726 #
mettamage ◴[] No.42963726[source]
I ran it; so far it seems like a pretty good model, especially for one running locally.