
602 points emrah | 6 comments
1. jarbus ◴[] No.43743656[source]
Very excited to see these kinds of techniques. I think getting a 30B-level reasoning model usable on consumer hardware is going to be a game changer, especially if it uses less power.
replies(1): >>43743674 #
2. apples_oranges ◴[] No.43743674[source]
Deepseek does reasoning on my home Linux PC, but I'm not sure how power hungry it is.
replies(1): >>43743696 #
3. gcr ◴[] No.43743696[source]
What variant? I'd considered DeepSeek far too large for any consumer GPU.
replies(1): >>43743721 #
4. scosman ◴[] No.43743721{3}[source]
Some people run Deepseek on CPU. 37B active params - it isn't fast, but it's passable.
replies(1): >>43743842 #
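A rough way to see why CPU inference of a big MoE model is "passable": per-token decode is bound by memory bandwidth over the *active* parameters, not the full model. A back-of-envelope sketch, where the 37B active-param figure comes from the thread but the quantization level and bandwidth number are assumptions for illustration:

```python
# Back-of-envelope decode speed for a MoE model on CPU.
# Assumption: each token requires one full read of the active weights,
# so throughput ~ memory bandwidth / bytes of active weights.

def est_tokens_per_sec(active_params: float, bytes_per_param: float,
                       mem_bandwidth_gbs: float) -> float:
    """Upper-bound tokens/sec from a bandwidth-only model."""
    bytes_per_token = active_params * bytes_per_param
    return mem_bandwidth_gbs * 1e9 / bytes_per_token

# Assumed: 37e9 active params, 4-bit quantization (0.5 bytes/param),
# ~250 GB/s for a multi-channel workstation memory setup.
print(round(est_tokens_per_sec(37e9, 0.5, 250), 1))  # ~13.5 tok/s ceiling
```

Real throughput lands well below this ceiling (attention, KV-cache reads, and imperfect bandwidth utilization all cost), which matches the "not fast but passable" experience.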
5. danielbln ◴[] No.43743842{4}[source]
Actual deepseek or some qwen/llama reasoning fine-tune?
replies(1): >>43744550 #
6. scosman ◴[] No.43744550{5}[source]
Actual Deepseek. 500 GB of memory and a Threadripper works. Not a standard PC spec, but a common-ish home-brew setup for single-user Deepseek.
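The 500 GB figure checks out against the model's total size. A quick sketch, assuming the full DeepSeek-V3/R1 weight count of roughly 671B parameters (the quantization levels shown are illustrative):

```python
# Rough weight-only memory footprint of a 671B-parameter model
# at common quantization levels (KV cache and overhead excluded).

def weights_gb(total_params: float, bits_per_param: float) -> float:
    return total_params * bits_per_param / 8 / 1e9

for bits in (16, 8, 4):
    print(f"{bits}-bit: {weights_gb(671e9, bits):.0f} GB")
```

At 4-bit that's ~335 GB of weights alone, so 500 GB of RAM leaves headroom for KV cache and the OS; 16-bit weights (~1.3 TB) would not fit.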