Very excited to see these kinds of techniques, I think getting a 30B level reasoning model usable on consumer hardware is going to be a game changer, especially if it uses less power.
DeepSeek does reasoning on my home Linux PC, but I'm not sure how power-hungry it is.
What variant? I'd considered DeepSeek far too large for any consumer GPU.
Some people run DeepSeek on CPU. With 37B active params (of 671B total), it isn't fast, but it's passable.