
326 points by threeturn | 3 comments

Dear Hackers, I’m interested in your real-world workflows for using open-source LLMs and open-source coding assistants on your laptop (not just cloud/enterprise SaaS). Specifically:

Which model(s) are you running (e.g., via Ollama, LM Studio, or others), and which open-source coding assistant/integration (for example, a VS Code plugin) are you using?

What laptop hardware do you have (CPU, GPU/NPU, memory, discrete or integrated GPU, OS), and how does it perform for your workflow?

What kinds of tasks do you use it for (code completion, refactoring, debugging, code review), and how reliable is it (what works well / where it falls short)?
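For concreteness, one common setup pairs an Ollama-served model with the open-source Continue extension for VS Code. A minimal sketch of Continue's `~/.continue/config.json` (the model tags shown are illustrative; any Ollama-pulled model works):

```json
{
  "models": [
    {
      "title": "Local Qwen Coder (chat)",
      "provider": "ollama",
      "model": "qwen2.5-coder:7b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Local autocomplete",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
```

A smaller model is typically used for tab autocomplete, since completion latency matters more than quality there.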

I'm conducting my own investigation, which I will be happy to share as well when it's done.

Thanks! Andrea.

scosman ◴[] No.45774360[source]
What are folks' motivations for using local coding models? Is it privacy, and there being no cloud host you trust?

I love local models for some use cases. However, for coding there is a big gap between the quality of models you can run at home and those you can't (at least on hardware I can afford), like GLM 4.6, Sonnet 4.5, Codex 5, and Qwen3 Coder 480B.

What makes local coding models compelling?

replies(8): >>45774482 #>>45774556 #>>45774772 #>>45775037 #>>45775069 #>>45775189 #>>45775715 #>>45776913 #
1. johnisgood ◴[] No.45774556[source]
What setup would you (or other people) recommend for a local model, and which model, if I want something like Claude Sonnet 4.5 (or actually, earlier versions, which seemed better)?

Anyone can chime in! I just want a working local model that is at least as good as Sonnet 4.5, or 3.x.

replies(1): >>45774604 #
2. scosman ◴[] No.45774604[source]
Nothing open is quite as good as Sonnet 4.5 and Codex 5. GLM 4.6, MiniMax M2, DeepSeek V3.2, Kimi K2, and Qwen3 Coder are close. But those are hundreds of billions of parameters, so running them locally is very, very expensive.
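To see why "hundreds of billions of parameters" rules out laptops, a rough back-of-envelope for weight memory alone (ignoring KV cache and activations, and using an illustrative ~355B parameter count in the ballpark of the large open models mentioned):

```python
def quantized_size_gb(params_billion: float, bits_per_weight: float = 4.0) -> float:
    """Approximate memory for model weights: params * bits / 8, in GB (1e9 bytes)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A ~355B-parameter model even at aggressive 4-bit quantization
# needs roughly 178 GB just to hold the weights.
print(round(quantized_size_gb(355), 1))  # → 177.5

# A 7B model at 4-bit fits in ~3.5 GB, which is why small models
# are what realistically runs on laptop GPUs or unified memory.
print(round(quantized_size_gb(7), 1))  # → 3.5
```

Mixture-of-experts models activate only a fraction of those parameters per token, which helps speed but not memory: all weights still have to be resident.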
replies(1): >>45775437 #
3. johnisgood ◴[] No.45775437[source]
That is unfortunate. I will never be able to afford such hardware that could run them. :(