326 points | threeturn | 1 comment

Dear Hackers, I’m interested in your real-world workflows for using open-source LLMs and open-source coding assistants on your laptop (not just cloud/enterprise SaaS). Specifically:

Which model(s) are you running (e.g., via Ollama, LM Studio, or others), and which open-source coding assistant/integration (for example, a VS Code plugin) are you using?

What laptop hardware do you have (CPU, GPU/NPU, memory, discrete or integrated GPU, OS), and how does it perform for your workflow?

What kinds of tasks do you use it for (code completion, refactoring, debugging, code review), and how reliable is it (what works well / where does it fall short)?

I'm conducting my own investigation, which I'll be happy to share here once it's done.

Thanks! Andrea.

sho | No.45773247
Real-world workflows? I'm all for local LLMs and tinker with them all the time, but for productive coding no local LLM approaches the cloud, and it's not even close. There's no magic trick or combination of pieces; it just turns out that a quarter million dollars' worth of H200s is much, much better than anything a normal person could possibly deploy at home.

Give it time, we'll get there, but not anytime soon.

exac | No.45774111
I thought you would just use another computer in your house for these workflows?

My development flow takes a lot of RAM (and yes, I can run it minimally, editing in the terminal with language servers turned off), so I wouldn't consider running the local LLM on the same computer.
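This split is straightforward to wire up. Ollama listens on port 11434 by default and can be exposed on the LAN by starting it with `OLLAMA_HOST=0.0.0.0`; an assistant like Continue (the open-source VS Code plugin) can then point at the other machine. A minimal sketch of such a config, where the hostname `llm-box.local` and the model name are placeholders and the exact schema varies by Continue version:

```json
{
  "models": [
    {
      "title": "Coding model on the LAN box",
      "provider": "ollama",
      "model": "qwen2.5-coder:7b",
      "apiBase": "http://llm-box.local:11434"
    }
  ]
}
```

The laptop keeps its RAM for the editor and language servers, while the model weights and inference load live entirely on the second machine.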

sho | No.45780410
It's not about which of your computers you run it on; it's about the relative capability of any system you're likely to own vs. what a cloud provider can do. The difference is hilarious, probably 100x. Knowing that, unless you have good reasons (and experimenting/playing around IS a good reason), not many people would choose to actually base their everyday workflow on an all-local setup.

It's sort of like doing all your work on an 80386. Can it be made to work? Probably. Are you going to learn a whole lot making it work? Without a doubt! Are you going to be the fastest dev on the team? No.