
602 points | emrah | 1 comment
mark_l_watson ◴[] No.43745755[source]
Indeed!! I have swapped out qwen2.5 for gemma3:27b-it-qat using Ollama for routine work on my Mac with 32GB of memory.

gemma3:27b-it-qat with open-codex, running locally, is just amazingly useful, not only for Python dev but also for Haskell and Common Lisp.

I still like Gemini 2.5 Pro and o3 for brainstorming or working on difficult problems, but for routine work it (simply) makes me feel good to have everything open source/weights running on my own system.

When I bought my 32GB Mac a year ago, I didn't expect to end up this happy running gemma3:27b-it-qat with open-codex locally.
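
For anyone who wants to poke at the same setup, here is a minimal sketch of talking to a local Ollama instance serving gemma3:27b-it-qat over its REST API. It assumes the default localhost:11434 port and that the model tag has already been pulled (e.g. via `ollama pull gemma3:27b-it-qat`); the open-codex wiring itself is not shown.

    # Minimal sketch: chat with a locally running Ollama server.
    # Assumes default port 11434 and that gemma3:27b-it-qat is already pulled.
    import requests

    OLLAMA_URL = "http://localhost:11434/api/chat"
    MODEL = "gemma3:27b-it-qat"

    resp = requests.post(
        OLLAMA_URL,
        json={
            "model": MODEL,
            "messages": [
                {"role": "user", "content": "Write a Haskell function that reverses a list."}
            ],
            "stream": False,  # get one complete JSON response instead of a stream
        },
        timeout=300,
    )
    resp.raise_for_status()
    print(resp.json()["message"]["content"])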

replies(3): >>43750006 #>>43750021 #>>43750815 #
1. Tsarp ◴[] No.43750006[source]
What tokens/sec (tps) are you hitting? And did you have to change the KV cache size?
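
A rough way to check both of these yourself, again assuming the default Ollama endpoint on localhost:11434 with the model already pulled: the non-streaming /api/generate response reports eval_count and eval_duration (in nanoseconds), which gives tokens/sec, and the context length, which drives the KV cache size and memory use, can be raised per request via the num_ctx option.

    # Sketch: measure generation speed and bump the context (KV cache) size.
    # Assumes Ollama on the default localhost:11434 with gemma3:27b-it-qat pulled.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "gemma3:27b-it-qat",
            "prompt": "Explain tail-call optimization in two sentences.",
            "stream": False,
            "options": {"num_ctx": 8192},  # larger context -> larger KV cache -> more RAM
        },
        timeout=600,
    )
    resp.raise_for_status()
    data = resp.json()

    # eval_duration is reported in nanoseconds
    tps = data["eval_count"] / (data["eval_duration"] / 1e9)
    print(f"generated {data['eval_count']} tokens at {tps:.1f} tok/s")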