
MCP in LM Studio (lmstudio.ai)

225 points by yags | 1 comment
patates (No.44380448):
What models are you using on LM Studio for what task and with how much memory?

I have a 48 GB MacBook Pro, and Gemma 3 (one of the abliterated ones) fits my non-code use case perfectly: generating crime stories in which the reader tries to guess the killer.

For code, I still call out to Google's Gemini.
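For a local setup like this, LM Studio serves loaded models through an OpenAI-compatible HTTP API (by default on `localhost:1234`). A minimal sketch of driving a story-generation prompt against it; the model identifier `gemma-3-12b-it` here is a hypothetical example, and the actual name depends on which model you have loaded:

```python
import json
import urllib.request


def build_chat_request(prompt: str, model: str = "gemma-3-12b-it") -> dict:
    """Build an OpenAI-style chat-completion payload for LM Studio's local server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        # A higher temperature suits open-ended creative generation.
        "temperature": 0.9,
    }


def chat_locally(prompt: str, base: str = "http://localhost:1234/v1") -> str:
    """POST the request to the local LM Studio server and return the reply text."""
    req = urllib.request.Request(
        f"{base}/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint mirrors the OpenAI schema, any OpenAI client library can also be pointed at the local base URL instead of hand-rolling requests.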

robbru (No.44381684):
I've been using the Google Gemma 3 QAT models in 4B, 12B, and 27B with LM Studio on my M1 Max. https://huggingface.co/lmstudio-community/gemma-3-12B-it-qat...
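A quick way to sanity-check which of those QAT sizes fits in a given amount of unified memory is a back-of-the-envelope weight estimate: parameter count times bits per weight, plus some headroom for the KV cache and runtime buffers. The 20% overhead factor below is an assumption, not a measured figure:

```python
def quantized_size_gb(params_billion: float, bits: int = 4, overhead: float = 1.2) -> float:
    """Rough RAM footprint of a quantized model in GB: weight bytes
    (params * bits / 8) scaled by an assumed ~20% runtime overhead."""
    return params_billion * bits / 8 * overhead


# Rough footprints for the Gemma QAT sizes mentioned above, at 4-bit:
for size in (4, 12, 27):
    print(f"{size}B @ 4-bit ≈ {quantized_size_gb(size):.1f} GB")
```

By this estimate even the 27B QAT model (~16 GB) leaves room on a 48 GB machine, which matches the experience reported in the thread.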