MCP in LM Studio

(lmstudio.ai)
225 points by yags | 4 comments
minimaxir No.44380112
LM Studio has quickly become the best way to run local LLMs on an Apple Silicon Mac: no offense to vLLM/Ollama and other terminal-based approaches, but LLMs have many levers for tweaking output, and sometimes you need a UI to manage them. Now that LM Studio supports MLX models, it's also one of the most efficient options.
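
For instance, here's a minimal sketch of hitting LM Studio's local server from Python, assuming the default port (1234) and a model already loaded in the app. The endpoint is OpenAI-compatible, so the standard openai client works:

    # Minimal sketch: LM Studio's local server speaks the OpenAI API.
    # Assumes the server is running on the default port with a model loaded;
    # the api_key value is ignored by the local server.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    resp = client.chat.completions.create(
        model="local-model",  # placeholder; LM Studio serves the loaded model
        messages=[{"role": "user", "content": "Hello from the local server"}],
        temperature=0.7,  # one of the levers you'd otherwise tweak in the UI
    )
    print(resp.choices[0].message.content)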

I'm not bullish on MCP, but at the least this approach gives a good way to experiment with it for free.
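
If you want something to experiment against, a toy MCP server is only a few lines with the official `mcp` Python SDK. The server name and tool below are made up for illustration, not anything LM Studio ships; you'd point LM Studio's MCP config at this script:

    # Toy MCP server sketch using the official `mcp` Python SDK
    # (pip install mcp). The "demo-tools" name and add() tool are
    # illustrative only.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("demo-tools")

    @mcp.tool()
    def add(a: int, b: int) -> int:
        """Add two integers and return the sum."""
        return a + b

    if __name__ == "__main__":
        mcp.run()  # stdio transport by default, which local MCP hosts expect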

replies(4): >>44380220, >>44380533, >>44380699, >>44381188
1. nix0n No.44380220
LM Studio is also quite good on Windows with an Nvidia RTX card.
replies(1): >>44383574
2. boredemployee No.44383574
Care to elaborate? I have an RTX 4070 (12 GB VRAM) plus 64 GB of RAM, and I wonder what models I can run with it. Anything useful?
replies(1): >>44388014
3. nix0n No.44388014
LM Studio's model search is pretty good at showing what models will fit in your VRAM.

For my 16 GB of VRAM, those models do not include anything that's good at coding, even when I provide the API documentation via PDF upload (another thing that LM Studio makes easy).

So, not really, but LM Studio at least makes it easier to find that out.
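
As a rough back-of-the-envelope check (my own rule of thumb, not something LM Studio reports): quantized weights take about params * bits / 8 bytes, plus some headroom for KV cache and activations.

    # Rough rule of thumb, not LM Studio's actual sizing logic.
    def fits_in_vram(params_b: float, quant_bits: int, vram_gb: float,
                     overhead_gb: float = 1.5) -> bool:
        weights_gb = params_b * quant_bits / 8  # e.g. 7B at 4-bit ~ 3.5 GB
        return weights_gb + overhead_gb <= vram_gb

    print(fits_in_vram(13, 4, 16))  # True: ~8 GB total fits in 16 GB
    print(fits_in_vram(33, 4, 16))  # False: ~18 GB total does not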

replies(1): >>44389969
4. boredemployee No.44389969
ok, ty for the reply!