
MCP in LM Studio

(lmstudio.ai)
240 points by yags | 1 comment
minimaxir No.44380112
LM Studio has quickly become the best way to run local LLMs on an Apple Silicon Mac: no offense to vllm/ollama and other terminal-based approaches, but LLMs have many levers for tweaking output and sometimes you need a UI to manage it. Now that LM Studio supports MLX models, it's one of the most efficient too.

I'm not bullish on MCP, but at the least this approach gives a good way to experiment with it for free.

replies(4): >>44380220 #>>44380533 #>>44380699 #>>44381188 #
nix0n No.44380220
LM Studio is quite good on Windows with Nvidia RTX also.
replies(1): >>44383574 #
boredemployee No.44383574
Care to elaborate? I have an RTX 4070 with 12 GB VRAM + 64 GB RAM, and I wonder what models I can run with it. Anything useful?
replies(2): >>44388014 #>>44405120 #
Eupolemos No.44405120
If you go to huggingface.co, you can tell it what hardware specs you have, and when you open a model page it will show you which variants of that model are likely to run well.

So if you go to this[0] random model, on the right there is a list of quantizations at different bit widths, and the ones you can run will be shown in green.

[0] https://huggingface.co/unsloth/Mistral-Small-3.1-24B-Instruc...
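A rough back-of-the-envelope check (my own sketch, not from the thread): weight memory is roughly parameter count times bits per weight divided by 8, plus headroom for the KV cache and activations. For a 24B model like the one linked above:

```python
def est_weight_gb(params_billion: float, bits_per_weight: float) -> float:
    """Rough quantized-weight footprint in decimal GB: params * bits / 8.

    Ignores KV cache and activation overhead, so leave ~1-2 GB headroom
    on top of whatever this returns.
    """
    return params_billion * bits_per_weight / 8

# A 24B model at ~4.5 bits/weight (a typical 4-bit quant average)
# needs roughly 13.5 GB just for weights -- too big to fit entirely
# in 12 GB of VRAM, though runtimes can offload layers to system RAM.
print(est_weight_gb(24, 4.5))  # -> 13.5
print(est_weight_gb(24, 3.0))  # -> 9.0, a ~3-bit quant would fit
```

By this estimate, only the more aggressive quantizations of a 24B model fit fully in 12 GB, which is the kind of cutoff the green highlighting on the model page reflects.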