
MCP in LM Studio

(lmstudio.ai)
225 points | yags
minimaxir No.44380112
LM Studio has quickly become the best way to run local LLMs on an Apple Silicon Mac: no offense to vllm/ollama and other terminal-based approaches, but LLMs have many levers for tweaking output, and sometimes you need a UI to manage them. Now that LM Studio supports MLX models, it's one of the most efficient, too.

I'm not bullish on MCP, but at the least this approach gives a good way to experiment with it for free.
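
If you do want to experiment, LM Studio loads MCP servers from an mcp.json file; as far as I can tell it follows the same `mcpServers` shape other MCP hosts use, so something like this should be enough to wire up the reference filesystem server (the allowed path is a placeholder):

    {
      "mcpServers": {
        "filesystem": {
          "command": "npx",
          "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
        }
      }
    }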

replies(4): >>44380220, >>44380533, >>44380699, >>44381188
zackify No.44381188
Ollama doesn't even have a way to customize the context size per model and persist it. LM Studio does :)
replies(1): >>44382206
Anaphylaxis No.44382206
This isn't true. You can run `ollama run {model}`, then `/set parameter num_ctx {ctx}`, and then `/save`. It's recommended to use `/save {model}:{ctx}` so the setting persists across model updates. A sketch of that session follows.
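
For concreteness, here's roughly what that looks like in the REPL (the model name and context size are just example values):

    $ ollama run llama3.1            # opens the interactive prompt
    >>> /set parameter num_ctx 8192  # raise the context window
    >>> /save llama3.1:8k            # save as a named variant so a model update won't overwrite it
    >>> /bye
    $ ollama run llama3.1:8k         # the larger context is now baked into the variant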
replies(2): >>44385978, >>44386362
truemotive No.44385978
This can be done with a custom Modelfile as well (see the sketch below the link); I was pretty bent when I found out that 2048 was the default context length.

https://ollama.readthedocs.io/en/modelfile/
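
The Modelfile route is just the persistent form of the same thing; a minimal sketch (base model and variant name are example values):

    # Modelfile
    FROM llama3.1
    PARAMETER num_ctx 8192

    # then build the variant from it:
    # ollama create llama3.1-8k -f Modelfile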