MCP in LM Studio

(lmstudio.ai)
227 points by yags | 2 comments
1. xyc (No.44382710)
Great to see more local AI tools supporting MCP! I've also recently added MCP support to recurse.chat. When running locally (llama.cpp and Ollama), tool calling still needs to catch up with the well-known providers in areas like tool call accuracy and parallel tool calls, but it's starting to get pretty usable.
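For context on what "tool calling" means here: MCP tool invocations are JSON-RPC 2.0 messages, where the client asks an MCP server to run a named tool via the `tools/call` method. A minimal sketch in Python of building such a request (the tool name and arguments below are hypothetical, purely for illustration):

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> dict:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool: the model asks a weather server for a forecast.
req = make_tool_call(1, "get_forecast", {"city": "Berlin", "days": 3})
print(json.dumps(req, indent=2))
```

The accuracy problem the comment refers to is the model reliably emitting the right `name` and well-formed `arguments`; "parallel tool calls" means the model emitting several such requests in one turn, which smaller local models often struggle with.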
replies(1): >>44382967
2. rshemet (No.44382967)
hey! we're building Cactus (https://github.com/cactus-compute), effectively Ollama for smartphones.

I'd love to learn more about your MCP implementation. Wanna chat?