MCP in LM Studio (lmstudio.ai)

225 points by yags | 1 comment
xyc:
Great to see more local AI tools supporting MCP! I recently added MCP support to recurse.chat as well. When running locally (llama.cpp and Ollama), tool-calling capability (for example, tool-call accuracy and parallel tool calls) still needs to catch up to the well-known providers, but it's starting to get pretty usable.
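
The gap the comment describes is concrete: a client that supports parallel tool calls can run every call the model emits in one assistant turn, while weaker local models often emit a single call per turn and need extra round trips. A minimal sketch of the dispatch side, with hypothetical tool names (`get_weather`, `get_time` are illustrative, not part of recurse.chat or MCP):

```python
import json

# Hypothetical local tools; names and signatures are illustrative only.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

def get_time(tz: str) -> str:
    return f"12:00 in {tz}"

TOOLS = {"get_weather": get_weather, "get_time": get_time}

def dispatch(tool_calls):
    """Execute every tool call the model emitted in one assistant turn.

    A model with reliable parallel tool calling returns several entries
    here at once; many local models emit only one call per turn, so the
    client must loop through extra model round trips instead.
    """
    results = []
    for call in tool_calls:
        fn = TOOLS[call["name"]]
        args = json.loads(call["arguments"])  # arguments arrive as a JSON string
        results.append({"name": call["name"], "content": fn(**args)})
    return results

# Simulated assistant turn containing two parallel tool calls.
turn = [
    {"name": "get_weather", "arguments": '{"city": "Paris"}'},
    {"name": "get_time", "arguments": '{"tz": "UTC"}'},
]
print(dispatch(turn))
```

Tool-call *accuracy* is the other axis: a local model may emit a name not in the registry or malformed JSON arguments, which is where the error handling a real client needs would go.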
rshemet (in reply):
Hey! We're building Cactus (https://github.com/cactus-compute), effectively an Ollama for smartphones.

I'd love to learn more about your MCP implementation. Wanna chat?