
166 points rldjbpin | 4 comments
deadbabe ◴[] No.45114029[source]
The stakes aren’t that high yet for Ollama to warrant cumbersome auth mechanisms.
replies(2): >>45114139 #>>45114224 #
1. reilly3000 ◴[] No.45114224[source]
If any MCP servers are running, anyone with access to query the chat endpoint can use them. That could include file system access, GitHub tokens and more.
replies(3): >>45114274 #>>45114972 #>>45115474 #
2. jangxx ◴[] No.45114274[source]
ollama can't connect to MCP servers; it can merely run models whose output instructs a connected system to contact an MCP server (e.g. mcphost uses ollama to run a prompt, then itself connects to an MCP server if the response calls for it).
3. stoneyhrm1 ◴[] No.45114972[source]
The LLM endpoint via ollama or huggingface is not the one executing MCP tool calls; that is done by the client interacting with the LLM. All the LLM does is take a prompt as input and produce a text output, that's it. Anything else is just a wrapper.
4. deadbabe ◴[] No.45115474[source]
That is completely false, ollama has nothing to do with running commands, it just processes prompts into text responses.