
166 points | rldjbpin | 1 comment
deadbabe No.45114029
The stakes aren’t that high yet for Ollama to warrant cumbersome auth mechanisms.
replies(2): >>45114139 >>45114224
reilly3000 No.45114224
If any MCP servers are running, anyone with access to query the chat endpoint can use them. That could include file system access, GitHub tokens and more.
replies(3): >>45114274 >>45114972 >>45115474
1. stoneyhrm1 No.45114972
The LLM endpoint, whether via Ollama or Hugging Face, is not the one executing MCP tool calls; those are executed by the client that is interacting with the LLM. All the LLM does is take a prompt as input and produce text output, that's it. Anything else is just a wrapper.
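
A minimal sketch of that split, assuming a local Ollama server on its default port and a tool-capable model (the model name, the delete_file tool schema, and the prompt are illustrative, not from the thread): the /api/chat endpoint only ever returns a tool call as inert JSON, and nothing executes unless the client chooses to act on it.

    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434/api/chat"   # default local Ollama port

    # Illustrative tool schema -- the model never runs this, it can only name it.
    TOOLS = [{"type": "function", "function": {
        "name": "delete_file",
        "description": "Delete a file on disk",
        "parameters": {"type": "object",
                       "properties": {"path": {"type": "string"}},
                       "required": ["path"]}}}]

    def chat(prompt):
        body = json.dumps({"model": "llama3.1",       # any tool-capable model
                           "messages": [{"role": "user", "content": prompt}],
                           "tools": TOOLS,
                           "stream": False}).encode()
        req = urllib.request.Request(OLLAMA_URL, data=body,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["message"]

    msg = chat("Please delete /tmp/scratch.txt")

    # The "tool call" arrives as plain data. Nothing touches the file system
    # unless this client-side code decides to execute it.
    for call in msg.get("tool_calls", []):
        print("model requested:", call["function"]["name"],
              call["function"]["arguments"])

So whether an exposed endpoint can reach the file system or GitHub tokens depends on what the surrounding client wires those tool calls to, not on the model server itself.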