
Claude for Chrome

(www.anthropic.com)
795 points | davidbarker | 1 comment
medhir No.45031022
Personally, the only way I’m going to give an LLM access to a browser is if I’m running inference locally.

I’m sure there are exploits that could be embedded into a model that make running locally risky as well, but giving a remote model from Anthropic, OpenAI, etc. access to my browser just seems foolish.

Anyone having success with local LLMs and browser use?
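
For concreteness, this is the shape of the setup I mean: inference stays on localhost (an Ollama server here, but any local runner works) and Playwright drives the browser on the same machine. A minimal sketch, not production code; the model name and prompt are placeholders:

    # Assumes: Ollama running on localhost:11434 with a model pulled,
    # plus `pip install playwright requests` and `playwright install chromium`.
    import requests
    from playwright.sync_api import sync_playwright

    OLLAMA_URL = "http://localhost:11434/api/generate"
    MODEL = "llama3.1"  # placeholder; use whatever model you have locally

    def ask_local_llm(prompt: str) -> str:
        # One non-streaming completion against the local Ollama API.
        resp = requests.post(
            OLLAMA_URL,
            json={"model": MODEL, "prompt": prompt, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["response"]

    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://example.com")
        page_text = page.inner_text("body")[:4000]  # keep the prompt small
        print(ask_local_llm(f"Summarize this page in two sentences:\n\n{page_text}"))
        browser.close()

Nothing in that loop leaves the machine, which is the whole point for me.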

replies(3): >>45031462 >>45031772 >>45033430
1. alienbaby No.45031462
I'm not sure how running inference locally makes any difference whatsoever. Or do you also mean self-hosting the MCP tools it has access to?
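
To be concrete about the distinction: the MCP tools are typically local processes spawned by the client anyway; what local inference changes is only where the page content and the model's decisions get sent. A rough sketch of such a local tool server, assuming the official mcp Python SDK and Playwright (the tool name and behavior are just illustrative):

    # A locally hosted MCP server exposing one browser tool over stdio.
    # Assumes: `pip install mcp playwright` and `playwright install chromium`.
    from mcp.server.fastmcp import FastMCP
    from playwright.async_api import async_playwright

    mcp = FastMCP("local-browser")

    @mcp.tool()
    async def read_page(url: str) -> str:
        """Fetch a URL in a local headless browser and return its visible text."""
        async with async_playwright() as p:
            browser = await p.chromium.launch(headless=True)
            page = await browser.new_page()
            await page.goto(url)
            text = await page.inner_text("body")
            await browser.close()
            return text[:4000]  # truncate so the tool output stays small

    if __name__ == "__main__":
        mcp.run()  # stdio transport; the MCP client launches this process locally

This server runs on the user's machine whether the model calling read_page is local or a hosted Claude/GPT endpoint; the difference is only whether the returned page text leaves the box.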