
127 points by parsabg | 2 comments

Hey HN,

I'm excited to share BrowserBee, a privacy-first AI assistant in your browser that allows you to run and automate tasks using your LLM of choice (currently supports Anthropic, OpenAI, Gemini, and Ollama). Short demo here: https://github.com/user-attachments/assets/209c7042-6d54-4fc...

Inspired by projects like Browser Use and Playwright MCP, its main advantage is the browser extension form factor, which makes it more convenient for day-to-day use, especially for less technical users. It's also a bit less cumbersome to use on websites that require you to be logged in, since it attaches to the same browser instance you already use. (On privacy: the only data that leaves your browser is the communication with the LLM; there is no tracking or data collection of any sort.)

Some of its core features are as follows:

- a memory feature that lets you memorize common and useful pathways, making subsequent runs of those tasks faster and cheaper

- real-time token counting and cost tracking (inspired by Cline)

- an approval flow for critical tasks such as posting content or making payments (also inspired by Cline); a rough sketch of such a gate follows this list

- tab management allowing the agent to execute tasks across multiple tabs

- a range of browser tools for navigation, tab management, interactions, etc., broadly in line with Playwright MCP
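
To make the approval flow concrete, here is a minimal TypeScript sketch of how a gate on critical actions could work. All names here (ToolCall, CRITICAL_TOOLS, requestUserApproval, executeTool) are hypothetical and only illustrate the idea, not BrowserBee's actual code:

```typescript
// Hypothetical sketch of an approval gate for critical browser actions.
// Every identifier below is made up for illustration.

interface ToolCall {
  name: string;
  args: Record<string, unknown>;
}

// Tools whose side effects warrant explicit user confirmation.
const CRITICAL_TOOLS = new Set(["post_content", "submit_payment"]);

async function dispatchTool(call: ToolCall): Promise<string> {
  if (CRITICAL_TOOLS.has(call.name)) {
    // Surface the pending action in the extension UI and wait for a decision.
    const approved = await requestUserApproval(call);
    if (!approved) {
      return `User rejected ${call.name}; action skipped.`;
    }
  }
  return executeTool(call);
}

// Stub: would render a confirmation dialog in the extension's side panel.
declare function requestUserApproval(call: ToolCall): Promise<boolean>;
// Stub: runs the underlying Playwright-style browser tool.
declare function executeTool(call: ToolCall): Promise<string>;
```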

I'm actively developing BrowserBee and would love to hear any thoughts, comments, or feedback.

Feel free to reach out via email: parsa.ghaffari [at] gmail [dot] com

-Parsa

1. matula (No.44022518)
Very nice. I tried it with Ollama and it works well.

The biggest issue is that the Ollama models are hardcoded to Qwen3 and Llama 3.1. I imagine most Ollama users have their own favorites, and these probably vary quite a bit. My main model is usually Gemma 3 12B, which does support images.

It would be a nice feature to have a custom model config on the Ollama settings page: save it to Chrome storage and use it in the 'getAvailableModels' method alongside the hardcoded models.
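
If it helps, here is a minimal TypeScript sketch of that idea. The storage key, the OllamaModel shape, and this getAvailableModels signature are my own assumptions; only the method name and the Qwen3/Llama 3.1 defaults come from the thread:

```typescript
// Sketch: merge user-defined Ollama models saved in Chrome storage with the
// hardcoded defaults. Key name and types are assumptions, not BrowserBee code.

interface OllamaModel {
  id: string;           // e.g. "gemma3:12b"
  supportsImages: boolean;
}

const HARDCODED_MODELS: OllamaModel[] = [
  { id: "qwen3", supportsImages: false },
  { id: "llama3.1", supportsImages: false },
];

async function getAvailableModels(): Promise<OllamaModel[]> {
  // Custom entries would be written by the Ollama settings page.
  const { customOllamaModels = [] } =
    await chrome.storage.sync.get("customOllamaModels");
  // De-duplicate by id, letting user-defined entries override the defaults.
  const byId = new Map<string, OllamaModel>();
  for (const model of [...HARDCODED_MODELS, ...customOllamaModels]) {
    byId.set(model.id, model);
  }
  return [...byId.values()];
}
```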

2. parsabg (No.44022778)
Great suggestion, will add custom Ollama configurations in the next release.