
229 points by modinfo | 1 comment
onion2k | No.40834838
My first impression is that this should enable approximately what Apple is doing with their AI strategy (local on-device first, then falling back to a first-party API, and finally to something like ChatGPT), but for web users. Having it native in the browser could be really positive for a lot of use cases, depending on whether the local version can do things like RAG over locally stored data and generate structured output like JSON.

I don't think this is a terrible idea. LLM-powered apps are here to stay, so browsers making them better is a good thing. Using a local model so queries aren't flying around to random third parties is better for privacy and security. If Google can make this work well, it could be really interesting.
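For illustration only, here's a rough sketch of the "local on-device first, then fall back to a first-party API" pattern from page script. The `window.ai.createTextSession()` surface, the `summarize` function, and the `/api/summarize` endpoint are assumptions for the sketch; Chrome's experimental built-in API has gone by several names and isn't specified in this thread.

```typescript
// Sketch, not the actual API: feature-detect an on-device model and
// fall back to a first-party server endpoint if it's unavailable.
async function summarize(text: string): Promise<string> {
  const ai = (window as any).ai; // hypothetical built-in AI entry point

  if (ai?.createTextSession) {
    // Local, on-device model first: the prompt never leaves the machine.
    const session = await ai.createTextSession();
    return session.prompt(`Summarize in one sentence:\n${text}`);
  }

  // Fallback: a first-party server API (hypothetical URL).
  const res = await fetch("/api/summarize", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text }),
  });
  const { summary } = await res.json();
  return summary;
}
```

The interesting part privacy-wise is the first branch: if the local path is available, nothing in that flow has to touch a third-party service at all.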

replies(5): >>40834888 >>40835125 >>40835392 >>40841907 >>40843249
1. smolder | No.40843249
It definitely is a terrible idea, but it follows naturally from the "browser is an operating system" approach the industry has been taking for quite a while.

With this, Goog gets to offload AI work to clients, but can (and will, I guarantee) sample the interactions, calling it "telemetry" and perhaps saying it's for "safety," rather than admitting it's blatant Orwellian spying.