
237 points robin_reala | 3 comments | source
scq ◴[] No.43514594[source]
This seems like a bug in the ScreenAI service? There's no evidence whatsoever for his claim that Google "trains a machine vision model on the contents of my screen".

According to https://chromium.googlesource.com/chromium/src/+/main/servic... it is just inference.

> These functionalities are entirely on device and do not send any data to network or store on disk.

There is also this description in the Chrome OS source code:

> ScreenAI is a binary to provide AI based models to improve assistive technologies. The binary is written in C++ and is currently used by ReadAnything and PdfOcr services on Chrome OS.
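For a sense of what "on device, inference only" means in practice, here's a rough self-contained C++ sketch of the shape of such a service. Every name and type below is made up for illustration; the real ScreenAI/Chromium interface (which is Mojo-based) looks different:

    // Hypothetical sketch of an on-device OCR/annotation service, in the
    // spirit of the ScreenAI description above. Not the actual API.
    #include <string>
    #include <vector>

    struct Bitmap {  // placeholder for a captured screen region
      int width = 0, height = 0;
      std::vector<unsigned char> pixels;
    };

    struct TextBox {  // one recognized text region
      int x = 0, y = 0, w = 0, h = 0;
      std::string text;
    };

    class OnDeviceAnnotator {
     public:
      // Model weights ship with the OS image; nothing is fetched
      // over the network.
      explicit OnDeviceAnnotator(const std::string& model_path)
          : model_path_(model_path) {}

      // Pure inference: pixels in, recognized text out. Results live
      // only in memory; nothing is written to disk or uploaded.
      std::vector<TextBox> Annotate(const Bitmap& frame) const {
        std::vector<TextBox> results;
        // ... run the bundled local model over `frame` ...
        return results;
      }

     private:
      std::string model_path_;
    };

The point is that "runs a local model over your screen" and "trains a model on the contents of your screen" are very different claims, and the source only supports the former.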

replies(2): >>43514631 #>>43516050 #
bri3d ◴[] No.43514631[source]
This. Go to chrome://flags and disable “Enable OCR For Local Image Search” and I bet the problem goes away.
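If you'd rather not click through chrome://flags, the same thing should be doable from the command line with Chrome's --disable-features switch. The internal feature name below is my guess, for illustration only; the base::Feature name behind a flag usually differs from its chrome://flags display text, so check your build:

    // Minimal sketch: relaunch Chrome with the suspected feature off.
    // "LocalImageSearchOcr" is a hypothetical placeholder name.
    #include <cstdlib>

    int main() {
      return std::system(
          "google-chrome --disable-features=LocalImageSearchOcr");
    }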

It’s a stupid feature for Google to enable by default on systems that are generally very low-spec and badly made, but it’s not some evil data slurp. One of the most obnoxious things about enshittification is the corrosive effect it seems to have had on technical users’ curiosity: instead of researching and fixing problems, people now seem very prone to jump to “the software is evil and bad” and give up on doing any kind of actual investigation.

replies(8): >>43514714 #>>43514728 #>>43514740 #>>43514895 #>>43514932 #>>43515032 #>>43515378 #>>43515585 #
throwaway48476 ◴[] No.43514728[source]
In other words, tech companies have lost the benefit of the doubt.

That's what a decade of enshittification gets them.

replies(1): >>43514913 #
1. oefrha ◴[] No.43514913{3}[source]
A decade ago people were posting these same outrage pieces about Big Tech (and Small Tech), and more often than not they turned out to be bugs/nothingburgers. I was here.
replies(1): >>43517922 #
2. josephg ◴[] No.43517922[source]
I was too. There was a period of great optimism around the web, Google, and (in part) Apple. But that was closer to 20 years ago now.

I remember talking to someone from Microsoft around that time (they were the enemy of the open-source world back then). They said the shine would wear off and everyone would get annoyed and distrustful of Google too. I remember my conscious brain agreeing, but my emotional mind loved Google - we all did. I just couldn’t imagine it.

Well. It’s pretty easy to imagine now.

replies(1): >>43525297 #
3. bri3d ◴[] No.43525297[source]
I think the fall happened a long time ago. It's funny - lately I'm often accused on HN of wearing rose-colored glasses, but when it comes to privacy I think the present is actually quite a bit better than it was 15 years ago. It's easy to forget how bad things were for a while in there.

I think Google were at their worst roughly 15 years ago. In my eyes they were doing a good job until around the time of the DoubleClick acquisition, when they pivoted from "we're going to do ads the Good Way with AdWords" to "screw it, we're just going to do ads," picked up the infamous DoubleClick cookie and the general "we profile people using every piece of data we can possibly think of" approach, and started making insane product decisions like public-contacts-by-default Google Buzz.

Since then, through a combination of courts forcing them to and what seems like a somewhat genuine internal effort, Google have been adding privacy controls back in many places. I certainly don't agree with the model still, but I think that Google in 2025 are actually much less of a privacy threat than 2010 Google were.

Outside Google, 15 years ago was also the peak of the Browser Toolbar and Installer Wrapper Infostealer era, when instead of building crypto scams or AI-wrapper companies, the hustle bros were busy building flat-out spyware.

I know I'm in the minority on HN these days, but I generally feel that the corporate _notion_ of user privacy has actually gotten a lot better since the early 2000s, while the _implementation_ has gotten worse. That is to say, companies, especially large ones, care much more about internal controls and have much less of a "we steal lots of data and figure out how to sell it later" model. Unfortunately, at the same time, we've seen the rise of "data-driven" product management, always-on updates, and "product telemetry," which erode that new attitude towards privacy at a technical level by building easily exploitable troves of sensitive information.

Of course, in exchange for large companies becoming more conscious about privacy, we now have a million smaller companies working to fill the "we steal all the data" shoes. It's still a battle that's far from won.