237 points robin_reala | 1 comments
scq No.43514594
This seems like a bug in the ScreenAI service? There's no evidence whatsoever for his claim that Google "trains a machine vision model on the contents of my screen".

According to https://chromium.googlesource.com/chromium/src/+/main/servic... it is just inference.

> These functionalities are entirely on device and do not send any data to network or store on disk.

There is also this description in the Chrome OS source code:

> ScreenAI is a binary to provide AI based models to improve assistive technologies. The binary is written in C++ and is currently used by ReadAnything and PdfOcr services on Chrome OS.

replies(2): >>43514631 #>>43516050 #
bri3d No.43514631
This. Go to chrome://flags and disable “Enable OCR For Local Image Search” and I bet the problem goes away.
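The same toggle can in principle be set from the command line with Chrome's feature switches. Note this is a sketch only: the internal feature name below is a guess, since the chrome://flags page shows only the display name, not the underlying feature string.

```shell
# Hypothetical feature name -- the chrome://flags entry
# "Enable OCR For Local Image Search" maps to some internal
# base::Feature string; verify the actual name in the Chromium
# source before relying on this.
google-chrome --disable-features=LocalImageSearchOcr
```

Command-line feature overrides like this are transient (they apply only to that launch), whereas flipping the entry in chrome://flags persists across restarts.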

It’s a stupid feature for Google to enable by default on systems that are generally very low spec and badly made, but it’s not some evil data slurp. One of the most obnoxious things about enshittification is the corrosive effect it seems to have had on technical users’ curiosity: instead of researching and fixing problems, people now seem very prone to jump to “the software is evil and bad” and give up at doing any kind of actual investigation.

replies(8): >>43514714 #>>43514728 #>>43514740 #>>43514895 #>>43514932 #>>43515032 #>>43515378 #>>43515585 #
TeMPOraL No.43515032
> One of the most obnoxious things about enshittification is the corrosive effect it seems to have had on technical users’ curiosity: instead of researching and fixing problems, people now seem very prone to jump to “the software is evil and bad” and give up at doing any kind of actual investigation.

There's little here worth being curious about. Tech companies made sure of that. They mostly aren't doing anything particularly groundbreaking in situations like these - they're doing the stupid or the greedy thing. And, on the off chance that the tech involved is in any way interesting, it tends to have decades of security research behind it applied to mathematically guarantee we can't use it for anything ourselves - and in case that isn't enough, there's also decades of legal experience applied to stop people from building on top of the tech.

Nah, it's one thing to fix bugs for the companies back when they tried or pretended to be friendly; these days, when half the problems are intentional malfeatures or bugs in those malfeatures, it stops being fun. There are other things to be curious about, that aren't caused by attempts to disenfranchise regular computer users.

replies(1): >>43515305 #
bri3d No.43515305
> There's little here worth being curious about.

I’m all for OP returning the computer Google broke, as sibling comments have suggested, but the curiosity route would have been fruitful for them too; I’m pretty sure the flag I posted or one of the adjacent ones will fix their issue.

I also personally found this feature kind of interesting in itself; I didn’t know that Google were doing model-based OCR and content extraction.

> on the off chance that the tech involved is in any way interesting, it tends to have decades of security research behind it applied to mathematically guarantee we can't use it for anything ourselves

My current profession and hobby is literally breaking these locks and I’m still not quite sure what you mean here. What interesting tech do you feel you can’t use or apply due to security research?

> there's also decades of legal experience applied to stop people from building on top of the tech.

Again… I’m genuinely curious what technology you feel is locked up in a legal and technical vault?

I feel that we’ve really been in a good age lately for fundamental technologies, honestly - a massive amount of AI research is published, almost all computing related sub-technologies I can think of are growing increasingly strong open-source and open-research communities (semiconductors all the way from PDK through HDL and synthesis are one space that’s been fun here recently), and with a few notable exceptions (3GPP/mobile wireless being a big one), fewer cutting edge concepts are patent encumbered than ever before.

> There are other things to be curious about, that aren't caused by attempts to disenfranchise regular computer users.

If anything I feel like this is a counter-example? It’s an innocuous and valuable feature with a bug in it. There’s nothing weird or evil going on to intentionally or even unintentionally disenfranchise users. It’s something with a feature toggle that’s happening in open-source code.

> it's one thing to fix bugs for the companies back when they tried or pretended to be friendly

Here, we can agree. If a company is going to ship automatic updates, it needs to be more careful about regressions than this, and it doesn’t deserve any benefit of the doubt on that.