756 points dagurp | 34 comments
1. haburka ◴[] No.36882152[source]
Very controversial take, but I think this benefits the vast majority of users by allowing them to bypass captchas. I'm assuming that people would use this API to avoid showing real users captchas, not to completely prevent them from browsing the web.

Unfortunately people who have rooted phones, who use nonstandard browsers are not more than 1% of users. It’s important that they exist, but the web is a massive platform. We can not let a tyranny of 1% of users steer the ship. The vast majority of users would benefit from this, if it really works.

However, I could see this tool being abused by certain websites, especially banks, to prevent users on nonstandard browsers from logging in. Unfortunate, but overall beneficial to the masses.

Edit: Apparently 5% of the time it intentionally omits the result so it can’t be used to block clients. Very reasonable solution.
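A rough sketch of that hold-back idea (the verdict names and server-side framing are made up for illustration; only the ~5% withholding rate comes from the discussion above):

```python
import random

HOLDBACK_RATE = 0.05  # fraction of requests where the verdict is withheld


def attestation_verdict(device_is_valid, rng=random):
    """Return an attestation verdict, or None for a randomly held-back request.

    Because ~5% of verdicts are withheld even for perfectly valid devices, a
    site that hard-blocks on a missing verdict would also lock out ~5% of its
    legitimate users, so it is pushed to treat "no verdict" as a soft signal
    rather than a reason to deny access.
    """
    if rng.random() < HOLDBACK_RATE:
        return None  # verdict intentionally omitted
    return "meets-integrity" if device_is_valid else "fails-integrity"
```

The deterrent only works if the hold-back rate is high enough that blocking on a missing verdict is visibly costly to the site.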

replies(9): >>36882205 #>>36882206 #>>36882230 #>>36882275 #>>36882280 #>>36882408 #>>36882411 #>>36882428 #>>36882700 #
2. JohnFen ◴[] No.36882205[source]
> I think this benefits the vast majority of users by allowing them to bypass captchas.

I don't think it does that. Nothing about this reduces the problem that captchas are attempting to solve.

> i could see that this tool would be abused by certain websites and prevent users from logging in if on a non standard browser, especially banks.

That's not abusing this tool. That's the very thing that this is intended to allow.

replies(2): >>36882282 #>>36882284 #
3. adamrezich ◴[] No.36882206[source]
How often do normal users see CAPTCHAs these days? I seldom see one anymore.
replies(4): >>36882290 #>>36882299 #>>36882915 #>>36885097 #
4. version_five ◴[] No.36882230[source]
Most captchas these days are already only there to enforce Google's monopoly. If you use an "approved" browser and let them track you, you don't get one; browse anonymously and you can't get past. That ship has already sailed, and it's already evil, anticompetitive behavior.
5. wbobeirne ◴[] No.36882275[source]
> Unfortunately people who have rooted phones, who use nonstandard browsers are not more than 1% of users

Depends on what you count as "nonstandard", but various estimates put non-top-6 browser usage at between 3% and 12% (https://en.wikipedia.org/wiki/Usage_share_of_web_browsers#Su...) and non-Windows/macOS/iOS/Android usage at ~4% (https://en.wikipedia.org/wiki/Usage_share_of_operating_syste....) These estimates also don't account for traffic from older operating systems or hardware that would be incompatible with these attestations, or from clients that spoof their user agent for anonymity.

In an ideal world, we would see this number grow, not shrink. It's not good for consumers if our choices dwindle to just one or two options.

6. dotancohen ◴[] No.36882280[source]

  > We can not let a tyranny of 1% of users steer the ship.

Far fewer than 1% of my users use the accessibility features; in fact, it is closer to 1% of 1%. Does the far, far easier development and bug testing I would enjoy justify dropping those features?
7. ◴[] No.36882282[source]
8. ec109685 ◴[] No.36882284[source]
The explicit goals are thus:

* Allow web servers to evaluate the authenticity of the device and honest representation of the software stack and the traffic from the device.

* Offer an adversarially robust and long-term sustainable anti-abuse solution.

* Don't enable new cross-site user tracking capabilities through attestation. Continue to allow web browsers to browse the Web without attestation.

From: https://github.com/RupertBenWiser/Web-Environment-Integrity/...

If it actually won't do any of those things, then that should be debated first.

replies(1): >>36882329 #
9. dotancohen ◴[] No.36882290[source]
I see them all the time. Firefox on Ubuntu.
replies(1): >>36882321 #
10. hellojesus ◴[] No.36882299[source]
I get them often, especially with more privacy features turned on. But usually a VPN is enough to trigger it when visiting Google domains.
replies(1): >>36882397 #
11. adamrezich ◴[] No.36882321{3}[source]
but where? which websites? I haven't seen a CAPTCHA on the reg since I stopped posting to 4chan some years back.
replies(2): >>36882476 #>>36882980 #
12. JohnFen ◴[] No.36882329{3}[source]
Captchas are intended to stop bots. WEI is intended to vet that the hardware and browser have been validated. That doesn't impact bots, because you can implement bots on top of valid hardware and a valid browser, so they will pass the WEI check.
replies(3): >>36882491 #>>36883374 #>>36886484 #
13. Given_47 ◴[] No.36882397{3}[source]
There are a select few Mullvad addresses that don't trigger it, but with the majority I use I'll get hit with them.

I honestly find it more concerning when I’m expecting one and I don’t get served a ridiculous puzzle to solve.

14. jdrek1 ◴[] No.36882408[source]
> We can not let a tyranny of 1% of users steer the ship.

Normally I'd agree with you that the tyranny of the minority is a bad thing, but sometimes the minority actually has a point, and this is one of the cases where the minority is _objectively_ correct and letting the majority decide would end in a complete dystopia. Democracy only works if everyone is informed (and able to think logically/critically, not influenced either by force or by salary, etc.), and in this case the 99% simply have no clue about the effects of this being implemented (nor do they care). This entire proposal is pure Orwellian shit.

15. idreyn ◴[] No.36882411[source]
WEI acts as proof that "this is a browser", not "this is a human". But browsers can be automated with tools like Selenium. I'd guess that with the advent of complicated, JS-based captchas, browsers under automation are already the major battleground between serious scrapers and anti-bot tools.

I also don't understand how WEI does much to prevent a motivated user from faking requests. If you have Chrome running on your machine it's not gonna be too hard to extract a signed WEI token from its execution, one way or another, and pass that along with your Python script.
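Concretely, replaying a captured token needs nothing more than attaching it to an ordinary scripted request. A minimal sketch, with the caveat that the header name and token value here are invented: the draft spec delivers the attestation token to page JavaScript, and how a site ships it back to its own server is up to the site.

```python
import urllib.request

# Token captured from a real, attested Chrome session (e.g. via a local
# debugging proxy). Both the value and the "X-Attestation-Token" header
# name are hypothetical placeholders for this sketch.
stolen_token = "eyJhbGciOi...captured-from-real-browser..."

req = urllib.request.Request(
    "https://example.com/protected",
    headers={
        "X-Attestation-Token": stolen_token,
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    },
)
# urllib.request.urlopen(req) would now replay the attested token from a
# plain script; the server sees only a token it considers valid.
```

Binding tokens tightly to one TLS session or one request would raise the effort required, but the token still originates on hardware the user controls.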

It looks like it basically gives Google another tool to constrain users' choices.

replies(1): >>36882681 #
16. mindslight ◴[] No.36882428[source]
That is not controversial at all, but rather a plain fact about the short term incentives! If adoption of this technology weren't an attractor, then we'd have nothing to worry about. But the problem is the functionality of this spec, supported by the fundamental backdoor of corporate TPMs, is set up to facilitate power dynamics that inevitably result in full corporate control over everyone's computing environment.
17. hnav ◴[] No.36882476{4}[source]
Most websites have them; just browse in incognito and either override your user agent to something funky or connect through a known VPN.
replies(1): >>36883318 #
18. jrockway ◴[] No.36882491{4}[source]
I remember the discussions on Slashdot many years ago about the "analog hole": you can have all the DRM you want, but people can still point a camera at the screen and record a non-encumbered copy that way. The same applies to automating web activities; you take a trusted computer, point a camera at it, and have your bot synthesize keypresses and mouse movements. There is absolutely no way for a website at the other end of the Internet to know whether a human is using the computer. (I see this as the "end game" for FPS cheating. I don't think anyone is doing it yet, but it's bound to happen.)

I'm guessing the reason we want attestation is so that Chrome can drop ad blockers and websites can drop non-Chrome browsers. But there is no reason why you can't do the thing where you point a video camera at a monitor, have AI black out the ads, and then view the edited video feed instead of the real one.

The only use for attestation I see is for work-from-home corporate Intranets. Sure, make sure that OS is up to date before you're willing to send High-Value Intellectual Property to the laptop. That... already works and doesn't involve web standards. (At my current job, I'm in the hilarious position where all of our source code is open-source and anyone on Earth can edit it, but I have to use a trusted computer to do things like anti-discrimination training. It's like opsec backwards. But, the attestation works fine, no new tech needed.)

replies(2): >>36883271 #>>36889696 #
19. Spivak ◴[] No.36882681[source]
> But browsers can be automated with tools like Selenium

And I will bet anything that if the browser is being instrumented via webdriver it will attest as such. You would have to automate the browser externally.

replies(1): >>36883527 #
20. insanitybit ◴[] No.36882700[source]
There are obvious benefits here. The ability to remove captchas is one, the ability to ensure that clients are running the latest updates before accessing sensitive content, etc.

But the power is too significant. If it were some small subset of positive assertions I'd be ok with this, but the ability to perform arbitrary attestation is beyond what is required and is far too abusable.

21. Ylpertnodi ◴[] No.36882915[source]
Quite a few: Brave browser + Mullvad VPN. I enjoy doing captchas wrong, mainly because I can't believe how US fire hydrants, buses, and crosswalks have become so important to me.
22. i_love_cookies ◴[] No.36882980{4}[source]
almost anything using cloudflare
23. pests ◴[] No.36883271{5}[source]
> and have your bot synthesize keypresses and mouse movements

Is this truly going to work, though? Captcha providers already monitor mouse and keyboard movement while you're on the page. Can you really "synthesize" human-like mouse movements around the page? I'm not so sure.

replies(3): >>36883422 #>>36883685 #>>36883927 #
24. pests ◴[] No.36883318{5}[source]
Ah, just do something completely nonstandard (sans incognito) and the website will stop working.
25. danShumway ◴[] No.36883374{4}[source]
This is also how you know the "this won't impact extensions" talk is likely nonsense.

If you can still run extensions you still need captchas. So one possible road this takes is Google launches it, everybody still uses captchas because extensions in desktop browsers still make automating requests trivial -- and then we lock down extensions because "we already locked down the hardware and we really do need to do something about captchas..."

26. jrockway ◴[] No.36883422{6}[source]
I am sure you can. This is exactly what AI excels at!
27. danShumway ◴[] No.36883527{3}[source]
Will it attest that it's running an extension? I can intercept and modify web requests, redirect web requests, and send web requests to other domains through a web extension. I can also scrape the HTML and I can use native messaging or normal HTTP requests to send that information out of the browser. And I can also modify CORS headers to get rid of restrictions around sending requests from another domain.

I can't literally emulate mouse movements but the only place that matters is... captchas. If you're not watching for those kinds of behaviors, then a browser even without webdriver can be automated just fine. And if you are watching for those behaviors, then you're running a captcha, so what is WEI helping with?

Google claims this is not going to impact browser extensions, debugging, etc... but if it's not going to impact that stuff, then it's not really helpful for guaranteeing that the user isn't automating requests. What it is helpful for is reducing user freedom around their OS/hardware and setting the stage for attacking extensions like adblockers more directly in the future.

28. tikhonj ◴[] No.36883685{6}[source]
Captcha providers can't rely exclusively on mouse movement because of accessibility considerations, and it seems pretty easy to emulate human-like keyboard interaction. Emulating realistic mouse movement is more difficult but probably doable too.
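For the keyboard half, a common modelling assumption is that human inter-key intervals look roughly log-normal: mostly short gaps with the occasional long pause, and never superhuman bursts. A minimal sketch under that assumption (the 120 ms median and 30 ms floor are illustrative guesses, not measured values):

```python
import math
import random


def human_keystroke_delays(text, rng, median_ms=120.0, sigma=0.5):
    """Yield (char, delay_ms) pairs with log-normally distributed
    inter-key delays, a rough model of human typing rhythm."""
    mu = math.log(median_ms)  # log-normal median in milliseconds
    for ch in text:
        delay = rng.lognormvariate(mu, sigma)
        # Clamp away implausibly fast keystrokes (sub-30 ms reads as a bot).
        yield ch, max(delay, 30.0)
```

A driver would then sleep for each delay before sending the keystroke; per-digraph timing (e.g. "th" faster than "q9") would make it more convincing still.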
replies(1): >>36885667 #
29. JohnFen ◴[] No.36883927{6}[source]
> Can you really "synthesize" human-like mouse movements around the page?

Yes. It's not even very hard.

30. drbawb ◴[] No.36885097[source]
I built a new PC for a friend, and getting the AM5 platform stable was ridiculously challenging, so there were several reinstallations of Windows involved. He didn't use a password manager, so there was a lot of logging in, password resets, etc. For virtually every service he had to log in to, he was asked to complete a CAPTCHA. For Steam in particular, he had to do the first login on the website, because the CAPTCHA inside the application appeared to be bugged and was more like psychological warfare than human verification. The frustration was palpable.

Also, turn on a VPN some time (a signal to Google et al. that you're trying to bypass content region restrictions, or funnel mobile traffic through an ad blocker) and you are basically guaranteed to see nothing but CAPTCHAs from the predominantly Cloudflare-owned-and-operated Internet.

So yes, it's a big problem, but only if your web environment (tracking metadata) is not sufficiently "trusted" :D

31. hellojesus ◴[] No.36885667{7}[source]
I bet it's pretty easy. Capture your own mouse movements from one place to the next as denoted by clicks. Then train a model on reproducing those movements, using your captured data of movement from points A to B. It would probably generalize well enough to pass the verifications. Humans are very unpredictable, so I assume those are mostly looking for superhuman speed and accuracy.
replies(1): >>36887485 #
32. ec109685 ◴[] No.36886484{4}[source]
> We're still discussing whether each of the following pieces of information should be included and welcome your feedback:

  * The device integrity verdict must be low entropy, but what granularity of verdicts should we allow? Including more information in the verdict will cover a wider range of use cases without locking out older devices.
  * A granular approach proved useful previously in the Play Integrity API.
  * The platform identity of the application that requested the attestation, like com.chrome.beta, org.mozilla.firefox, or com.apple.mobilesafari.
  * Some indicator enabling rate limiting against a physical device
33. costco ◴[] No.36887485{8}[source]
https://github.com/vincentbavitz/bezmouse

> BezMouse is a lightweight tool written in Python to simulate human-like mouse movements with Bézier curves. Some applications might include:

> BezMouse was originally written for a RuneScape color bot and has never triggered macro detection in over 400 hours of continuous use.

:)
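For the curious, the Bézier trick can be sketched in a few lines (this is an illustration of the idea, not bezmouse's actual code): randomize the control points of a cubic Bézier so every stroke between the same two points follows a slightly different curved path.

```python
import random


def bezier_mouse_path(start, end, rng, steps=50, wobble=80.0):
    """Return (x, y) points along a cubic Bezier curve from start to end,
    with randomly offset control points so each generated path curves
    differently -- a rough imitation of a human mouse stroke."""
    (x0, y0), (x3, y3) = start, end
    # Control points near the 1/3 and 2/3 marks, pushed off the straight line.
    x1 = x0 + (x3 - x0) / 3 + rng.uniform(-wobble, wobble)
    y1 = y0 + (y3 - y0) / 3 + rng.uniform(-wobble, wobble)
    x2 = x0 + 2 * (x3 - x0) / 3 + rng.uniform(-wobble, wobble)
    y2 = y0 + 2 * (y3 - y0) / 3 + rng.uniform(-wobble, wobble)
    path = []
    for i in range(steps + 1):
        t = i / steps
        u = 1 - t
        x = u**3 * x0 + 3 * u**2 * t * x1 + 3 * u * t**2 * x2 + t**3 * x3
        y = u**3 * y0 + 3 * u**2 * t * y1 + 3 * u * t**2 * y2 + t**3 * y3
        path.append((x, y))
    return path
```

Feed the points to the OS input layer with small randomized delays between them and the cursor trace looks far more organic than a straight interpolated line.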

34. CrimsonRain ◴[] No.36889696{5}[source]
> I see this as the "end game" for FPS cheating. I don't think anyone is doing it yet, but it's bound to happen.

You're behind the times. It's not widespread but it's been happening for years.

Also, the other day the Selenium author (IIRC) said they are working on such a thing for "automated testing".

So this proposal will do nothing to prevent bots; at best it increases their cost a little.

On the other hand, it will surely discriminate against people, emerging technologies, and new companies. No other search engines can be built. No new browsers. No openness.

Anyone supporting this proposal is either pure evil or stupid or both.