
756 points dagurp | 8 comments
1. bloopernova ◴[] No.36882508[source]
Would this end up breaking curl, or any other tool that makes HTTPS requests?
replies(3): >>36882597 #>>36883468 #>>36885184 #
2. fooyc ◴[] No.36882597[source]
Yes, it will.
replies(1): >>36883267 #
3. pdanpdan ◴[] No.36883267[source]
How?
replies(1): >>36883425 #
4. toyg ◴[] No.36883425{3}[source]
The whole point of WEI is that the site can choose to block any combination of browser and OS it sees fit, in a reliable way (currently, browsers can freely lie). curl and friends will almost immediately be branded as bots and banned - that's the stated objective.
replies(2): >>36883609 #>>36883638 #
5. collaborative ◴[] No.36883468[source]
It will, but curl and others will likely just be extended with a Puppeteer-style driver that plugs into your Chrome runtime. So this will have prevented nothing (except forcing non-technical users to adopt Chrome, killing off new browser entrants, and offering the chance to force-feed even more Google ads)
6. pdanpdan ◴[] No.36883609{4}[source]
How?

The page must first load, then it requests an attestation using JS and sends it back to the server for further use (like a reCAPTCHA token).

So for something like curl there might be no change at all.

https://github.com/RupertBenWiser/Web-Environment-Integrity/...
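The flow in the comment above can be sketched as two server handlers. Everything here is hypothetical (names, token format, and endpoints are illustrative, not the WEI spec), but it shows why a plain curl fetch of the page itself still succeeds and only a later, token-gated request fails:

```python
# Hypothetical sketch of a WEI-style flow: the page loads first, then in-page
# JS obtains an attestation token and posts it back, like a reCAPTCHA token.

def serve_page(request: dict) -> tuple[int, str]:
    # Step 1: the initial page load requires no attestation,
    # so curl can fetch this just fine.
    return 200, "<html><script>/* JS would request the attestation here */</script></html>"

def serve_protected(request: dict, valid_tokens: set[str]) -> tuple[int, str]:
    # Step 2: a later request carries the token the JS obtained.
    # curl never ran the JS, has no token, and is rejected here.
    token = request.get("attestation_token")
    if token in valid_tokens:
        return 200, "protected content"
    return 403, "attestation required"

page_status, _ = serve_page({})                                      # curl-style fetch
ok_status, _ = serve_protected({"attestation_token": "tok1"}, {"tok1"})
denied_status, _ = serve_protected({}, {"tok1"})                     # no token
```

Whether curl breaks therefore depends entirely on whether the site gates its actual content behind that second step.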

7. snvzz ◴[] No.36883638{4}[source]
It is more severe than that. The design favors a whitelist approach: only browsers that can obtain an attestation from a "trusted source" are allowed; browsers that cannot are blocked.
8. pravus ◴[] No.36885184[source]
Yes and no.

The attestation API will allow websites to verify certain things about the user agent, which they may then use to either deny access or alter the access for the requested resource. This is similar to existing methods of checking the "User-Agent" header string, but is much more robust to tampering because it can rely on a full chain of trust back to the attester.
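The tamper-resistance difference can be sketched with signed tokens. This is a loose analogy, not the WEI design (real attestation would use the attester's public-key signatures rather than a shared HMAC key), but it shows why a signed claim is unlike a User-Agent header, which any client can rewrite:

```python
import hashlib
import hmac

ATTESTER_KEY = b"attester-signing-key"  # stand-in for the attester's key

def attest(claims: str) -> str:
    # The trusted attester signs the claims; clients can't forge the signature.
    sig = hmac.new(ATTESTER_KEY, claims.encode(), hashlib.sha256).hexdigest()
    return f"{claims}.{sig}"

def verify(token: str) -> bool:
    # The site checks the signature instead of trusting self-reported strings.
    claims, _, sig = token.rpartition(".")
    expected = hmac.new(ATTESTER_KEY, claims.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

genuine = attest("browser=Chrome;os=Android")
forged = "browser=Chrome;os=Android.deadbeef"  # spoofed, like a fake User-Agent
```

Here `verify(genuine)` succeeds while `verify(forged)` fails, whereas a spoofed User-Agent header is indistinguishable from a real one.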

So will existing tools work with this?

Websites that do not require attestation should work fine. This will probably be the vast majority of websites.

Websites that require attestation may or may not work depending on the result of the attestation. Since programs like curl currently provide no mechanism to perform attestation, their requests will fail it. If the website is configured to disallow failed attestation attempts, then tools like curl will no longer be able to access the resources that attested user agents can.

My opinion is that attestation will likely be used by any website with a large media presence (copyright/DRM), a large data presence (resource utilization/streams), high security requirements, or any large company willing to completely segment its web resources into attested and non-attested versions. Tools like curl will no longer work with these sites until either a suitable attestation system is added to them, or the company changes its attestation policy.