
454 points | positiveblue
ctoth No.45068556
The web doesn't need attestation. It doesn't need signed agents. It doesn't need Cloudflare deciding who's a "real" user agent. It needs people to remember that "public" means PUBLIC and implement basic damn rate limiting if they can't handle the traffic.

The web doesn't need to know if you're a human, a bot, or a dog. It just needs to serve bytes to whoever asks, within reasonable resource constraints. That's it. That's the open web. You'll miss it when it's gone.

replies(9): >>45068690, >>45068959, >>45069370, >>45069779, >>45069921, >>45070226, >>45070359, >>45071126, >>45071216
johncolanduoni No.45068690
Basic damn rate limiting is pretty damn exploitable. Even ignoring botnets (which is impossible), usefully rate limiting IPv6 is anything but basic. If you just pick some prefix from /48 to /64 to key your rate limits on, you'll either be exploitable by IPs from providers that hand out /48s like candy or you'll bucket a ton of mobile users together for a single rate limit.
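The trade-off described above can be sketched with a token-bucket limiter keyed on an IPv6 prefix. This is a minimal illustration, not code from any real server: the /56 compromise, the class name, and the rates are all assumptions, and both failure modes from the comment survive the choice of prefix length.

```python
import ipaddress
import time
from collections import defaultdict

PREFIX_LEN = 56  # assumed compromise between /48 and /64; both failure modes remain


class PrefixRateLimiter:
    """Token-bucket limiter keyed on an IPv6 prefix (IPv4 addresses keyed whole)."""

    def __init__(self, rate_per_sec=10, burst=20):
        self.rate = rate_per_sec
        self.burst = burst
        # bucket = [tokens remaining, timestamp of last refill]
        self.buckets = defaultdict(lambda: [burst, time.monotonic()])

    def key(self, addr: str) -> str:
        ip = ipaddress.ip_address(addr)
        if ip.version == 6:
            # Collapse the address to its routing prefix: a provider handing out
            # /48s gives one abuser 256 independent /56 buckets, while a carrier
            # putting many phones behind one /56 makes them share a single bucket.
            return str(ipaddress.ip_network(f"{addr}/{PREFIX_LEN}", strict=False))
        return str(ip)

    def allow(self, addr: str) -> bool:
        bucket = self.buckets[self.key(addr)]
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last request, capped at burst.
        bucket[0] = min(self.burst, bucket[0] + (now - bucket[1]) * self.rate)
        bucket[1] = now
        if bucket[0] >= 1:
            bucket[0] -= 1
            return True
        return False
```

Two addresses in the same /56 share one bucket, so a legitimate mobile carrier block and a single abusive host look identical to the limiter, which is exactly the objection above.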
replies(1): >>45068822
ctoth No.45068822
You make unauthenticated requests cheap enough that you don't care about volume. Reserve rate limiting for authenticated users where you have real identity. The open web survives by being genuinely free to serve, not by trying to guess who's "real."

A basic Varnish setup should get you most of the way there, no agent signing required!
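A setup along those lines might look like the following VCL sketch. The backend address is assumed, and this only shows the core idea (serve anonymous traffic from cache, pass authenticated traffic through), not a production config:

```vcl
vcl 4.0;

backend default {
    .host = "127.0.0.1";   # assumed origin server
    .port = "8080";
}

sub vcl_recv {
    # Anonymous requests are all alike: let the cache absorb the volume.
    if (!req.http.Authorization && !req.http.Cookie) {
        return (hash);
    }
    # Authenticated requests carry real identity; pass them to the origin,
    # where per-user rate limiting can apply.
    return (pass);
}

sub vcl_backend_response {
    set beresp.ttl = 60s;  # illustrative TTL for anonymous responses
}
```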

replies(3): >>45068881, >>45069206, >>45070262
Lammy No.45069206
> You make unauthenticated requests cheap enough that you don't care about volume.

In the days before mandatory TLS it was so easy to set up a Squid proxy on the edge of my network and cache every plain-HTTP resource for as long as I want.

Like yeah, yeah, sure, it sucked that ISPs could inject trackers and stuff into page contents, but I'm starting to think the downsides of mandatory TLS outweigh the upsides. We made the web more Secure at the cost of making it less Private. We got Google Analytics and all the other spyware running over TLS and simultaneously made it that much harder for any normal person to host anything online.

replies(1): >>45069596
AnthonyMouse No.45069596
You can still do that: have the caching reverse proxy at the edge of the network be the thing that terminates TLS.
replies(1): >>45069838
Lammy No.45069838
Not really. At minimum you will break all of these sites on the HSTS preload list: https://source.chromium.org/chromium/chromium/src/+/main:net...
replies(2): >>45069989, >>45079232
TheCycoONE No.45069989
Public key pinning (HPKP) was deprecated and removed from browsers, so you just need your proxy to supply a certificate that's trusted by your clients. HSTS only requires a valid HTTPS connection, not any particular certificate.
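Such an edge proxy might look like the following nginx sketch. Everything here is hypothetical: the hostname, the origin, and the cert paths are placeholders, and it assumes you've installed your own CA in the trust store of every client device on the network (which is what lets HSTS-preloaded sites keep working through it):

```nginx
# Hypothetical LAN-edge cache: terminate TLS locally, cache upstream responses.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=edge:10m max_size=1g;

server {
    listen 443 ssl;
    server_name cache.lan;                    # assumed local name
    ssl_certificate     /etc/nginx/lan.crt;   # signed by a CA your own devices trust
    ssl_certificate_key /etc/nginx/lan.key;

    location / {
        proxy_pass https://upstream.example;  # assumed origin
        proxy_cache edge;
        proxy_cache_valid 200 1h;             # illustrative cache lifetime
        proxy_ssl_server_name on;             # send SNI to the origin
    }
}
```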