The web doesn't need to know if you're a human, a bot, or a dog. It just needs to serve bytes to whoever asks, within reasonable resource constraints. That's it. That's the open web. You'll miss it when it's gone.
A basic Varnish setup should get you most of the way there, no agent signing required!
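For concreteness, a minimal sketch of such a setup (the backend address and the one-hour fallback TTL are placeholder choices, not anything Varnish prescribes):

    # /etc/varnish/default.vcl -- serve cached bytes to whoever asks
    vcl 4.1;

    backend default {
        .host = "127.0.0.1";   # your origin server (placeholder)
        .port = "8080";
    }

    sub vcl_backend_response {
        # If the origin sent no caching headers, cache for an hour anyway.
        if (beresp.ttl <= 0s) {
            set beresp.ttl = 1h;
        }
    }

Out of the box Varnish still refuses to cache responses that set cookies, or requests carrying Authorization headers, which is most of the "reasonable resource constraints" part handled for free.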
In the days before mandatory TLS, it was so easy to set up a Squid proxy on the edge of my network and cache every plain-HTTP resource for as long as I wanted.
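A couple of lines of squid.conf was all it took; something like this (the patterns and lifetimes are illustrative, from memory, not a recommendation):

    # squid.conf -- aggressively cache plain-HTTP static assets
    # refresh_pattern <regex> <min minutes> <percent> <max minutes> [options]
    refresh_pattern -i \.(jpg|png|gif|css|js)$ 1440 90% 10080 override-expire
    refresh_pattern .                          0    20% 4320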
Like yeah, yeah, sure, it sucked that ISPs could inject trackers and stuff into page contents, but I'm starting to think the downsides of mandatory TLS outweigh the upsides. We made the web more Secure at the cost of making it less Private. We got Google Analytics and all the other spyware running over TLS and simultaneously made it that much harder for any normal person to host anything online.
If you have three local machines behind a caching proxy, you might turn three requests into one, but only if all three users happen to visit the same site rather than each browsing somewhere different.
If you do the caching on the server instead, a response that requires executing PHP code and three SQL queries gets generated once; every subsequent request for the same resource just shovels the cached response back out the pipe instead of processing it again. Instead of reducing the number of requests that reach the back end by 3:1, you reduce it by a million to one.
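In Varnish terms that collapse is a few lines in one hook; a sketch, assuming the page is genuinely anonymous (the URL pattern and the five-minute TTL are made up for illustration):

    sub vcl_backend_response {
        # The PHP + three-SQL-queries page: render it once, then serve
        # the stored copy until the TTL runs out.
        if (bereq.url ~ "^/article/") {       # hypothetical URL pattern
            unset beresp.http.Set-Cookie;     # only safe if truly anonymous
            set beresp.ttl = 5m;
        }
    }

At, say, 10,000 requests a minute for a hot page, a five-minute TTL means the PHP and SQL run once per 50,000 responses served.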
And that doesn't cause any HSTS problems because a reverse proxy operated by the site owner has the real certificate in it.
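Varnish doesn't terminate TLS itself, so the usual arrangement puts a terminator such as Hitch in front of it, holding that real certificate; a sketch with placeholder paths and ports:

    # /etc/hitch/hitch.conf -- TLS termination in front of Varnish
    frontend = "[*]:443"
    backend  = "[127.0.0.1]:8443"             # Varnish's PROXY-protocol listener
    pem-file = "/etc/hitch/example.com.pem"   # the site's real cert + key
    write-proxy-v2 = on                       # pass the client IP through to Varnish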