
545 points | mmh0000 | 4 comments
jchw (No.43572243)
I'm rooting for Ladybird to gain traction in the future. Currently, it is using cURL proper for networking. That is probably going to have some challenges (I think cURL is still limited in some ways, e.g. I don't think it can do WebSockets over h2 yet) but on the other hand, having a rising browser engine might eventually remove this avenue for fingerprinting since legitimate traffic will have the same fingerprint as stock cURL.
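For context on the fingerprinting avenue being discussed: one common server-side technique is hashing TLS ClientHello fields into a JA3-style digest, so stock cURL and a cURL-based browser engine would present the same value. A minimal sketch of how such a digest is built; the field values below are made up for illustration, not a real cURL capture:

```python
import hashlib

def ja3_digest(version, ciphers, extensions, curves, point_formats):
    """Build a JA3-style fingerprint string from ClientHello fields
    and hash it. Clients with identical TLS stacks produce identical
    digests, which is what makes this usable for fingerprinting."""
    ja3_string = ",".join([
        str(version),
        "-".join(str(c) for c in ciphers),
        "-".join(str(e) for e in extensions),
        "-".join(str(c) for c in curves),
        "-".join(str(p) for p in point_formats),
    ])
    return hashlib.md5(ja3_string.encode()).hexdigest()

# Hypothetical ClientHello field values for a curl-like client.
fp = ja3_digest(771, [4866, 4867, 4865], [0, 11, 10], [29, 23, 24], [0])
print(fp)  # a 32-character hex digest
```

If Ladybird ships cURL's networking stack unchanged, its digest under a scheme like this matches stock cURL's, which is the commenter's point about the avenue disappearing.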
userbinator (No.43576912)
> but on the other hand, having a rising browser engine might eventually remove this avenue for fingerprinting

If what I've seen from CloudFlare et al. is any indication, it's the exact opposite: the amount of fingerprinting and "exploitation" of implementation-defined behaviour has increased significantly in the past few months, likely in an attempt to kill off other browser engines; the incumbents do not like competition at all.

The enemy has been trying to spin it as "AI bots DDoSing", but one wonders how much of that was their own doing...

hansvm (No.43577273)
Hold up, one of those things is not like the other. Are we really blaming webmasters for 100x increases in costs from a huge wave of poorly written and maliciously aggressive bots?
jillyboel (No.43584557)
Your costs only went up 100x if you built your site poorly
hansvm (No.43587562)
I'll bite. How do you serve 100x the traffic without 100x the costs? It costs something like 1e-10 dollars to serve a recipe page with a few photos, for example. If you serve it 100x more times, how does that not scale up?
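Taking the commenter's own per-serve figure at face value, the argument is simple linear scaling; costs billed per request or per byte multiply with traffic unless something absorbs the load. A quick illustration (the baseline request volume is invented for the example; the per-serve cost is the commenter's figure):

```python
# All numbers here are illustrative.
cost_per_serve = 1e-10           # dollars per page serve (commenter's estimate)
baseline_requests = 10_000_000   # hypothetical monthly request count
multiplier = 100                 # the "100x the traffic" scenario

baseline_cost = cost_per_serve * baseline_requests
surged_cost = cost_per_serve * baseline_requests * multiplier

# Per-request billing scales linearly: the cost ratio equals the
# traffic multiplier.
print(f"baseline: ${baseline_cost:.4f}/mo, surged: ${surged_cost:.4f}/mo")
```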
jillyboel (No.43588274)
It might scale up, but if you're anywhere near efficient you're way overprovisioned to begin with. The compute cost should be minuscule due to caching, and bandwidth is cheap if you're not with one of the big clouds. As an example, according to dang, HN runs on a single server, and yet many websites that get posted to HN, and thus receive a fraction of its traffic, go down under the load.
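The caching point can be sketched in a few lines: if rendered pages are memoized, a 100x surge of hits on the same pages costs roughly one render each, not one hundred. A toy illustration, using Python's `functools.lru_cache` as a stand-in for a real page cache (the recipe page and render function are hypothetical):

```python
from functools import lru_cache

@lru_cache(maxsize=1024)
def render_recipe_page(slug: str) -> str:
    """Pretend-expensive render (imagine DB queries plus templating).
    With the cache, the body runs once per slug; repeat hits are
    just dictionary lookups."""
    return f"<html><body><h1>Recipe: {slug}</h1></body></html>"

# 100x the traffic to the same page...
for _ in range(100):
    page = render_recipe_page("banana-bread")

info = render_recipe_page.cache_info()
print(info.hits, info.misses)  # 99 1 — only the first request paid the render cost
```

This is why the compute side can stay flat while request counts multiply; bandwidth, of course, still scales with every response sent.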
immibis (No.43591692)
You got 100x the traffic if your traffic was near zero to begin with.