
253 points by akyuu | 1 comment
BinaryIgor (No.45945045)
I wonder why we've seen an increase in these automated scrapers and attacks over the last few years. Is there better (open-source?) technology that enables them? Is it that hosting infrastructure has become cheaper for the attackers as well? Both? Something else?

Maybe the long-term solution for such attacks is to hide most of the internet behind some kind of Proof of Work system/network, so that it's mostly humans who get to access our websites, not machines.
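
For illustration, here is a minimal hashcash-style sketch of such a proof-of-work gate: the server issues a random challenge, the client burns CPU finding a nonce whose hash clears a difficulty target, and the server verifies with a single hash. Everything here (function names, the SHA-256 construction, the 20-bit difficulty) is an assumption made for the sketch, not any existing system's protocol.

```python
# Hashcash-style proof-of-work sketch (illustrative only).
import hashlib
import secrets

DIFFICULTY_BITS = 20  # ~1M hash attempts on average; tune per deployment


def make_challenge() -> str:
    """Server side: issue a random challenge token."""
    return secrets.token_hex(16)


def leading_zero_bits(digest: bytes) -> int:
    """Count leading zero bits in a hash digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()
        break
    return bits


def solve(challenge: str) -> int:
    """Client side: brute-force a nonce that satisfies the difficulty target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if leading_zero_bits(digest) >= DIFFICULTY_BITS:
            return nonce
        nonce += 1


def verify(challenge: str, nonce: int) -> bool:
    """Server side: a single hash check, cheap regardless of difficulty."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return leading_zero_bits(digest) >= DIFFICULTY_BITS


if __name__ == "__main__":
    challenge = make_challenge()
    nonce = solve(challenge)          # expensive for the requester
    print(verify(challenge, nonce))   # cheap for the site -> True
```

The asymmetry is the point: verification stays cheap for the site, while anyone scraping at scale has to pay the solving cost on every request.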

marginalia_nu (No.45945467)
What's missing is effective international law enforcement. This is a legal problem first and foremost. As long as it's as easy as it is to get away with this stuff by just routing the traffic through a Russian or Singaporean node, it's going to keep happening. With international diplomacy going the way it has been, odds of that changing aren't fantastic.

The web is really stuck between a rock and a hard place when it comes to this. Proof of work helps website owners, but makes life harder for all discovery tools and search engines.

An independent standard for request signing, plus some sort of reputation database for verified crawlers, could be part of a solution, though that causes problems with websites feeding crawlers different content than users, and does nothing to fix the Sybil attack problem.
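
As a rough illustration of the request-signing idea: the crawler signs a canonical string covering the request with an Ed25519 key, sends a key ID alongside the signature, and the site checks it against a public key looked up in some reputation registry. This is only a sketch loosely in the spirit of HTTP Message Signatures; the key ID, the canonical string layout, and the in-memory "registry" are assumptions, and it relies on the Python `cryptography` package.

```python
# Sketch of signed crawler requests verified against a reputation registry.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def canonical_string(method: str, path: str, host: str, date: str) -> bytes:
    """Build the exact byte string both sides agree to sign/verify."""
    return f"{method} {path}\nhost: {host}\ndate: {date}".encode()


# --- Crawler side -----------------------------------------------------------
crawler_key = Ed25519PrivateKey.generate()
key_id = "examplebot-2025"  # hypothetical identifier published by the crawler
message = canonical_string("GET", "/robots.txt", "example.com",
                           "Tue, 18 Nov 2025 10:00:00 GMT")
signature = crawler_key.sign(message)
# The crawler would send `key_id` and `signature` as request headers.

# --- Site side --------------------------------------------------------------
# Hypothetical reputation registry: key ID -> public key (plus, in practice,
# a reputation score and crawl policy).
registry = {key_id: crawler_key.public_key()}


def verify_request(key_id: str, signature: bytes, message: bytes) -> bool:
    """Accept the request only if a known key verifiably signed it."""
    public_key = registry.get(key_id)
    if public_key is None:
        return False  # unknown crawler: treat as unverified traffic
    try:
        public_key.verify(signature, message)
        return True
    except InvalidSignature:
        return False


print(verify_request(key_id, signature, message))  # True
```

The Sybil problem mentioned above is untouched by this, of course: anyone can mint a fresh key pair, so the value lies entirely in the reputation attached to a key, not in the signature itself.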

BinaryIgor (No.45946661)
Good points; I would definitely vouch for an independent standard for request signing plus some kind of decentralized reputation system. With international law enforcement, I think there could be too many political issues for it not to become corrupt.