
253 points akyuu | 16 comments
BinaryIgor No.45945045
I wonder why it is that we've seen an increase in these automated scrapers and attacks of late (the past few years). Is there better (open-source?) technology that enables it? Is it because hosting infrastructure has gotten cheaper for attackers too? Both? Something else?

Maybe the long-term solution to such attacks is to hide most of the internet behind some kind of Proof of Work system/network, so that mostly humans, not machines, get to access our websites.
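The Proof of Work idea above is essentially hashcash: the server hands out a cheap-to-verify challenge that is expensive to solve, so bulk automated access costs real CPU time. A minimal sketch (all names and the difficulty setting are illustrative, not any real system's API):

```python
import hashlib
import itertools
import os

def make_challenge() -> str:
    """Server issues a random challenge string per visitor."""
    return os.urandom(8).hex()

def solve(challenge: str, difficulty_bits: int = 16) -> int:
    """Client brute-forces a nonce whose SHA-256 hash has the
    required number of leading zero bits (~2**difficulty_bits tries)."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(challenge: str, nonce: int, difficulty_bits: int = 16) -> bool:
    """Server verifies with a single hash what cost the client thousands."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

challenge = make_challenge()
nonce = solve(challenge)
assert verify(challenge, nonce)
```

The asymmetry (many hashes to solve, one to verify) is what makes this a per-request toll on scrapers while staying nearly free for the site, though as noted downthread it taxes legitimate crawlers just as much.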

replies(6): >>45945393 #>>45945467 #>>45945584 #>>45945643 #>>45945917 #>>45945959 #
1. marginalia_nu No.45945467
What's missing is effective international law enforcement. This is a legal problem first and foremost. As long as it's as easy as it is to get away with this stuff by just routing the traffic through a Russian or Singaporean node, it's going to keep happening. With international diplomacy going the way it has been, odds of that changing aren't fantastic.

The web is really stuck between a rock and a hard place when it comes to this. Proof of work helps website owners, but makes life harder for all discovery tools and search engines.

An independent standard for request signing, plus some sort of reputation database for verified crawlers, could be part of a solution, though that causes problems with websites feeding crawlers different content than users, and does nothing to fix the Sybil attack problem.
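The request-signing idea amounts to binding a registered crawler identity to each request so a site can check it against a reputation database. A toy sketch, assuming a shared secret registered with a hypothetical crawler registry (a real standard, e.g. along the lines of HTTP Message Signatures, would use asymmetric keys so sites never hold the secret):

```python
import hmac
import hashlib
from datetime import datetime, timezone

# Hypothetical identity registered with a crawler-reputation registry.
CRAWLER_ID = "examplebot"
SECRET = b"registered-crawler-secret"

def sign_request(method: str, path: str, secret: bytes) -> dict:
    """Produce headers binding identity, time, and the exact request line."""
    date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    payload = f"{method}\n{path}\n{date}".encode()
    sig = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return {"Crawler-Id": CRAWLER_ID, "Date": date, "Signature": sig}

def verify_request(method: str, path: str, headers: dict, secret: bytes) -> bool:
    """Site recomputes the signature for the request it actually received."""
    payload = f"{method}\n{path}\n{headers['Date']}".encode()
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, headers["Signature"])
```

Because the signature covers method, path, and date, a replayed or forged request fails verification; but as the comment notes, nothing here stops a Sybil attacker from registering many fresh identities.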

replies(4): >>45945725 #>>45945809 #>>45945986 #>>45946661 #
2. luckylion No.45945725
It's not necessarily routed through a Russian or Singaporean node, though; on the sites I'm responsible for, AWS, GCP, and Azure are in the top 5 sources of attacks. It's just that they don't care _at all_ that it's happening.

I don't think you need worldwide law enforcement; it would be a big step forward just to make owners and operators liable. You can cap the exposure so nobody gets absolutely ruined, but anyone running WordPress 4.2 and getting their VPS abused for attacks currently has zero incentive to change anything unless their website goes down. Give them a penalty of a few hundred dollars and suddenly they do. To keep things simple, collect from the hosters; they can then charge their customers, and suddenly the hosters will be interested as well, because they don't want to deal with that.

The criminals are not held liable, and neither are their enablers. There's very little chance anything will change that way.

replies(1): >>45946156 #
3. Aurornis No.45945809
> What's missing is effective international law enforcement.

International law enforcement on the Internet would also subject you to the laws of other countries. It goes both ways.

Having to comply with all of the speech laws and restrictions in other countries is not actually something you want.

replies(2): >>45945922 #>>45946229 #
4. ocdtrekkie No.45945922
This is already kind of true for every global website. The idea of a single global internet is one of those fairy-tale fantasies that maybe existed for a little while before enough people used it. In many cases it isn't really ideal today.
5. armchairhacker No.45945986
I don’t think this can be solved legally without compromising anonymity. You can block unrecognized clients and punish the owners of clients that behave badly, but then, for example, an oppressive government can (physically) take over a subversive website and punish everyone who accesses it.

Maybe pseudonymity and “punishment” via reputation could work. Then an oppressive government with access to a subversive website (ignoring bad security, coordination with other hijacked sites, etc.) can only poison its clients’ reputations, and (if reputation is tied to sites, which have their own reputations) only temporarily.
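One way to read "reputation tied to sites, only temporarily" is: a client's score is the sum of vouches from sites, weighted by each site's own reputation and decayed over time, so a hijacked low-reputation site can only dent the score, and even that fades. A sketch under those assumptions (site names, scores, and the half-life are all made up for illustration):

```python
import time

# Hypothetical site reputations in [0, 1]; a hijacked site would be downrated.
SITE_REPUTATION = {"site-a": 0.9, "site-b": 0.7, "hijacked": 0.2}
HALF_LIFE = 30 * 24 * 3600  # vouches lose half their weight every 30 days

def client_reputation(vouches, now=None):
    """vouches: list of (site_id, score in [-1, 1], unix timestamp).
    Each vouch is weighted by the vouching site's reputation and
    exponentially decayed, so poisoned vouches fade out on their own."""
    now = now if now is not None else time.time()
    total = 0.0
    for site, score, ts in vouches:
        decay = 0.5 ** ((now - ts) / HALF_LIFE)
        total += SITE_REPUTATION.get(site, 0.0) * score * decay
    return total
```

With these weights, a negative vouch from the low-reputation "hijacked" site cannot outweigh a single positive vouch from a trusted site, and after a few half-lives it contributes essentially nothing.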

replies(1): >>45946200 #
6. mrweasel No.45946156
The big cloud providers need to step up and take responsibility. I understand that it can't be too easy to do, but we really do need a way to contact e.g. AWS and tell them to shut off a customer. I have no problem with someone scraping our websites, but I do care when they don't do so responsibly: slow down when we start responding more slowly; don't assume you can just go full throttle, crash our site, wait, and then do it again once we start responding again.
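The "slow down when we start responding slower" behavior asked for here is easy for a scraper to implement: track a smoothed baseline latency and stretch the inter-request delay whenever responses fall behind it. A minimal sketch (class name and tuning constants are invented for illustration):

```python
class PoliteThrottle:
    """Scale the delay between requests with observed server latency:
    the slower the server responds relative to its baseline, the
    longer the scraper waits before the next request."""

    def __init__(self, base_delay: float = 1.0, factor: float = 2.0):
        self.base_delay = base_delay   # seconds between requests when healthy
        self.factor = factor           # how aggressively to back off
        self.baseline = None           # exponentially smoothed latency

    def record(self, latency: float) -> float:
        """Feed in the last response's latency; returns seconds to sleep."""
        if self.baseline is None:
            self.baseline = latency
        else:
            self.baseline = 0.9 * self.baseline + 0.1 * latency
        # Back off superlinearly when the server is slower than its baseline.
        slowdown = max(1.0, latency / self.baseline)
        return self.base_delay * (slowdown ** self.factor)
```

In use, the scraper times each request and sleeps for `record(latency)` before the next one, so a struggling server automatically gets breathing room instead of a sustained hammering.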

You're absolutely right: AWS, GCP, Azure and the others do not care, and AWS and GCP especially are massive enablers.

replies(1): >>45946560 #
7. ajuc No.45946200
> but then, for example, an oppressive government can (physically) take over a subversive website and punish everyone who accesses it.

Already happens. Oppressive governments already punish people for visiting "wrong" websites. They already censor the internet.

There are no technological solutions to coordination problems. Ultimately, no matter what you invent, it's politics that will decide how it's used and by whom.

8. marginalia_nu No.45946229
We have historically solved this via treaties.

If you want to trade with me, a country that exports software, let's agree to both criminalize software piracy.

No reason why this can't be extended to DDoS attacks.

replies(1): >>45947147 #
9. ctoth No.45946560
> we really do need a way to contact e.g. AWS and tell them to shut off a customer.

You realize you just described the infrastructure for far worse abuse than a misconfigured scraper, right?

replies(1): >>45947001 #
10. BinaryIgor No.45946661
Good points; I would definitely vouch for an independent standard for request signing plus some kind of decentralized reputation system. With international law enforcement, I think there would be too many political issues for it not to become corrupt.
11. mrweasel No.45947001
I'm very aware of that, yes. There needs to be a good process; the current situation, where AWS simply does not care, or doesn't know, isn't particularly good either. One solution could be for victims to notify AWS that a number of specified IPs are generating an excessive amount of traffic. An operator could then verify against AWS traffic logs and notify the customer that they are causing issues, and only after a failure to respond could the customer be shut down.

You're not wrong that abuse would be a massive issue, but I'm on the other side of this and need Amazon to do something, anything.

12. beeflet No.45947147
I don't want governments to have this level of control over the internet. It seems like you are paving over a technological problem with the way the internet is designed by giving some institution a ton of power over the internet.
replies(1): >>45948248 #
13. marginalia_nu No.45948248
The alternative to governments stopping misbehavior is every website hiding behind Cloudflare or a small number of competitors, which is a situation that is far more susceptible to abuse than having a law that says you can't DDoS people even if you live in Singapore.

It really can not be overstated how unsustainable the status quo is.

replies(1): >>45949871 #
14. beeflet No.45949871
I think the alternative is to rebuild the internet on more p2p-friendly infrastructure. BitTorrent does not have this same DDoS problem, and mesh networks are designed with Sybil resistance in mind.
replies(1): >>45952200 #
15. marginalia_nu No.45952200
The internet already is p2p infrastructure.

BitTorrent is just as susceptible to this, it's just there's currently no economic incentive to try to exhaustively scrape it from 50,000 VPS nodes.

replies(1): >>45958930 #
16. beeflet No.45958930
>The internet already is p2p infrastructure.

No, it really isn't, unless you mean at the BGP level. It's "p2p" only in the sense that you have to trust every party not to break the system. Like email or Mastodon, it doesn't solve the fundamental Sybil problem at hand.

>BitTorrent is just as susceptible to this,

In BitTorrent, content is hosted by ad-hoc users whose numbers are roughly proportional to the number of downloaders. It is not unimaginable that you could staple a reputation system on top of it, as private trackers already do.