
159 points botanica_labs | 5 comments | source
mmsc ◴[] No.45670037[source]
>after having received a lukewarm and laconic response from the HackerOne triage team.

A slight digression but lol, this is my experience with all of the bug bounty platforms. Reports of issues that are actually complicated or require an in-depth understanding of the technology get brickwalled, because reports of difficult problems are written for... people who understand difficult problems and difficult technology. The runaround isn't worth the time for people who try to solve difficult problems, because they have better things to do.

At least Cloudflare has a competent security team that can step in and say "yeah, we can look into this because we actually understand our whole technology". It's sad that to get through to a human on these platforms you effectively have to write two reports: one for the triagers who don't understand the technology at all, and one for the competent people who actually know what they're doing.

replies(5): >>45670153 #>>45670225 #>>45670462 #>>45672569 #>>45672910 #
cedws ◴[] No.45670153[source]
IMO it’s no wonder companies keep getting hacked when doing the right thing is made so painful and the rewards are so meagre. And that’s assuming the company even has a responsible disclosure program; without one, you risk putting your ass on the line just by reporting.

I don’t like bounty programs. We need Good Samaritan laws that legally protect and reward white hats. Rewards that pay the bills and not whatever big tech companies have in their couch cushions.

replies(3): >>45670437 #>>45670670 #>>45671921 #
1. lenerdenator ◴[] No.45670670[source]
> IMO it’s no wonder companies keep getting hacked when doing the right thing is made so painful and the rewards are so meagre.

Show me the incentives, and I'll show you the outcomes.

We really need to make security liabilities just that: liabilities. If you are running 20+ year-old code and you get hacked, you need to be fined in a way that will make you reconsider security as a priority.

Also, you need to be liable for all of the disruption that the security breach caused for customers. No, free credit monitoring does not count as recompense.

replies(2): >>45671704 #>>45672156 #
2. dpoloncsak ◴[] No.45671704[source]
I love this idea, but I feel like it just devolves into arguments over whether a 'specific exploit' is or isn't technically a 0-day, so the company can or can't be held liable
3. akerl_ ◴[] No.45672156[source]
Why?

Why is it inherently desirable that society penalize companies that get hacked above and beyond people choosing not to use their services, or selling off their shares, etc?

replies(1): >>45673737 #
4. lenerdenator ◴[] No.45673737[source]
Because they were placed in a position of trust and failed. Typically, the failure stems from an unwillingness to expend the resources necessary to prevent it.

It'd be one thing if these were isolated incidents, but they're not.

Furthermore, the methods you mention simply aren't effective. Our economy is now so consolidated that many markets have only a handful of participants offering goods or services, and those players often all have data and computer security issues. As for divestiture, most people don't own shares, and those who do typically don't know they hold shares in any specific company. Most shareholders in the US are retirement or pension funds, run by people who would rather make it impossible for the average person to impose real consequences for data breaches than have the company spend money on fixing the issues that allow the breaches in the first place. After all, it's "cheaper".

replies(1): >>45674122 #
5. akerl_ ◴[] No.45674122{3}[source]
I feel like this kind of justification comes up every time this topic is on HN: that the reason companies aren't being organically penalized for bad IT/infosec/privacy behavior is that the average person doesn't have leverage or alternatives.

It's never made sense to me.

I can see that being true in specific instances: many people in the US don't have great mobility for residential ISPs, or utility companies. And there's large network effects for social media platforms. But if any significant plurality of users cared about the impact of service breaches, or bad privacy policies, surely we'd see the impact somewhere in the market? We do in some related areas: Apple puts a ton of money into marketing about keeping people's data and messages private. WhatsApp does the same. But there are so many companies out there, lots of them have garbage security practices, lots of them get compromised, and I'm struggling to remember any example of a consumer company that had a breach and saw any significant impact.

To pick an example: in 2014, Home Depot had a breach of payment data. Basically everywhere that has a Home Depot also has a Lowe's and other options that sell the same stuff. In most places, if you're pissed at Home Depot for losing your card information, you can literally drive across the street to Lowe's. But it doesn't seem like that happened.

Is it possible that, outside of tech circles where we care about The Principle Of The Thing, the market is actually correct in its assessment of how much value the average consumer business gets from putting more money into security?