561 points bearsyankees | 32 comments
1. edm0nd ◴[] No.43965336[source]
> I have been met with radio silence.

That's when it's time to inform them you are dumping the vuln to the public in 90 days due to their silence.

replies(3): >>43965359 #>>43965374 #>>43965518 #
2. 9283409232 ◴[] No.43965359[source]
A good way to get yourself sued and have possible criminal charges brought against you.
replies(3): >>43965376 #>>43965385 #>>43965884 #
3. hbn ◴[] No.43965374[source]
That's more of a punishment to innocent users than to the business.
replies(3): >>43965381 #>>43965519 #>>43966199 #
4. b8 ◴[] No.43965376[source]
Which has never happened before, and if it does, the EFF would presumably back you.
replies(2): >>43965442 #>>43965504 #
5. kenjackson ◴[] No.43965381[source]
True. Maybe tell them you will be directly contacting each user to let them know that this service has exposed their personal information to hackers.
replies(1): >>43965582 #
6. edm0nd ◴[] No.43965385[source]
Most certainly not (at least in the US).

I'm so tired of researchers bringing a serious vuln to a company only to be met with silence and/or resistance, on top of the company never alerting its users about it.

7. 9283409232 ◴[] No.43965442{3}[source]
This is a completely uninformed comment. Security researchers get sued or threatened all the time. Bunnie was threatened by Microsoft for publishing his research on Xbox vulnerabilities; the city of Columbus sued David Ross for his reporting on data exposed during a ransomware attack; and Google has threatened action against a few security researchers, if memory serves. That is just what I can remember off the top of my head.
replies(4): >>43965559 #>>43965722 #>>43965731 #>>43965873 #
8. chickenzzzzu ◴[] No.43965504{3}[source]
Imagine banking your physical and financial security on a presumption that the EFF can help you XD
9. OutOfHere ◴[] No.43965518[source]
There is no vulnerability here. It's just out in the open.
replies(2): >>43966079 #>>43966128 #
10. nick238 ◴[] No.43965519[source]
Disclosure is good for the 'innocent users', as they are made aware that their data may have been leaked (who knows whether the company can do sufficient auditing and forensics to detect total scraping), rather than staying oblivious because the company just didn't bother to tell them.
replies(2): >>43966025 #>>43966247 #
11. secalex ◴[] No.43965559{4}[source]
Agreed. I've been doing this for 25+ years and personally know a dozen people who have been threatened and several who have been sued or faced potential prosecution for legitimate security research. I've experienced both situations!

That doesn't make it right, and the treatment of the researcher here was completely inappropriate, but telling young researchers to just go full disclosure without being careful about documentation, legal advice and staying within the various legal lines is itself irresponsible.

12. nick238 ◴[] No.43965582{3}[source]
I'd definitely not do that. POCing a scraper to check is fine, but you shouldn't save any PII from that data. You'd also be casting yourself as the "hacker", since you don't know whether the data has actually been revealed to anyone else without the forensics that (hopefully) only the business can do.
replies(1): >>43967496 #
13. tptacek ◴[] No.43965722{4}[source]
I've spent my entire career doing this, have been personally "threatened" several times, and until relatively recently kept track of researchers dealing with legal threats. The concern is overblown. In cases that go beyond a nastygram from a lawyer, it is almost always the fact pattern that some aggravating factor is present: a consulting agreement that initiated the testing and forecloses disclosure, or the preservation and/or publication of the PII itself, or attempts to pivot and persist access after finding a vulnerability.

It's an especially superficial argument on this story, where the underlying vulnerability has essentially already been disclosed.

replies(1): >>43965834 #
14. retrac ◴[] No.43965731{4}[source]
The government of Nova Scotia, Canada used to host its FOIA releases (similar to American freedom of information laws) on a website, with a URL along the lines of server.example.gov.ns.ca/foiadoc?=00031337

They are public and intended to be publicly accessed. A clever teenager [1] noticed -- hey, is that a sequential serial number? Well, yes it was. And so he downloaded all the FOIA documents. Well, it turns out not all of them were public. The government hosted all the FOIA documents that way, including self-disclosures (which contain sensitive information and are only released to the person the information is about). A small subset of those URLs was never intended for public release, even though they were transparently guessable.

Unauthorized access of a computer system carries up to 10 years in prison. The charges were eventually dropped [2], and I don't think a conviction was ever likely. The poor fellow still went through the whole process of being dragged out of bed by armed police.

[1] https://www.cbc.ca/news/canada/nova-scotia/freedom-of-inform...

[2] https://www.techdirt.com/2018/05/08/police-drop-charges-file...
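The enumeration pattern described above (sequential serial numbers in a document URL, often called an insecure direct object reference) can be sketched in a few lines. The base URL and the `id` parameter name here are hypothetical, modeled loosely on the comment:

```python
# Sketch of the guessable sequential-serial-number pattern: the URL shape
# and parameter name are hypothetical illustrations, not the real endpoint.
def candidate_urls(base: str, start: int, count: int, width: int = 8) -> list[str]:
    """Generate document URLs from an incrementing, zero-padded serial number."""
    return [f"{base}?id={n:0{width}d}" for n in range(start, start + count)]

# Knowing one valid URL is enough to derive the location of every other
# document, public or not:
urls = candidate_urls("https://server.example.gov.ns.ca/foiadoc", 31337, 3)
```

Because the serial number is the only access control, any one published link implicitly discloses the whole collection unless the server checks authorization per document.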

replies(2): >>43965775 #>>43966581 #
15. koakuma-chan ◴[] No.43965775{5}[source]
Why did they charge the teen and not the government of NS?
replies(1): >>43968295 #
16. secalex ◴[] No.43965834{5}[source]
What he actually did to enumerate that database, and whether he downloaded all that PII, changes the risk profile, I think.
17. tgsovlerkhgsel ◴[] No.43965873{4}[source]
Threats with the goal to prevent publication are incredibly common.

Following up on the threat is much less common, and the best way to prevent that (IMO) is to remove the motivation to do so: once the vuln is public, further threats can no longer prevent publication and would just draw more negative attention to the company, so the company has far fewer incentives to threaten you or to follow through on threats already made.

It's not a guarantee, since you can always hit a vindictive and stupid business owner, but usually publishing in response to threats isn't just the right thing to do (to discourage such attempts) but also the smart thing to do (to protect yourself).

18. Buttons840 ◴[] No.43965884[source]
Yeah. Security researchers face the threat of lawsuits constantly, while those who build insecure apps face no consequences.

We are literally sacrificing national security for the convenience of wealthy companies.

replies(1): >>43966905 #
19. maxverse ◴[] No.43966025{3}[source]
Is there any reason to not just privately email the users? "Hey, I'm so and so, a security researcher. I was able to gather your data from <Company>, which has not responded to any inquiries from me. Please be aware that your data is mismanaged and vulnerable, and I encourage you to voice your concern directly to <Company>."
replies(2): >>43966882 #>>43967667 #
20. myself248 ◴[] No.43966079[source]
Imagine if they tried to claim that. "Everything was just out on the front lawn, you can't blame us for not locking the door because we didn't even have a door!"
21. ◴[] No.43966128[source]
22. ericmcer ◴[] No.43966199[source]
This is a rare case where the leak is so egregious he could actually reach out to all the users themselves to let them know. Especially the ones with passport info.
23. kube-system ◴[] No.43966247{3}[source]
> Disclosure is good for the 'innocent users' as they are made aware that their data may have been leaked

That presumes perfect communication, which is never the case for security vulnerabilities in a consumer application.

24. uneekname ◴[] No.43966581{5}[source]
Genuine question, how could a well-formed HTTP request for a URL ever be considered unauthorized access? If I request something and someone responds...shouldn't it be their responsibility not to share important information?

Edit: should have read the linked article before commenting. It totally wasn't, and the charges were dropped...after thoroughly harassing the kid.

replies(1): >>43967505 #
25. Ajedi32 ◴[] No.43966882{4}[source]
Seems like a reasonable idea, though depending on how many users are affected that may effectively amount to going public. Also only works if the vulnerability gives you access to all customer emails, and you're willing to exploit it to get that info (which might not be a good idea legally speaking).
26. SoftTalker ◴[] No.43966905{3}[source]
Well it's kind of like "I walked around the neighborhood trying everyone's front door, I found one unlocked and I could even enter the house and rummage through their personal effects. Just trying to improve the security of the neighborhood!"
replies(2): >>43966936 #>>43967682 #
27. Buttons840 ◴[] No.43966936{4}[source]
Yes, but the house also has like 250 million people's precious possessions inside, including your own. And foreigners who are not subject to our laws are testing the door constantly. Yes, in this situation it would be like 1 honest researcher also approaching to test the door--seems fine to me.

On second thought, maybe physical buildings are not a good analogy.

28. kenjackson ◴[] No.43967496{4}[source]
Yeah. Not good practical advice on my part.
29. Alex-Programs ◴[] No.43967505{6}[source]
The mental and moral model used by programmers ("you own the backend; I own the frontend; if your backend returns stupid stuff to the frontend without me actively breaking into it, that's your fault") is not, as far as I can tell, shared by broader society.
30. yard2010 ◴[] No.43967667{4}[source]
Make it better: find a lawyer who would sue and send them the details. Out of 10k affected users you can find maybe 10 who would love to sue, and you get your bounty from the lawyer.
31. yard2010 ◴[] No.43967682{4}[source]
If you keep PII of 10k people in your house - LOCK YOUR GODDAMN DOOR
32. 9283409232 ◴[] No.43968295{6}[source]
Why did the government of Nova Scotia not charge itself?