560 points bearsyankees | 8 comments
xutopia ◴[] No.43965126[source]
That's crazy to not have responded to his repeated requests!
replies(3): >>43965190 #>>43965227 #>>43965306 #
1. mytailorisrich ◴[] No.43965306[source]
A company has no duty to report back to you just because you kindly notified them of a vulnerability in their software.

> During our conversation, the Cerca team acknowledged the seriousness of these issues, expressed gratitude for the responsible disclosure, and assured me they would promptly address the vulnerabilities and inform affected users.

Well, that was the decent thing to do and they did it. Beyond that it is their internal problem, especially since, according to the article, they did fix the issue.

Engineers can be a little too open and naive. Perhaps his first contact was with the technical team, but then management and the legal team got hold of the issue and shut it off.

replies(2): >>43965406 #>>43967411 #
2. kadoban ◴[] No.43965406[source]
> > During our conversation, the Cerca team acknowledged the seriousness of these issues, expressed gratitude for the responsible disclosure, and assured me they would promptly address the vulnerabilities and inform affected users.

> Well that was the decent thing to do and they did it. Beyond that it is their internal problem and, especially they did fix the issue according to the article.

They didn't inform anyone, as far as I can tell. The users, especially, need(ed) to be informed.

It's also at least good practice to let security researchers know the schedule for when it's safe to inform the public; otherwise, future disclosures will be chaotic.

replies(2): >>43966051 #>>43966062 #
3. mytailorisrich ◴[] No.43966051[source]
Companies won't inform users of vulnerabilities. They may/should inform users if they think their data was breached, which is different.

Not clear why "the public" should be informed, either.

Ultimately they thanked the researcher and fixed the issue, job done.

replies(2): >>43967425 #>>43968950 #
4. sakjur ◴[] No.43966062[source]
Taking Yale as a starting point, they seem to have failed their legal obligation to inform their Connecticut users within 60 days (assuming the author of the post would’ve received a copy of such a notification).

https://portal.ct.gov/ag/sections/privacy/reporting-a-data-b...

I doubt this is an engineering team’s naivete meeting a rational legal team’s response. I’d guess it’s rather marketing or management naivete, the belief that sticking your head in the sand is the correct way to deal with a potential data leak story.

5. pixl97 ◴[] No.43967411[source]
>A company has no duty to report to you about just because you kindly notified them of a vulnerability in their software.

Then you have no duty to report the vuln to the company and instead should feel free to disclose it to the world.

A little politeness goes a long way on both sides.

6. pixl97 ◴[] No.43967425{3}[source]
>Not clear why "the public" should be informed, either.

Because it's the law in some states now.

Furthermore, mandated reporting requirements are how you keep companies from making stupid security decisions in the first place. Mishandling data this way should be a business-ending event.

replies(1): >>43967681 #
7. autoexec ◴[] No.43967681{4}[source]
Instead it seems like business as usual. Without laws with teeth sharp enough to hurt, it'll just continue to be like this.
8. kadoban ◴[] No.43968950{3}[source]
> Companies won't inform of vulnerabilities. They may/should inform users if they think their data was breached, which is different.

These are the people who wrote an API with extremely basic security flaws and didn't know until someone came and told them. Let's be honest: they have _no_ idea whether anyone's data was breached. Users should know so they can be extra cautious; the data in question can ruin lives.
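(The article doesn't spell out the flaws here, but the textbook example of an "extremely basic" API flaw is a missing object-level authorization check, i.e. an IDOR. A minimal sketch, with invented function and field names purely for illustration:)

```python
# Hypothetical sketch of an IDOR (insecure direct object reference) and its fix.
# All names and data here are invented for illustration, not taken from Cerca.

USERS = {
    1: {"name": "alice", "phone": "555-0101"},
    2: {"name": "bob", "phone": "555-0102"},
}

def get_profile_insecure(requester_id: int, target_id: int) -> dict:
    # Flawed: returns any user's private data as long as the ID exists.
    # Anyone can enumerate IDs and scrape every profile.
    return USERS[target_id]

def get_profile_secure(requester_id: int, target_id: int) -> dict:
    # Fixed: check that the requester actually owns the requested record
    # before returning it.
    if requester_id != target_id:
        raise PermissionError("not authorized")
    return USERS[target_id]
```

The point being: a flaw this simple leaves no reliable trace of who exploited it, which is exactly why "no idea if data was breached" should trigger user notification.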

> Not clear why "the public" should be informed, either.

The public will be informed because why would the security researcher keep quiet? They also _should_ know because it's important information for someone considering trusting that company with sensitive information.

> Ultimately they thanked the researcher and fixed the issue, job done.

Hard disagree. It's not the worst possible response, but it's not good, and the job isn't done.