Update: obviously I just skimmed this, per responses below.
> During our conversation, the Cerca team acknowledged the seriousness of these issues, expressed gratitude for the responsible disclosure, and assured me they would promptly address the vulnerabilities and inform affected users.
Well, that was the decent thing to do, and they did it. Beyond that it's their internal problem, especially since they did fix the issue, according to the article.
Engineers can be a little too open and naive. Perhaps his first contact was with the technical team, but then management and the legal team got hold of the issue and shut it down.
> Well, that was the decent thing to do, and they did it. Beyond that it's their internal problem, especially since they did fix the issue, according to the article.
They didn't inform anyone, as far as I can tell. Users, especially, need(ed) to be informed.
It's also at least good practice to let security researchers know the schedule for when it's safe to inform the public; otherwise future disclosures will be chaotic.
* Port 443 exposed to the internets. This can allow attackers to gain access to information you have. $10k fee for discovery.
* Your port 443 responds with "Server: AmazonS3" header. This can allow attackers to identify your hosting company. $10k fee for discovery.
Please remit payment and we will offer instructions for remediation.
Not clear why "the public" should be informed, either.
Ultimately they thanked the researcher and fixed the issue, job done.
https://portal.ct.gov/ag/sections/privacy/reporting-a-data-b...
I doubt this is an engineering team’s naivete meeting a rational legal team’s response. I’d guess it’s rather marketing or management naivete: the belief that sticking your head in the sand is the correct way to deal with a potential data-leak story.
Then you have no duty to report the vuln to the company and instead should feel free to disclose it to the world.
A little politeness goes a long way on both sides.
Because it's the law in some states now.
Furthermore, mandated reporting requirements are how you keep companies from making stupid security decisions in the first place. Mishandling data this way should be a business-ending event.
They're the ones who wrote an API with extremely basic security flaws and didn't know until someone came and told them. Let's be honest: they have _no_ idea if anyone's data was breached. Users should know so they can be extra cautious; the data in question can ruin lives.
> Not clear why "the public" should be informed, either.
The public will be informed regardless; why would the security researcher keep quiet? They also _should_ know, because it's important information for anyone considering trusting that company with sensitive data.
> Ultimately they thanked the researcher and fixed the issue, job done.
Hard disagree. It's not the worst possible response, but it's not a good one, and the job wasn't done.