    561 points by bearsyankees | 15 comments
    1. xutopia
    That's crazy to not have responded to his repeated requests!
    2. benzible
    As someone managing a relatively low-profile SaaS app, I get constant reports from "security researchers" who just ran automated vulnerability scanners and are seeking bounties on minor issues. That said, it's inexcusable - they absolutely need to take these reports seriously and distinguish between scanner spam and legitimate security research like this.

    Update: obviously I just skimmed this, per responses below.

    3. moonlet
    Not really, if they don’t have any security or even DevSecOps staff yet. If they just have devs, and those devs are relatively junior or just out of school, I could unfortunately absolutely see this happening.
    4. bee_rider
    It sounds like they actually met with him, patched the issues, and then didn’t respond afterwards. IMO that is quite rude of them toward him, but they do seem to have taken the issue itself somewhat seriously.
    5. sshine
    They already met with him and acknowledged the problem, so their lack of follow-up is an attempt to sweep things under the rug. Users deserve to know that their data was compromised. In some parts of the world it is a crime not to report a data leak.
    6. mytailorisrich
    A company has no duty to report back to you just because you kindly notified them of a vulnerability in their software.

    > During our conversation, the Cerca team acknowledged the seriousness of these issues, expressed gratitude for the responsible disclosure, and assured me they would promptly address the vulnerabilities and inform affected users.

    Well, that was the decent thing to do and they did it. Beyond that it is their internal problem, especially since, according to the article, they did fix the issue.

    Engineers can be a little too open and naive. Perhaps his first contact was with the technical team, but then management and the legal team got hold of the issue and shut it down.

    7. benzible
    Ah, sorry, I need to actually read things before I react :)
    8. kadoban
    > > During our conversation, the Cerca team acknowledged the seriousness of these issues, expressed gratitude for the responsible disclosure, and assured me they would promptly address the vulnerabilities and inform affected users.

    > Well, that was the decent thing to do and they did it. Beyond that it is their internal problem, especially since, according to the article, they did fix the issue.

    They didn't inform anyone, as far as I can tell. Users especially need(ed) to be informed.

    It's also at least good practice to let security researchers know the schedule for when it's safe to inform the public; otherwise, future disclosures will be chaotic.

    9. nick238
    Pardon sir, I see you have:

    * Port 443 exposed to the internets. This can allow attackers to gain access to information you have. $10k fee for discovery.

    * Your port 443 responds with "Server: AmazonS3" header. This can allow attackers to identify your hosting company. $10k fee for discovery.

    Please remit payment and we will offer instructions for remediation.
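
    A "report" like the one parodied above takes only a few lines of code to generate, which is what makes scanner spam so cheap to send. Here is a minimal sketch in Python, standard library only; the host is a placeholder and nothing here is any real scanner's actual code:

      # Hypothetical sketch of an automated "scan": an open-port check plus a
      # peek at the Server response header. Neither is a real vulnerability
      # finding; this only illustrates how low-effort such reports are.
      import socket
      import http.client

      def scan(host: str) -> None:
          # "Finding" 1: port 443 accepts connections, i.e. the site serves HTTPS.
          with socket.create_connection((host, 443), timeout=5):
              print(f"* Port 443 exposed to the internet on {host}")

          # "Finding" 2: the Server header may name the hosting provider.
          conn = http.client.HTTPSConnection(host, timeout=5)
          conn.request("HEAD", "/")
          server = conn.getresponse().getheader("Server", "unknown")
          conn.close()
          print(f"* Server header reveals hosting: {server}")

      scan("example.com")  # placeholder host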

    10. mytailorisrich
    Companies won't inform of vulnerabilities. They may/should inform users if they think their data was breached, which is different.

    Not clear why "the public" should be informed, either.

    Ultimately they thanked the researcher and fixed the issue, job done.

    11. sakjur
    Taking Yale as a starting point, they seem to have failed their legal obligation to inform their Connecticut users within 60 days (assuming the author of the post would’ve received a copy of such a notification).

    https://portal.ct.gov/ag/sections/privacy/reporting-a-data-b...

    I doubt this is an engineering team’s naivete meeting a rational legal team’s response. I’d guess it’s rather marketing or management naivete, assuming that sticking your head in the sand is the correct way to deal with a potential data leak story.

    12. pixl97
    > A company has no duty to report back to you just because you kindly notified them of a vulnerability in their software.

    Then you have no duty to report the vuln to the company and instead should feel free to disclose it to the world.

    A little politeness goes a long ways on both sides.

    13. pixl97
    > Not clear why "the public" should be informed, either.

    Because it's the law in some states now.

    Furthermore, mandated reporting requirements are how you keep companies from making stupid security decisions in the first place. Mishandling data this way should be a business-ending event.

    14. autoexec
    Instead it seems like business as usual. Without laws with teeth sharp enough to hurt, it'll just continue to be like this.
    15. kadoban
    > Companies won't inform of vulnerabilities. They may/should inform users if they think their data was breached, which is different.

    This is the company that wrote an API with extremely basic security flaws and didn't know it until someone came and told them. Let's be honest: they have _no_ idea whether anyone's data was breached. Users should know so they can be extra cautious; the data in question can ruin lives.

    > Not clear why "the public" should be informed, either.

    The public will be informed because why would the security researcher keep quiet? They also _should_ know because it's important information for someone considering trusting that company with sensitive information.

    > Ultimately they thanked the researcher and fixed the issue, job done.

    Hard disagree. It's not the worst possible response, but it's not good, and the job isn't done.