
81 points | impish9208 | 10 comments
1. hn_throwaway_99 No.41916731
It's amusing to me how the economic and cultural incentives at so many companies are to lie as much as possible in breach disclosures while pretending that you're still technically telling the truth.

I think that in all of these cases it would have been no worse for the companies in question if they had just sent out a dry, "just the facts, ma'am" report of what actually happened, without any of the BS "the security of our customer data is our primary priority!" statements that always accompany these kinds of breach disclosures. E.g. something like:

On <date>, due to a vulnerability in the third-party vendor SolarWinds, which provides network security services for us, we detected the following breaches of customer data:

1. xxx

2. yyy

The steps we are currently taking, and what you should do: zzz.

----

Perhaps one good thing that could come out of this is some sort of "standard" format for breach disclosures (think the "Nutrition Facts" labels on food boxes in the US). As it stands, whenever I see a company trying to minimize a breach disclosure, I just assume they're bullshitting.
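To make the "Nutrition Facts" analogy concrete, here is a rough sketch of what a standardized, machine-readable disclosure record could look like. This is purely illustrative: the field names are hypothetical, they don't correspond to any existing standard, and the placeholder values are made up.

    # Hypothetical "nutrition facts" schema for a breach disclosure.
    # Field names and values are illustrative only; no such standard exists today.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class BreachDisclosure:
        detected_on: date                     # when the breach was detected
        root_cause: str                       # e.g. "compromised third-party vendor"
        systems_affected: list[str]           # which products/services were hit
        data_exposed: list[str]               # categories of customer data exposed
        records_affected_upper_bound: int     # err on the side of overstating
        remediation_steps: list[str]          # what the company is doing
        customer_actions: list[str]           # what customers should do

    disclosure = BreachDisclosure(
        detected_on=date(2024, 1, 1),
        root_cause="supply chain compromise of a third-party vendor",
        systems_affected=["source code repositories"],
        data_exposed=["none confirmed"],
        records_affected_upper_bound=0,
        remediation_steps=["rotated credentials", "audited repository access"],
        customer_actions=["no action required"],
    )
    print(disclosure)

The point isn't the exact fields; it's that a fixed, comparable structure leaves much less room for the vague "a limited number of repositories" phrasing discussed further down the thread.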

replies(3): >>41918007 #>>41918463 #>>41918586 #
2. kmeisthax No.41918007
If companies were mere profit-seeking entities, these breach notices would be minimally disruptive to the business. Most people do not immediately jump ship just because a breach happened.

But most companies are not just that. They're barely-legal Ponzi schemes. The board and their appointed CxOs are selected specifically on the basis of how much they can get the stock price up. This results in companies making lots of terribly short-sighted decisions.

In the specific case of breach disclosures, any bad news about a company tends to create uncertainty, which makes short-term investors and speculators close their positions, which drops the price. This drop tends to be short-term, but it imperils the liquidity of the investment, and liquid investments tend to be more valuable, so...

replies(2): >>41918660 #>>41921196 #
3. JumpCrisscross No.41918463
> in all of these cases it would have been no worse for the companies in question if they just sent out a dry, "just the facts, ma'am" report of what actually happened

This assumes there is someone on staff capable of writing a no-nonsense diagnosis.

replies(1): >>41918691 #
4. SpicyLemonZest No.41918586
I'm sympathetic, but I feel like the order against Mimecast illustrates a big part of the problem here. This seems to me like a pretty detailed disclosure:

> The investigation revealed that the threat actor accessed and downloaded a limited number of our source code repositories, as the threat actor is reported to have done with other victims of the SolarWinds Orion supply chain attack. We believe that the source code downloaded by the threat actor was incomplete and would be insufficient to build and run any aspect of the Mimecast service. We found no evidence that the threat actor made any modifications to our source code nor do we believe that there was any impact on our products. We will continue to analyze and monitor our source code to protect against potential misuse.

But the SEC feels this was misleading, because they did not specify which source code repositories were targeted or what percentage of the code in those repositories was exfiltrated. That's the dynamic that drives these kinds of disclosures: oversharing drives demands for even more absurd levels of oversharing. They had to go calculate that precisely 76% of their M365 interop code was exfiltrated - is that information worth the cost of producing it, or even valuable to anyone in any way?

replies(2): >>41919211 #>>41919270 #
5. TeMPOraL No.41918660
Thanks, that does explain a long-standing conundrum I had. Having worked for a cybersec/GRC startup in the past[0], I got a good look at how risks and their impacts are categorized, but I still couldn't figure out why anyone cares.

Like, "reputational damage", obviously nobody cares if a company gets breached - 99% of the customers won't notice, 99% of the remaining won't understand it, and the competitors are probably just as much at risk; all you need to do is issue some PR note and maybe offer free credit monitoring (some US peculiarity), and you're done. Same for most other things leading to "reputational damage". It feels like it's obviously a loss of $nothing, so why do CFOs and CISOs seem to put so much interest in this impact category?

Well, I haven't thought about stock prices, and their lack of correlation with customer experience. My bad.

--

[0] - I suppose I had everything I needed to figure it out; somehow I didn't connect the dots. And/or I was too busy trying to ensure our fancy probability math wasn't bullshit to pay attention to the larger context.

6. TeMPOraL No.41918691
Sure there is. The person writing the release gets fed some internal bullet points or summaries as source material; that material is strictly less bullshit than the resulting official press release.
7. Veserv No.41919211
You do not need to say precisely 76%. Nobody is going to complain if you spend fewer resources to get a looser upper bound like 80%. Hell, you can make it easy for yourself and just say 100%; that costs nothing and is guaranteed not to understate the customer impact. The problem is deceptively implying less customer impact.

But no company will deliberately overstate the customer impact - think of what it would do to their bottom line. They would much rather spend a bunch of money to minimize overstating. Exactly.

"If only they were allowed to understate customer impact, then they could harvest even more of that reputational arbitrage" is not a very compelling justification.

8. notatoad No.41919270
>is that information worth the cost of producing it, or even valuable to anyone in any way?

It's valuable to the SEC, because they're the ones tasked with enforcing these rules, and specifics are what allow for enforcement. If you publish an actual percentage, then they can ding you for lying if the percentage turns out to be wrong. Being vague isn't misleading on its own, but it can be used to mislead.

If they actually know what was exfiltrated, then putting specifics in the disclosure should be a trivial matter. Maybe not a percentage of lines in the codebase, but you've got to give the SEC enough that they could potentially check it and determine whether it was a lie. And "a limited number" isn't specific enough to do that.

replies(1): >>41919903 #
9. SpicyLemonZest No.41919903
I don't agree that the Securities and Exchange Commission is tasked with enforcing good cybersecurity disclosures. You'll note that this settlement formally concerns the companies' statements to investors, although I agree with the implicit assumption a lot of people upthread are making: that the charges would not have been filed if the customer disclosures had been adequate.
10. gruez No.41921196
>But most companies are not just that. They're barely-legal Ponzi schemes. The board and their appointed CxOs are selected specifically on the basis of how much they can get the stock price up. This results in companies making lots of terribly short-sighted decisions.

"Most companies are ponzi schemes focused on short term stock price appreciation" is a criticism that has been around for decades. If that's really the case, the performance of the s&p 500 shows that it's either false, or a really long con that somehow still hasn't collapsed yet.

A far more straightforward explanation is that CEOs don't like delivering bad news, especially news of things that happened on their watch, so they try to bury it. Covering up mistakes is something even kids do. There's no need to invoke "most companies are [...] barely-legal Ponzi schemes".