
1125 points by CrankyBear
woodruffw ◴[] No.45891521[source]
I’m an open source maintainer, so I empathize with the sentiment that large companies appear to create work for unpaid maintainers by disclosing security issues. But “appear” is the operative word: a security issue is something that I (as the maintainer) would need to fix regardless of who reports it, or would otherwise need to accept the reputational hit that comes with not triaging security reports. That’s sometimes perfectly fine (it’s okay for projects to decide that security isn’t a priority!), but you can’t have it both ways.
replies(13): >>45891613 #>>45891749 #>>45891930 #>>45892032 #>>45892263 #>>45892941 #>>45892989 #>>45894805 #>>45896179 #>>45897077 #>>45897316 #>>45898926 #>>45900786 #
Msurrow ◴[] No.45891613[source]
My takeaway from the article was not that the report was a problem, but the change in approach from Google: that they’d disclose publicly after X days, regardless of whether the project has had a chance to fix it.

To me it’s okay to “demand” that a for-profit company (e.g. Google) fix an issue fast, because they have the resources. But to “demand” that an OSS project fix something within a certain (possibly tight) timeframe... well, I’m sure you know better than I do that that’s not how volunteering works.

replies(5): >>45891699 #>>45891755 #>>45891844 #>>45893088 #>>45898343 #
vadansky ◴[] No.45891699[source]
On the other hand, as an ffmpeg user, do you care? Are you okay with not being told that a tool you're using has a vulnerability in it because the devs don't have time to fix it? I mean, someone could already be exploiting the vulnerability regardless of what Google does.
replies(9): >>45891756 #>>45891762 #>>45891975 #>>45892022 #>>45892359 #>>45892632 #>>45893251 #>>45895054 #>>45900572 #
afiori ◴[] No.45891975[source]
This is a fantastic argument for the universe where Google does not disclose vulnerabilities until the maintainers have had reasonable time to fix them.

In this world, the user is left vulnerable because attackers can use published vulnerabilities that the maintainers are too overwhelmed to fix.

replies(3): >>45892177 #>>45892296 #>>45896986 #
toast0 ◴[] No.45892296[source]
The user is vulnerable while the problem is unfixed. Google publishing a vulnerability doesn't change the existence of the vulnerability. If Google can find it, so can others.

Making the vulnerability public makes it easier for attackers to find and exploit, but it also makes it easier for defenders to find and fix.

replies(4): >>45892602 #>>45892899 #>>45893864 #>>45895842 #
BobaFloutist ◴[] No.45893864[source]
>If Google can find it, so can others.

What a strange sentence. Google can do a lot of things that nobody else can do. The list of things that only Google, a handful of nation-states, and a handful of Google peers can do is probably even longer.

replies(2): >>45894096 #>>45896995 #
toast0 ◴[] No.45894096{3}[source]
Sure, but running a fuzzer on ancient codecs isn't that special. I can't do it, but if I wanted to learn how, codecs would be a great place to start (in fact, Google did some of its early fuzzing work on ffmpeg in 2012–2014 [1]). How many zero-interaction, high-profile attacks lately have used media decoders as the vector? How many of the Macromedia Flash vulnerabilities were in media decoders? Codecs that haven't seen any new media in decades but are still enabled in default builds are a very good place to go looking for issues.

Google does have immense scale that makes some things easier. They can test and develop congestion-control algorithms with worldwide (ex-China) coverage; only a handful of companies can do that, and nation states probably can't. But Google isn't all-powerful either: they can't make Android updates really work, even though it might be useful for them.

[1] https://security.googleblog.com/2014/01/ffmpeg-and-thousand-...
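To make "running a fuzzer on ancient codecs" concrete, here is a minimal libFuzzer-style harness sketch against libavcodec. The codec choice (Cinepak), the fixed dimensions, the input-size cap, and the build line are illustrative assumptions, not FFmpeg's actual fuzzing setup; FFmpeg's own OSS-Fuzz decoder targets are the real, per-decoder version of this idea and are far more thorough.

  /*
   * Minimal libFuzzer-style harness sketch for a single libavcodec decoder.
   * Illustrative only: codec ID, dimensions, and size cap are assumptions.
   *
   * Build sketch:
   *   clang -g -O1 -fsanitize=fuzzer,address harness.c \
   *       $(pkg-config --cflags --libs libavcodec libavutil)
   */
  #include <stdint.h>
  #include <stddef.h>
  #include <string.h>

  #include <libavcodec/avcodec.h>

  int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size) {
      if (size == 0 || size > (1 << 20))   /* skip empty or huge inputs */
          return 0;

      /* Cinepak is just one example of an old decoder in default builds. */
      const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_CINEPAK);
      if (!codec)
          return 0;

      AVCodecContext *ctx = avcodec_alloc_context3(codec);
      if (!ctx)
          return 0;

      /* Many legacy video decoders expect plausible dimensions up front. */
      ctx->width  = 320;
      ctx->height = 240;

      if (avcodec_open2(ctx, codec, NULL) < 0) {
          avcodec_free_context(&ctx);
          return 0;
      }

      AVPacket *pkt   = av_packet_alloc();
      AVFrame  *frame = av_frame_alloc();

      /* av_new_packet() allocates with the padding libavcodec requires. */
      if (pkt && frame && av_new_packet(pkt, (int)size) >= 0) {
          memcpy(pkt->data, data, size);

          /* Treat the raw fuzz input as one compressed packet, drain any
           * decoded frames, and discard them; we only care about crashes
           * and sanitizer reports. */
          if (avcodec_send_packet(ctx, pkt) >= 0) {
              while (avcodec_receive_frame(ctx, frame) >= 0)
                  av_frame_unref(frame);
          }
      }

      av_frame_free(&frame);
      av_packet_free(&pkt);
      avcodec_free_context(&ctx);
      return 0;
  }

A real harness would reuse the codec context across inputs for speed and pick the decoder from a build-time parameter, but even this naive version, pointed at a decades-old decoder, is the kind of setup that turns up the memory-safety bugs being reported here.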