1125 points CrankyBear | 38 comments
woodruffw ◴[] No.45891521[source]
I’m an open source maintainer, so I empathize with the sentiment that large companies appear to create work for unpaid maintainers by disclosing security issues. But “appear” is the operative word: a security issue is something that I (as the maintainer) would need to fix regardless of who reports it, or would otherwise need to accept the reputational hit that comes with not triaging security reports. That’s sometimes perfectly fine (it’s okay for projects to decide that security isn’t a priority!), but you can’t have it both ways.
replies(13): >>45891613 #>>45891749 #>>45891930 #>>45892032 #>>45892263 #>>45892941 #>>45892989 #>>45894805 #>>45896179 #>>45897077 #>>45897316 #>>45898926 #>>45900786 #
Msurrow ◴[] No.45891613[source]
My takeaway from the article was not that the report itself was a problem, but Google’s change in approach: they’d disclose publicly after X days, regardless of whether the project had had a chance to fix it.

To me it’s okay to “demand” that a for-profit company (e.g. Google) fix an issue fast, because they have the resources. But to “demand” that an OSS project fix something within a certain (possibly tight) timeframe... well, I’m sure you know better than me that that’s not how volunteering works.

replies(5): >>45891699 #>>45891755 #>>45891844 #>>45893088 #>>45898343 #
1. Lerc ◴[] No.45891755[source]
That is standard practice. It is considered irresponsible not to publicly disclose a vulnerability at all.

The X days is a concession to the developers: public disclosure is delayed to give them an opportunity to address the issue first.

replies(3): >>45891840 #>>45891994 #>>45892271 #
2. SpicyLemonZest ◴[] No.45891840[source]
The entire conflict here is that norms about what's considered responsible were developed in a different context, where vulnerability reports were generated at a much lower rate and dedicated CVE-searching teams were much less common. FFmpeg says this was "AI generated bug reports on an obscure 1990s hobby codec"; if that's accurate (I have no reason to doubt it, just no time to go check), I tend to agree that it doesn't make sense to apply the standards that were developed for vulnerabilities like "malicious PNG file crashes the computer when loaded".
replies(4): >>45891940 #>>45892158 #>>45897036 #>>45898153 #
3. adastra22 ◴[] No.45891940[source]
It is accurate. This is a codec that was added for archival and digital preservation purposes. It’s like adding a Unicode block for some obscure 4000 year old dead language that we have a scant half dozen examples of writing.
4. johneth ◴[] No.45891994[source]
> That is standard practice.

It's standard practice for commercially-sponsored software, and it doesn't necessarily fit volunteer maintained software. You can't have the same expectations.

replies(4): >>45892824 #>>45895452 #>>45895841 #>>45901193 #
5. Lerc ◴[] No.45892158[source]
I think the discussion of what standard practice should be does need to be had. This seems to be throwing blame at people following the current standard.

If the obscure codec is not included by default, or cannot be triggered by any means other than being explicitly asked for, then it would be reasonable to tag it Won't Fix. If it can be triggered by other means, such as automatic file type detection on a renamed file, then it doesn't matter how obscure the feature is; the exploit would affect everyone.

What is the alternative to a time-limited embargo? I don't particularly like the idea of groups of people sitting on exploits that they have known about for ages without public disclosure. That is the kind of information that finds its way into the wrong hands.

Of course companies should financially support the developers of the software they depend upon. Many do this for OSS in the form of having a paid employee that works on the project.

Specifically, FFmpeg seems to have a problem in that much of its resource shortage comes from alienating contributors. This isn't isolated to just this bug report.

replies(1): >>45894516 #
6. danaris ◴[] No.45892271[source]
Here's the question:

Why is Google deliberately running an AI process to find these bugs if they're just going to dump them all on the FFmpeg team to fix?

They have the option to pay someone to fix them.

They also have the option to not spend resources finding the bugs in the first place.

If they think these are so damn important to find that it's worth devoting those resources to, then they can damn well pay for fixing them too.

Or they can shut the hell up and let FFmpeg do its thing in the way that has kept it one of the https://xkcd.com/2347/ pieces of everyone's infrastructure for over 2 decades.

replies(5): >>45892391 #>>45892592 #>>45893424 #>>45895857 #>>45897059 #
7. freedomben ◴[] No.45892391[source]
I would love to see Google contribute here, but I think that's a different issue.

Are the bug reports accurate? If so, then they are contributing, just as I'd be contributing if I found the bugs and sent a report myself. Of course a PR that fixes the bug is much better than just a report, but reports have value, too.

The alternative is to leave it unfound, which is not a better alternative in my opinion. It's still there and potentially exploitable even when unreported.

replies(2): >>45893120 #>>45896220 #
8. SR2Z ◴[] No.45892592[source]
Google is a significant contributor to ffmpeg by way of VP9/AV1/AV2. It's not like it's a gaping maw of open-source abuse; the company generally provides real value to the OSS ecosystem at an even lower level than ffmpeg (which is saying a lot, since ffmpeg is pretty in-the-weeds already).

As to why they bother finding these bugs... it's because that's how Google does things. You don't wait for something to break or be exploited; you load your compiler up with sanitizers and go hunting for bugs.

Yeah this one is kind of trivial, but if the bug-finding infrastructure is already set up it would be even more stupid if Google just sat on it.
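That workflow, pointing a fuzzer plus sanitizers at a parser and waiting for crashes, can be sketched in miniature. The Python below is purely illustrative: a made-up decoder with a planted length-check bug and a blind random-input loop, nothing like the coverage-guided engines Google actually runs.

```python
import random

def toy_decoder(data: bytes) -> int:
    """A made-up 'codec': the first byte declares the payload length.

    The planted bug: a declared length larger than the actual payload
    raises instead of being rejected gracefully, standing in for the
    out-of-bounds reads that sanitizers catch in real C codecs.
    """
    if not data:
        return 0
    n = data[0]
    payload = data[1:]
    if n > len(payload):
        raise IndexError("declared length exceeds payload")
    return sum(payload[:n])

def fuzz(iterations: int = 10_000, seed: int = 0):
    """Throw random inputs at the decoder and collect crashing cases."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(iterations):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(8)))
        try:
            toy_decoder(data)
        except Exception as exc:
            crashes.append((data, repr(exc)))
    return crashes
```

Even this blind loop trips the planted bug almost immediately; real fuzzers add coverage feedback, corpus mutation, and sanitizer instrumentation, which is part of why they surface bugs faster than maintainers can triage them.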

replies(1): >>45893102 #
9. NegativeK ◴[] No.45892824[source]
Vulnerabilities should be publicly disclosed. Both closed and open source software are scrutinized by the good and the bad people; sitting on vulnerabilities isn't good.

Consumers of closed source software have a pretty reasonable expectation that the creator will fix it in a timely manner. They paid money, and (generally) the creator shouldn't put the customer in a nasty place because of errors.

Consumers of open source software should have zero expectation that someone else will fix security issues. Individuals should understand this; it's part of the deal for us using software for free. Organizations that are making money off of the work of others should have the opposite of an expectation that any vulns are fixed. If they have or should have any concern about vulnerabilities in open source software, then they need to contribute to fixing the issue somehow. Could be submitting patches, paying a contractor or vendor to submit patches, paying a maintainer to submit patches, or contributing in some other way that betters the project. The contribution they pick needs to work well with the volunteers, because some of the ones I listed would absolutely be rejected by some projects -- but not by others.

The issue is that an org like Google, with its absolute mass of technical and financial resources, went looking for security vulnerabilities in open source software with the pretense of helping. But if Google (or whoever) doesn't finish the job, then they're being a piece of shit to volunteers. The rest of the job is reviewing the vulns by hand and figuring out patches that can be accepted with absolutely minimal friction.

To your point, the beginning of the expectation should be that vulns are disclosed, since otherwise we have known insecure software. The rest of the expectation is that you don't get to pretend to do a nice thing while _knowing_ that you're dumping more work on volunteers that you profit from.

In general, wasting the time of volunteers that you're benefiting from is rude.

Specifically, organizations profiting off of volunteer work and wasting their time makes them an extractive piece of shit.

Stop being a piece of shit, Google.

replies(1): >>45896725 #
10. danaris ◴[] No.45893102{3}[source]
Then they can damn well pay for fixing them too.
replies(2): >>45893970 #>>45895472 #
11. danaris ◴[] No.45893120{3}[source]
But FFmpeg does not have the resources to fix these at the speed Google is finding them.

It's just not possible.

So Google is dedicating resources to finding these bugs and feeding them to bad actors.

Bad actors who might, hypothetically, have had the information before, but who definitely do once Google publicizes it.

You are talking about an ideal situation; we are talking about a real situation that is happening in the real world right now, wherein the option of Google reports bug > FFmpeg fixes bug simply does not exist at the scale Google is doing it at.

replies(3): >>45893949 #>>45896023 #>>45898198 #
12. _flux ◴[] No.45893424[source]
Many people are already developing and fixing FFmpeg.

How many people are actively looking for bugs? Google, and then the other guys who don't share their findings but perhaps sell them to the highest bidder. So Google seems to be doing some good work just by picking big, popular open source projects and seeing if they have bugs, even if they don't intend to fix them. And I doubt Google was actually using the LucasArts video format their latest findings were about.

However, in my mind the discussion of whether Google should be developing FFmpeg (beyond the codec support mentioned elsewhere in the thread) or other OSS projects is completely separate from whether they should be finding bugs in them; I believe almost everyone would agree they should. They are helping OSS in other ways too, e.g. https://itsfoss.gitlab.io/post/google-sponsors-1-million-to-... .

13. GabrielTFS ◴[] No.45893949{4}[source]
A solution definitely ought to be found. Google putting up a few millionths of a percent of their revenue or so towards fixing the bugs they find in ffmpeg would be the ideal solution here, certainly. Yet it seems unlikely to actually occur.

I think the far more likely result of all the complaints is that Google simply disengages from ffmpeg completely and stops doing any security work on it. That would be quite bad for the security of the project. If Google can trivially find bugs fast enough to overwhelm the ffmpeg developers, bad actors can presumably find those same vulnerabilities too. And if they know the vulnerabilities very much exist, but that Google has stopped searching for them at the ffmpeg project's demand, that gives them extremely high motivation to go looking in a place where they can be almost certain of finding unreported/unknown vulnerabilities. The likely result would be many more 0-day attacks involving ffmpeg, which I don't think anyone regards as a good outcome. (Personally, I'd consider "Google publishes a bunch of vulnerabilities ffmpeg hasn't fixed, so that everyone knows about them" a much preferable one.)

Now, you might consider that possibility fine; after all, the ffmpeg developers have no obligation to work on the project, and thus no obligation to fix any vulnerabilities in it. But if that's fine, then simply ignoring the reports Google currently makes is presumably also fine, no?

replies(1): >>45896863 #
14. SR2Z ◴[] No.45893970{4}[source]
That's not one of the choices on offer. You can argue about whether Google files bugs like this or not; you can't force them to fix them.
replies(1): >>45898598 #
15. mister_mort ◴[] No.45894516{3}[source]
FFmpeg autodetects what is inside a file; the extension doesn't really matter. So it's trivial to construct a video file that's labelled .mp4 but really uses the vulnerable codec, and triggers its payload upon playing. (Given that ffmpeg is also used to generate thumbnails on Windows if installed, IIRC, just having a trapped video file in a directory could be dangerous.)
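The renaming point is easy to demonstrate. Below is a toy sketch of content-based probing, using a handful of well-known published magic signatures; FFmpeg's actual probe scoring is far more involved than this.

```python
# Identify a container by its leading magic bytes, ignoring the filename.
# The signatures are common published ones; this is a toy stand-in for
# FFmpeg's probing, not its actual logic.
MAGICS = {
    b"RIFF": "riff (avi/wav)",
    b"\x1a\x45\xdf\xa3": "matroska/webm",
    b"OggS": "ogg",
}

def sniff(data: bytes) -> str:
    for magic, name in MAGICS.items():
        if data.startswith(magic):
            return name
    if data[4:8] == b"ftyp":  # MP4 family: brand box after a 4-byte size
        return "mp4-family"
    return "unknown"
```

A file named video.mp4 whose bytes carry a different signature is still handled as whatever its contents say it is, which is exactly why an obscure-but-autodetected codec stays reachable from hostile input.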
16. tpmoney ◴[] No.45895452[source]
So no one should file public bug reports for open source software?
17. tpmoney ◴[] No.45895472{4}[source]
So to be clear: if Google doesn't include patches, you would rather they not make the bugs they find public so that other people can fix them?

That is, you'd rather have a world where Google either knows about a vulnerability and refuses to tell anyone, or doesn't look for vulnerabilities at all, over a world where Google looks for them and lets people know they exist but doesn't submit its own fix.

Why do you want that world? Why do you want corporations to reduce the already meager amounts of work and resources they put into open source software even further?

18. saagarjha ◴[] No.45895841[source]
This is standard practice for Linux as well.
19. saagarjha ◴[] No.45895857[source]
> Why is Google deliberately running an AI process to find these bugs if they're just going to dump them all on the FFmpeg team to fix?

This is called fuzzing, and it has been standard practice for over a decade. Nobody had any problem with it until FFmpeg decided they didn't like that an AI-generated report was filed against them with the (again, mostly standard at this point) disclosure deadline attached. FWIW, nobody would likely have cared except that they went on Twitter to complain, so now everyone has an opinion on it.

20. kragen ◴[] No.45896023{4}[source]
If widely deployed infrastructure software is so full of vulnerabilities that its maintainers can't fix them as fast as they're found, maybe it shouldn't be widely deployed, or they shouldn't be its maintainers. Disabling codecs in the default build that haven't been used in 30 years might be a good move, for example.

Either way, users need to know about the vulnerabilities. That way, they can make an informed tradeoff between, for example, disabling the LucasArts Smush codec in their copy of ffmpeg, and being vulnerable to this hole (and probably many others like it).
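One way to frame that tradeoff is to make rarely-used decoders opt-in rather than reachable by default. This is a hypothetical sketch; the names and structure are invented for illustration and are not FFmpeg's API.

```python
# Hypothetical opt-in decoder registry: obscure formats are only
# registered when a user explicitly asks for them, so content sniffing
# alone can't route hostile input into rarely-audited code.
DEFAULT_DECODERS = {"h264", "aac", "vp9"}
OPTIONAL_DECODERS = {"sanm"}  # e.g. a 1990s-era archival codec

def build_registry(extra=()):
    unknown = set(extra) - OPTIONAL_DECODERS
    if unknown:
        raise ValueError(f"unknown optional decoders: {sorted(unknown)}")
    return DEFAULT_DECODERS | set(extra)

def decoder_for(detected_format, registry):
    # Auto-detection may still identify an obscure format, but with no
    # decoder registered there is nothing to hand the bytes to.
    return detected_format if detected_format in registry else None
```

Users who actually need the archival codec enable it knowingly; everyone else's attack surface shrinks, which is the informed tradeoff described above.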

replies(1): >>45896250 #
21. ekidd ◴[] No.45896220{3}[source]
The actual real alternative is that the ffmpeg maintainers quit, just like the libxml2 maintainer did.

A lot of these core pieces of infrastructure are maintained by one to three middle-aged engineers in their free time, for nothing. Meanwhile, billion dollar companies use the software everywhere, and often give nothing back except bug reports and occasional license violations.

I mean, I love "responsible disclosure." But the only result of billion dollar corporations drowning a couple of unpaid engineers in bug reports is that the engineers will walk away and leave the code 100% unmaintained.

And yeah, part of the problem here is that C-based data parsers and codecs are almost always horrendously insecure. We could rewrite it all in Rust (and I have in fact rewritten one obscure codec in Rust) or WUFFS. But again, who's going to pay for that?

replies(1): >>45901356 #
22. ekidd ◴[] No.45896250{5}[source]
> they shouldn't be its maintainers.

I mean, yes, the ffmpeg maintainers are very likely to decide this on their own, abandoning the project entirely. This is already happening for quite a few core open source projects that are used by multiple billion-dollar companies and deployed to billions of users.

A lot of the projects probably should be retired and rewritten in safer system languages. But rewriting all of the widely-used projects suffering from these issues would likely cost hundreds of millions of dollars.

The alternative is that maybe some of the billion-dollar companies start making lists of all the software they ship to billions of users, and hire some paid maintainers through the Linux or Apache Foundations.

replies(1): >>45896747 #
23. chii ◴[] No.45896725{3}[source]
Why are the standards and expectations different for Google vs an independent researcher? Just because they are richer doesn't mean they should be held to a standard that an independent researcher isn't.

The OSS maintainer has the responsibility to either fix, or declare they won't fix; both are appropriate actions, and they are free to make this choice. The consumer of OSS should have the right to know what vulns/issues exist in the package, so that they can make as informed a decision as possible (such as adding defense in depth for vulns that the maintainers choose not to fix).

replies(1): >>45897912 #
24. chii ◴[] No.45896747{6}[source]
> abandoning the project entirely

that is a good outcome, because then the people who depend on such a project would find it worthwhile to pay a new set of maintainers.

replies(1): >>45899435 #
25. kant2002 ◴[] No.45896863{5}[source]
I really don't understand the whole us-vs-them discourse. Why should it be only Google fixing the bugs? If the volunteers aren't enough, maybe more volunteers can step up and help FFmpeg, via direct patches or by directly lobbying companies to fund the project.

In my opinion, if the problem is money and they cannot raise enough, then somebody should help them with that, no?

26. om2 ◴[] No.45897036[source]
The codec is compiled in, enabled by default, and auto detected through file magic, so the fact that it is an obscure 1990s hobby codec does not in any way make the vulnerability less exploitable. At this point I think FFmpeg is being intentionally deceptive by constantly mentioning only the ancient obscure hobby status and not the fact that it’s on by default and autodetected. They have also rejected suggestions to turn obscure hobby codecs off by default, giving more priority to their goal of playing every media format ever than to security.
27. om2 ◴[] No.45897059[source]
> They also have the option to not spend resources finding the bugs in the first place.

The Copenhagen interpretation of security bugs: if you don’t look for it, it doesn’t exist and is not a problem.

28. samus ◴[] No.45897912{4}[source]
They are different because the independent researchers don't make money off the projects that they investigate.
replies(2): >>45898251 #>>45898592 #
29. walletdrainer ◴[] No.45898153[source]
> CVE-searching teams

Silly nitpick, but you search for vulnerabilities not CVEs. CVE is something that may or may not be assigned to track a vulnerability after it has been discovered.

Most security issues probably get patched without a CVE ever being issued.

30. walletdrainer ◴[] No.45898198{4}[source]
> But FFmpeg does not have the resources to fix these at the speed Google is finding them.

Google submitting a patch does not address this issue. The main work for maintainers here is making the decision whether or not they want to disable this codec, whether or not Google submits a patch to do that is completely immaterial.

31. Dylan16807 ◴[] No.45898251{5}[source]
Google makes money off ffmpeg in general, but not off this part of the code. They're not getting someone else to write a patch that helps them make money, because Google will just disable this codec if it wasn't already disabled in their builds.

Also in general Google does investigate software they don't make money off.

replies(1): >>45901118 #
32. chii ◴[] No.45898592{5}[source]
> independent researchers don't make money off the projects that they investigate

but they make money off the reputational boost they earn from having their name attached to the investigation. Unless the investigation and report are anonymous (which could be true for some researchers), I'd say they're not doing charity.

replies(1): >>45901007 #
33. klez ◴[] No.45898598{5}[source]
I can't force them. Doesn't mean I can't criticize them for not doing it.
34. kragen ◴[] No.45899435{7}[source]
We'll see. Video codec experts won't materialize out of thin air just because there's money.
35. samus ◴[] No.45901007{6}[source]
That's a one-time bonus they get for discovering a bug, not from using the project in production. Google also gets this reward, by the way. Therefore it's still imbalanced.
36. samus ◴[] No.45901118{6}[source]
> Also in general Google does investigate software they don't make money off.

An organization of this size might actually have trouble making sure they really don't use code from that project, or won't do so in the future.

37. SAI_Peregrinus ◴[] No.45901193[source]
You disclose so that users can decide what mitigations to take. If there's a way to mitigate the issue without a fix from the developers the users deserve to know. Whether the developers have any obligation to fix the problem is up to the license of the software, the 90 day concession is to allow those developers who are obligated or just want to issue fixes to do so before details are released.
38. SAI_Peregrinus ◴[] No.45901356{4}[source]
The other alternative is for the ffmpeg developers to change the text on their "about" screen from "Security is a high priority and code review is always done with security in mind. Though due to the very large amounts of code touching untrusted data security issues are unavoidable and thus we provide as quick as possible updates to our last stable releases when new security issues are found." to something like "Security is a best-effort priority. Code review is always done with security in mind. Due to the very large amounts of code touching untrusted data, security issues are unavoidable. We attempt to provide updates to our last stable releases when new security issues are found, but make no guarantees as to how long this may take. Priority will be given to reports including a proof-of-concept exploit and a patch that fixes the security bug."

Then point to the "PoC + Patch or GTFO" sign when reports come in. If you use a library with a "NO WARRANTY" license clause in an application where you're responsible for failures, it's on you to fix or mitigate the issues, not on the library authors.