What parameters would those be? Make it look like Eastman Ektachrome High-Speed Daylight Film 7251 400D? For years, people have run film negative through telecines to capture grain content to be used as overlays. For years, colorists have come up with ways of simulating the color of specific film stocks by using reference film with test patterns that has been made available.
If a director/producer wants film grain added to their digital content, it should be done in post, not by some devs working for a streaming platform. The use of grain or not is a creative decision made by the creators of the work, and that's where it should remain.
Why? If you're spending a significant chunk of your bits just transmitting data that could be effectively recreated on the client for free, isn't that wasteful? Sure, maybe the grains wouldn't be at the exact same coordinates, but it's not like the director purposefully placed each grain in the first place.
I recognize that the locally-produced grain doesn't look quite right at the moment, but travel down the hypothetical with me for a moment. If you could make this work, why wouldn't you?
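To make the "wasteful" part concrete: grain is basically incompressible, so every bit spent on it is a bit not spent on the picture. A toy illustration (just zlib over raw frames, nothing like a real video codec, so treat it as directional only):

    # Toy demo: random grain is nearly incompressible; smooth picture data isn't.
    # This is zlib over raw bytes, not a video codec, so it's directional only.
    import zlib
    import numpy as np

    rng = np.random.default_rng(0)

    h, w = 720, 1280
    # A smooth synthetic "frame" (a gradient) stands in for denoised content.
    clean = np.linspace(0, 255, w, dtype=np.uint8)[None, :].repeat(h, axis=0)

    # The same frame with mild Gaussian grain added on top.
    noisy = np.clip(clean.astype(np.int16) + rng.normal(0, 8, clean.shape),
                    0, 255).astype(np.uint8)

    print("clean:", len(zlib.compress(clean.tobytes(), 6)), "bytes")
    print("noisy:", len(zlib.compress(noisy.tobytes(), 6)), "bytes")
    # The grainy frame compresses far worse; a real encoder faces the same choice:
    # spend bits describing the grain exactly, or quantize it away.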
--------
...and yes, I acknowledge that once the grain is being added client side, the next logical step would be "well, we might as well let viewers turn it off." But, once we've established that client-side grain makes sense, what are you going to do about people having preferences? Should we outlaw de-noising video filters too?
I agree that the default setting should always match what the filmmaker intended—let's not end up with a TV motion smoothing situation, please for the love of god—but if someone actively decides "I want to watch this without the grain for my own viewing experience"... okay? You do you.
...and I will further acknowledge that I would in fact be that person! I hate grain. I modded Cuphead to remove the grain and I can't buy the Switch version because I know it will have grain. I respect the artistic decision but I don't like it and I'm not hurting anyone.
1. Denoise the master, then add AV1 FGS metadata that tells players how to reconstruct the noise (which is what the blog post is about), so you get back the original image the director saw and approved.
2. Do nothing (which is what they were doing) and let some of the noise get blurred or erased by the quantization step of the encoding process, or worse, burn shittons of coding bits trying to describe the exact noise in the frame, which hurts the visual quality of the things people actually look at.
All of these imply changes to the image that the director decided on to get around the underlying fact that deliberately adding noise to an image is, from a signal processing perspective, really stupid. But if we are going to do it, we can at least ensure it happens as far down the chain as possible so that Netflix's encoding doesn't destroy the noise. That's the idea you responded to: have the production company deliver a master with FGS metadata instead of baked-in film grain.
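If it helps, here's a toy version of that option-1 pipeline. It is not the actual AV1 FGS model (which, as I understand it, uses an autoregressive grain template plus piecewise-linear scaling functions); the function names and the per-band sigma model are made up purely to show the denoise -> parameterize -> resynthesize shape of the idea:

    # Toy sketch of denoise -> parameterize -> resynthesize (NOT the real AV1
    # FGS model; names and the per-band sigma model are illustrative only).
    import numpy as np

    def estimate_grain_params(original, denoised, bands=8):
        # "Encoder" side: measure grain strength per brightness band from the residual.
        residual = original.astype(np.float32) - denoised.astype(np.float32)
        edges = np.linspace(0, 256, bands + 1)
        sigmas = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (denoised >= lo) & (denoised < hi)
            sigmas.append(float(residual[mask].std()) if mask.any() else 0.0)
        return {"band_edges": edges.tolist(), "sigmas": sigmas}  # a few dozen bytes

    def synthesize_grain(denoised, params, seed=None):
        # "Client" side: regenerate statistically similar grain from the tiny parameter set.
        rng = np.random.default_rng(seed)
        out = denoised.astype(np.float32)
        edges, sigmas = params["band_edges"], params["sigmas"]
        for lo, hi, sigma in zip(edges[:-1], edges[1:], sigmas):
            mask = (denoised >= lo) & (denoised < hi)
            out[mask] += rng.normal(0.0, sigma, size=int(mask.sum()))
        return np.clip(out, 0, 255).astype(np.uint8)

The point is that the grain "payload" shrinks to a handful of numbers instead of per-pixel detail the encoder has to fight the quantizer for.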
I'm sorry your tech isn't good enough to recreate the original. That does not mean you get to change the original because your tech isn't up to the task. Update your tech to better handle the original. That's like saying an image of The Starry Night doesn't retain the details, so we're going to smear the original to fit the tech better. No. Go fix the tech. And no, this is not fixing the tech. It's a band-aid covering the flaws in the tech.
In theory though, I don't see any reason why client-side grain that looks identical to the real thing shouldn't be achievable, with massive bandwidth savings in the process.
It won't be, like, pixel-for-pixel identical, but that was why I said no director is placing individual grain specks anyway.
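Right, "identical" can only mean statistically identical. A quick illustration of why that's the right bar (hypothetical grain field, two playbacks with different seeds):

    # Two playbacks, two seeds: zero pixels match, but the grain "looks" the same.
    import numpy as np

    shape, sigma = (1080, 1920), 6.0   # hypothetical grain field and strength
    g1 = np.random.default_rng(1).normal(0.0, sigma, shape)
    g2 = np.random.default_rng(2).normal(0.0, sigma, shape)

    print(float(np.mean(g1 == g2)))    # ~0.0: not pixel-for-pixel identical
    print(g1.std(), g2.std())          # both ~6.0: statistically the same grain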
The market has spoken, and it says that people want to watch movies even when they don't have access to a 35mm projector or a projector that can handle digital cinema packages, so nobody is seeing the original outside a theater.
Many viewers are bandwidth-limited, so there are tradeoffs... if this film grain stuff improves the picture quality available at a given bandwidth, that's a win. IMHO, Netflix's blog posts about codecs seem to focus on bandwidth reduction, so I'm never sure whether users with ample bandwidth end up getting less quality or not; that's a valid question to ask.
Let's be clear: the alternative isn't "higher bandwidth," it's "aggressive denoising during the stream encode." If the studio is adding grain in post, then describing that grain as a set of parameters will result in a higher-quality experience for the vast majority of people viewing it in this day and age.
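For what it's worth, encoders can already do that denoise-and-parameterize step themselves. A hedged example (assumes an ffmpeg build with libaom-av1; denoise-noise-level is my understanding of how to ask the encoder to model the grain and emit FGS parameters instead of coding it, and the filenames are placeholders; check your build's docs):

    # Assumed: an ffmpeg build with libaom-av1. "denoise-noise-level" is the
    # libaom option that denoises internally and writes film grain synthesis
    # parameters into the AV1 bitstream instead of spending bits on the grain.
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "grainy_master.mov",   # placeholder input name
        "-c:v", "libaom-av1",
        "-crf", "30", "-b:v", "0",             # constant-quality mode
        "-denoise-noise-level", "25",          # strength of grain to model
        "output_fgs.mkv",                      # placeholder output name
    ], check=True)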
I already normalize audio (modern US-produced media is often atrocious), modify gamma and brightness, and probably some other stuff. Oh, and it's not as though I'm viewing on a color-calibrated monitor in the first place.
The purists can purchase lossless physical copies. The rest of us would benefit from such improvements.
> If a director/producer wants film grain added to their digital content, it should be done in post.
To me, this philosophy seems like a patent waste of bandwidth.
Yes, a theater probably holds more people than your typical at-home viewing setup. Unless you go to the movies during the week and avoid crowds. The last movie I saw at the theater was on a Tuesday after opening weekend, at 4pm. There might have been 2 other people in the entire theater. It was amazing.