I would love for them to provide an option to view it with film simulation vs without.
One of my favorite movies of all time, The Holdovers, did film simulation extremely well. It's set in the '70s so it attempts to look like a movie of that era.
It looked great to me, but if you're an actual film nerd you're going to notice a lot of things aren't exactly accurate.
Maybe in the near future we'll see Netflix being able to process some post effects on the client. So if you're color blind, you get a mode for that. If you don't want fake grain you can turn it off.
That's true, but at a given bitrate (until you get to very high bitrates), the compressed original will usually look worse and less sharp because so many bits are spent trying to encode the original grain. As a result, that original grain tends to get "smeared" over larger areas, making it look muddy. You lose sharpness in areas of the actual scene because it's trying (and often failing) to encode sharp grains.
Film Grain Synthesis makes sense for streaming where bandwidth is limited, but I'll agree that in the examples, the synthesized grain doesn't look very grain-like. And, depending on the amount and method of denoising, it can definitely blur details from the scene.
I can see why they want to compare against the actual local copy of the video with the natural grain. But that’s the perfect copy that they can’t actually hope to match.
That's an understatement. It just looks like an RGB noise effect was added, and film grain does not look like RGB noise. To me, grain is only one part of what gave film the film look: the way highlights bloom rather than clip, and an overall quality that's more natural and organic than the ultra-sharp look of modern digital acquisition. Using SoftFX or Black Mist type filters helps, but it's just not the same, because the difference is analog vs. digital acquisition. All of these attempts at making something look like what it isn't keep falling down in the same ways, but hey, there's a cool tech blog about it this time.

Film grain filters have been around for a long time, yet people just don't care for them. Even back in the Blu-ray era there were attempts at removing the grain in the encode and applying it at playback. Netflix isn't coming up with anything new, and apparently nothing exciting either, based on the results.
What parameters would those be? Make it look like Eastman Ektachrome High-Speed Daylight Film 7251 400D? For years, people have run film negative through telecines and created grain content to be used as overlays. For years, colorists have come up with ways of simulating the color of specific film stocks using the reference film with test patterns that has been made available.
If a director/producer wants film grain added to their digital content, that's where it should be done: in post. Not by some devs working for a streaming platform. Whether or not to use grain is a creative decision made by the creators of the work. That's where it should remain.
But still, they have:
> A source video frame from They Cloned Tyrone
> Regular AV1 (without FGS) @ 8274 kbps
> AV1 with FGS @ 2804 kbps
Just to emphasize the problem, wouldn't it be nice to also see:
Regular AV1 (without FGS) @ 2804 kbps
It should look really bad, right? That would emphasize their results.
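(For scale, 2804 kbps is about a third of 8274 kbps; the ~5470 kbps difference over a two-hour runtime works out to roughly 5470 kbit/s × 7200 s ≈ 39 Gbit, on the order of 5 GB per stream, assuming those frames' bitrates are representative of the whole title.)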
Why? If you're spending a significant chunk of your bits just transmitting data that could be effectively recreated on the client for free, isn't that wasteful? Sure, maybe the grains wouldn't be at the exact same coordinates, but it's not like the director purposefully placed each grain in the first place.
I recognize that the locally-produced grain doesn't look quite right at the moment, but travel down the hypothetical with me for a moment. If you could make this work, why wouldn't you?
--------
...and yes, I acknowledge that once the grain is being added client side, the next logical step would be "well, we might as well let viewers turn it off." But, once we've established that client-side grain makes sense, what are you going to do about people having preferences? Should we outlaw de-noising video filters too?
I agree that the default setting should always match what the filmmaker intended—let's not end up with a TV motion smoothing situation, please for the love of god—but if someone actively decides "I want to watch this without the grain for my own viewing experience"... okay? You do you.
...and I will further acknowledge that I would in fact be that person! I hate grain. I modded Cuphead to remove the grain and I can't buy the Switch version because I know it will have grain. I respect the artistic decision but I don't like it and I'm not hurting anyone.
A few things to note:
- still-frames are also a mediocre way to evaluate video quality.
- a theoretically perfect[1] noise-removal filter will always look less detailed than the original source, since your brain/eye system will invent more detail for a noisy image than for a blurry image.
1: By which I mean a filter that preserves 100% of the non-grain detail present, not one that magically recovers detail lost due to noise.
1. Denoise the master, then add AV1 FGS metadata to tell players how to reconstruct the noise in the master (which is what the blog post is about) to get back the original image the director saw and approved
2. Do nothing (which is what they were doing), and let some of the noise get blurred or erased by the quantization step of the encoding process, or worse, burn shittons of coding bits trying to describe the exact noise in the frame, which hurts visual quality of the things people actually look at
All of these imply changes to the image that the director decided on to get around the underlying fact that deliberately adding noise to an image is, from a signal processing perspective, really stupid. But if we are going to do it, we can at least ensure it happens as far down the chain as possible so that Netflix's encoding doesn't destroy the noise. That's the idea you responded to: have the production company deliver a master with FGS metadata instead of baked-in film grain.
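If it helps to picture option 1, here's a toy sketch of the idea (my own simplification with made-up parameter names, not the actual AV1 FGS model or Netflix's pipeline): the stream carries a denoised frame plus a few grain parameters, and the player resynthesizes statistically similar grain at playback instead of spending bits on the exact grain pixels.

```python
import numpy as np

def synthesize_grain(shape, seed, ar_coeff=0.25, strength=8.0):
    # Generate a pseudo-random grain field from a few parameters. The real
    # AV1 FGS spec uses an autoregressive model on a grain template; a single
    # neighbor-blend pass stands in for that here, just to give the noise
    # some spatial correlation instead of per-pixel white noise.
    rng = np.random.default_rng(seed)
    grain = rng.standard_normal(shape)
    grain[:, 1:] += ar_coeff * grain[:, :-1]   # horizontal correlation
    grain[1:, :] += ar_coeff * grain[:-1, :]   # vertical correlation
    return strength * grain

def apply_grain(denoised_luma, grain):
    # Scale the grain by local brightness (weaker in deep shadows and near
    # clipping), loosely analogous to the scaling function in AV1 FGS.
    scale = np.clip(denoised_luma / 255.0, 0.1, 0.9)
    return np.clip(denoised_luma + scale * grain, 0.0, 255.0)

# The "metadata" that travels with the stream instead of coded grain pixels:
params = {"seed": 1234, "ar_coeff": 0.25, "strength": 8.0}

frame = np.full((16, 16), 128.0)   # stand-in for one denoised frame
noisy = apply_grain(frame, synthesize_grain(frame.shape, **params))
```

The point isn't this particular noise model; it's that a seed and a few coefficients are a handful of bytes per frame, versus the megabits it takes to code the exact grain as residual.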
I'm sorry your tech isn't good enough to recreate the original. That does not mean you get to change the original because your tech isn't up to the task. Update your task to better handle the original. That's like saying an image of the Starry Night doesn't retain the details, so we're going to smear the original to fit the tech better. No. Go fix the tech. And no, this is not fixing the tech. It is a band-aid to cover the flaws in the tech.
In theory though, I don't see any reason why client-side grain that looks identical to the real thing shouldn't be achievable, with massive bandwidth savings in the process.
It won't be, like, pixel-for-pixel identical, but that was why I said no director is placing individual grain specks anyway.
The market has spoken and it says that people want to watch movies even when they don't have access to a 35mm projector or a projector that can handle digital cinema packages, so nobody is seeing the original outside a theater.
Many viewers are bandwidth limited, so there are tradeoffs ... if this film grain stuff improves the available picture quality at a given bandwidth, that's a win. IMHO, Netflix's blog posts about codecs tend to focus on bandwidth reduction, so I'm never sure whether users with ample bandwidth end up getting less quality or not; that's a valid question to ask.
So as long as they're analyzing the style of grain in the source properly, which the technical part of the post mentions they do...
Let's be clear: the alternative isn't "higher bandwidth", it's "aggressive denoising during stream encode". If the studio is adding grain in post, then describing that grain as a set of parameters will result in a higher quality experience for the vast majority of those viewing it in this day and age.
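To make "describing that as a set of parameters" concrete, here's a toy sketch (my own illustration, not whatever analysis Netflix actually runs) that estimates a per-brightness grain-strength curve from the residual between a source frame and its denoised version. A real encoder would fit a richer model, but the output is still just a short list of numbers.

```python
import numpy as np

def estimate_grain_curve(source_luma, denoised_luma, n_bins=8):
    # Grain amplitude per brightness bin: a crude stand-in for the kind of
    # scaling curve an encoder could estimate once and ship as metadata.
    residual = source_luma.astype(float) - denoised_luma.astype(float)
    bins = np.clip((denoised_luma.astype(float) / 256.0 * n_bins).astype(int),
                   0, n_bins - 1)
    curve = np.zeros(n_bins)
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            curve[b] = residual[mask].std()   # grain strength in this bin
    return curve

# Toy check: a brightness ramp with synthetic noise of known strength on top.
denoised = np.tile(np.linspace(0.0, 255.0, 256), (64, 1))
source = denoised + np.random.default_rng(0).normal(0.0, 6.0, denoised.shape)
print(estimate_grain_curve(source, denoised))   # roughly 6.0 in every bin
```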
I already normalize audio (modern US-produced media is often atrocious), modify gamma and brightness, and probably some other stuff. Oh, and it's not as though I'm viewing on a color-calibrated monitor in the first place.
The purists can purchase lossless physical copies. The rest of us would benefit from such improvements.
> If a director/producer wants film grain added to their digital content, that's where it should be done in post.
To me, this philosophy seems like a patent waste of bandwidth.
Yes, a theater probably holds more people than your typical viewing experience at home. Unless you go to the movies during the week and avoid crowds. The last movie I saw at the theater was on a Tuesday after opening weekend at 4pm. There might have been 2 other people in the entire theater. It was amazing.