As for tone mapping, I think the examples they show tend way too much towards flat low-local-contrast for my tastes.
The end result is complete chaos. Every piece of the pipeline does something wrong, and then the software tries to compensate by emitting doubly wrong data, without even having reliable information about what it needs to compensate for.
https://docs.google.com/document/d/1A__vvTDKXt4qcuCcSN-vLzcQ...
But HDR is a minefield of different display capabilities, color spaces, and standards. It's no wonder that nobody gets it right and everyone feels confused.
HDR on a display with a peak brightness of 2000 nits will look completely different than on a display with 800 nits, and both get to claim they are HDR.
We should have a standard equivalent to color spaces. Set, say, 2000 nits as 100% HDR. Then a 2000-nit display gets to claim it's 100% HDR, an 800-nit display gets to claim 40% HDR, and so on. A 2500-nit display could even use 125% HDR in its marketing.
It's still not perfect - some displays (OLED) can only show peak brightness over a portion of the screen. But it would be an improvement.
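Something like this, as a rough sketch (the 2000-nit reference point and the names here are just my own placeholders, not any existing standard):

    # Hypothetical "% HDR" rating: peak brightness relative to a 2000-nit reference.
    REFERENCE_NITS = 2000.0

    def hdr_percentage(peak_nits: float) -> float:
        return peak_nits / REFERENCE_NITS * 100.0

    hdr_percentage(2000)  # 100.0 -> "100% HDR"
    hdr_percentage(800)   #  40.0 -> "40% HDR"
    hdr_percentage(2500)  # 125.0 -> "125% HDR"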
Besides, HDR quality is more complex than just max nits, because it depends on viewing conditions and black levels (and everyone cheats with their contrast metrics).
OLEDs can peak at 600 nits and look awesome — in a pitch black room. LCD monitors could boost to 2000 nits and display white on grey.
We have sRGB kinda working for color primaries and gamma, but it's not the real sRGB at 80 nits. It ended up being relative instead of absolute.
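For example (a minimal sketch; the 250-nit desktop white is just an assumed typical setting):

    def srgb_eotf(v: float) -> float:
        # sRGB transfer function: encoded value in 0..1 -> relative luminance in 0..1.
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    # The spec's absolute reading pins reference white at 80 nits...
    per_spec_nits = srgb_eotf(1.0) * 80.0     # 80 nits

    # ...but in practice the output is relative to whatever white the display is set to.
    display_white = 250.0                     # assumed typical desktop monitor
    in_practice_nits = srgb_eotf(1.0) * display_white  # 250 nits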
A lot of the mess comes from needing to adapt content mastered for a pitch-black cinema at 2000 nits down to 800-1000 nits in daylight. That takes very careful processing to preserve highlights and saturation, but software can't rely on the display doing it properly, and doing it in software sends a false signal and risks the display correcting it a second time.
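To make that concrete, here's a toy version of the highlight rolloff involved (the knee placement and the exponential shoulder are just illustrative; real curves like the BT.2390 EETF are more involved and also have to deal with saturation):

    import math

    def roll_off_highlights(nits: float, display_peak: float = 1000.0,
                            knee: float = 0.75) -> float:
        # Pass luminance through untouched up to a knee, then compress the rest
        # into the display's remaining headroom so highlights keep some detail
        # instead of clipping to flat white.
        knee_nits = knee * display_peak
        if nits <= knee_nits:
            return nits
        headroom = display_peak - knee_nits
        # Exponential shoulder: slope 1 at the knee, approaches display_peak asymptotically.
        return knee_nits + headroom * (1.0 - math.exp(-(nits - knee_nits) / headroom))

    # Content mastered for a 2000-nit peak, shown on a 1000-nit display:
    # 500 -> 500, 1000 -> ~908, 1500 -> ~988, 2000 -> ~998

The trouble is exactly that both the software and the display may each apply a curve like this without knowing the other already did.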