
276 points by samwillis | 1 comment
radicality No.41082131
Kinda related, but does anyone have a good set of links to help understand what HDR actually is? Whenever I've tried in the past, I always got lost and none of it was intuitive.

There are so many concepts: color spaces, transfer functions, HDR vs Apple’s XDR HDR, HLG vs Dolby Vision, mastering displays, max brightness vs peak brightness, all the different HDR monitor certification levels, 8-bit vs 10-bit, “full” vs “video” levels when recording video, etc.

Example use case: I want to play iPhone-recorded videos using mpv on my MacBook. There are hundreds of knobs to set, and while I can muck around with them and get it looking close-ish to how the file looks in QuickTime/Finder, I still have no idea what any of these settings are doing.

replies(4): >>41082239 >>41084195 >>41084717 >>41085674
wongarsu No.41082239
HDR is whatever marketing wants it to be.

Originally it was just about being able to show both really dark and really bright colors in the same image. That's easy if each pixel is an individual LED, but very hard on LCD monitors, where there's one big backlight and the pixels are just dimmable filters in front of it. On the capture side it's the analogous ability to record really bright and really dark spots in the same shot, something camera sensors are much worse at than our eyes, though you can pull some tricks.

Once you have that ability, you notice that 8 bits of brightness information isn't much. So you go with 10 or 16 bits. Your transfer function also plays a role (the “gamma” curve that maps linear light values to non-linear encoded values, spending more code values where the eye is most sensitive).
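
To make that concrete, here's a toy sketch in Python (no particular standard; the 2.2 gamma is a typical SDR-ish value I'm assuming for illustration, not the real HDR curve) showing how a gamma-style curve spends scarce code values on the dark end:

    # Toy illustration of bit depth plus a gamma-style transfer curve.
    # Not any real standard; just shows where the code values go.
    GAMMA = 2.2  # assumed, typical SDR-ish value

    def encode(linear_light, bits):
        levels = (1 << bits) - 1          # 255 for 8-bit, 1023 for 10-bit
        return round((linear_light ** (1 / GAMMA)) * levels)

    # How many code values cover the darkest 1% of linear light?
    encode(0.01, 8)    # ~31 of 255 codes sit below this point
    encode(0.01, 10)   # ~126 of 1023 codes: far finer dark gradations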

And of course the people who care about HDR overlap heavily with the people who care about color, so that's where color spaces, and certifying and calibrating monitors to match those color spaces, come in. It's really an adjacent concern, but often rolled in for convenience.

replies(2): >>41082411 >>41088505
bscphil No.41088505
> Originally it's just about being able to show both really dark and really bright colors.

Sort of. The primary significance of HDR is the ability to specify absolute luminance.

Good-quality SDR displays were already very bright and already had high native dynamic range (a large ratio between the brightest white and the darkest black). The issue was that the media specifications did not control brightness in any way. A 1.0-luminance pixel was simply the brightest value the display could show (usually tuned by a brightness setting), and a 0.0-luminance pixel was the minimum brightness it could show (unfortunately also affected by the brightness setting, thanks to backlighting).
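
A sketch of that situation, with made-up numbers (gamma 2.2 assumed; real displays and standards differ in detail):

    # SDR: the file only carries a relative value in [0, 1]. The actual
    # light emitted depends entirely on the display and its brightness knob.
    def sdr_pixel_nits(code_value, display_peak_nits, gamma=2.2):
        # code_value in [0, 1]; 1.0 just means "as bright as this display goes"
        return (code_value ** gamma) * display_peak_nits

    sdr_pixel_nits(1.0, 100)  # 100 nits on a dim reference monitor
    sdr_pixel_nits(1.0, 400)  # 400 nits on a bright desktop display
    # Same file, same pixel value, 4x the light: the media has no say.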

What HDR fundamentally changes is not the brightness of displays or their dynamic range (some HDR displays are worse than some older SDR displays when it comes to dynamic range), but the fact that HDR media has absolute luminance. This means that creators can now make highlights (stars, explosions) close to the peak brightness of a bright display, while diffuse whites are now dimmer.

Prior to HDR, a good bright display was just a calibrated display with its brightness turned up too high, making everything bright. With HDR, a good bright display is one calibrated to the correct brightness, with the ability to show highlights and saturated colors at much more power than a normal white.

You're right about higher bit depths, though. Because HDR media describes a much wider dynamic range on a properly calibrated display (though not necessarily on a typical over-bright SDR display), it uses a different transfer curve to allocate bits more appropriately, and typically shows no banding with only 10 or 12 bits. https://en.wikipedia.org/wiki/Perceptual_quantizer
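
The PQ curve from that link is simple enough to transcribe directly; this is a sketch of the ST 2084 EOTF (constants straight from the spec), mapping an encoded signal in [0, 1] to absolute nits:

    # SMPTE ST 2084 (PQ) EOTF: encoded signal in [0, 1] -> absolute cd/m^2.
    M1 = 2610 / 16384
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_eotf(signal):
        p = signal ** (1 / M2)
        return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

    pq_eotf(1.0)   # 10000 nits, the format's ceiling
    pq_eotf(0.58)  # ~203 nits, roughly the BT.2408 reference white

Note how roughly the top half of the signal range is reserved for everything above ordinary "paper white", which is exactly the headroom highlights live in.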

replies(1): >>41104124
radicality No.41104124
Thanks for this, the part about “absolute luminance” is pretty helpful!

So what you’re saying is that with HDR, whoever creates the video adds absolute brightness levels for some (all?) pixels, and an HDR-capable screen is one with an API like ‘draw_pixel(x, y, color, absolute_luminance_in_nits)’, whereas an SDR screen instead has ‘draw_pixel(x, y, color, relative_luminance)’ with relative_luminance in the range [0, 1]?
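
Something like these hypothetical signatures (names made up, just to pin down the model; I gather real pipelines hand the display an encoded PQ/HLG signal plus metadata rather than per-pixel nit calls):

    # Hypothetical API, purely to pin down the mental model above.
    def draw_pixel_sdr(x, y, color, relative_luminance):
        """relative_luminance in [0, 1]; the emitted light is this value
        (gamma-decoded) times whatever the brightness knob allows."""

    def draw_pixel_hdr(x, y, color, luminance_nits):
        """luminance_nits is absolute (900 means 900 cd/m^2); the display
        renders it directly if it can, otherwise it has to do something."""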

I can imagine how the monitor brightness control would work in SDR, but what about HDR? If I lower the screen brightness and the screen is asked to draw a pixel at 900 nits, what happens? For a real use case: suppose I want to watch an HDR movie on my MacBook. Should I switch the display mode to “HDR Video (P3-ST 2084)”, which is one of the presets that locks the display to a specific brightness?
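
My guess is that something has to tone-map the excess. Is it roughly this kind of rolloff (toy curve, numbers totally made up, surely not what any vendor actually ships)?

    # Guess: requests above the panel's capability get rolled off into the
    # remaining headroom instead of hard-clipping. Illustrative only.
    def tone_map(requested_nits, display_peak_nits, knee=0.75):
        knee_nits = knee * display_peak_nits
        if requested_nits <= knee_nits:
            return requested_nits           # render faithfully below the knee
        # compress everything above the knee into the remaining headroom
        excess = requested_nits - knee_nits
        headroom = display_peak_nits - knee_nits
        return knee_nits + headroom * (excess / (excess + headroom))

    tone_map(900, 500)   # ~476 nits: highlight compressed, not clipped
    tone_map(900, 1600)  # 900 nits: rendered as mastered

If so, on a 500-nit panel a 900-nit highlight would still read as “brighter than white”, just compressed.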