
276 points | samwillis | 1 comment
radicality ◴[] No.41082131[source]
Kinda related, but does someone maybe have a good set of links to help understand what HDR actually is? Whenever I tried in the past, I always got lost and none of it was intuitive.

There are so many concepts: color spaces, transfer functions, HDR vs Apple’s XDR, HLG vs Dolby Vision, mastering displays, max brightness vs peak brightness, all the different HDR monitor certification levels, 8-bit vs 10-bit, “full” vs “video” levels when recording video, etc.

Example use case: I want to play iPhone-recorded videos using mpv on my MacBook. There are hundreds of knobs to set, and while I can muck around with them and get it looking close-ish to what playing the file in QuickTime/Finder looks like, I still have no idea what any of these settings are doing.

replies(4): >>41082239 #>>41084195 #>>41084717 #>>41085674 #
wongarsu ◴[] No.41082239[source]
HDR is whatever marketing wants it to be.

Originally it's just about being able to show both really dark and really bright colors. That's easy if each pixel is an individual LED, but very hard on LCD monitors, where there's one big backlight and the pixels are just dimmable filters in front of it. Or, on the sensor side, it's the ability to capture really bright and really dark spots in the same shot, something our sensors are much worse at than our eyes, though you can pull some tricks.

Once you have that ability, you notice that 8 bits of brightness information isn't much, so you go with 10 or 16 bits. Your transfer function also plays a role (the "gamma" curve that maps linear light values to the nonlinear code values you actually store).
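To make the bit-depth point concrete, here's a small Python sketch. The simple 2.2 power curve and the "darkest 1% of linear light" cutoff are illustrative assumptions on my part, not any particular standard's transfer function:

```python
def encode(linear, bits, gamma=2.2):
    """Gamma-encode a linear-light value in [0, 1] to an n-bit code value."""
    levels = (1 << bits) - 1
    return round((linear ** (1.0 / gamma)) * levels)

def decode(code, bits, gamma=2.2):
    """Invert encode(): map an n-bit code value back to linear light."""
    levels = (1 << bits) - 1
    return (code / levels) ** gamma

# How many distinct code values land in the darkest 1% of linear light?
samples = [i / 1000 * 0.01 for i in range(1001)]
dark_8 = len({encode(v, 8) for v in samples})
dark_10 = len({encode(v, 10) for v in samples})
print(dark_8, dark_10)  # 10-bit gives roughly 4x the steps in the shadows
```

The gamma curve spends code values where the eye is most sensitive (the shadows), which is why 8 bits is workable for SDR at all; with the wider brightness range of HDR, even that trick runs out, hence 10 or more bits.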

And of course the people who care about HDR overlap heavily with the people who care about color, so that's where color spaces, and certifying and calibrating monitors to match those color spaces, come in. It's really adjacent, but often just rolled in for convenience.

replies(2): >>41082411 #>>41088505 #
radicality ◴[] No.41082411[source]
More bits to store more color/brightness etc makes sense.

I think my main confusion has usually been that it all feels like some kind of a… hack? Suppose I set my MacBook screen to max brightness, and then open a normal “white” PNG. It looks fine, and you'd think: the display is at max brightness and the PNG is filled with white, so a fair conclusion is that that's the whitest/brightest the screen goes. But then you open another PNG with a more special “whiter white”, and suddenly you see your screen can actually go brighter! So you get thoughts like “why is this white brighter”, “how do I trigger it”, “what are the actual limits of my screen”, “is this all some separate hacky code path”, “how come I only see it in images/videos and not UI elements”, “is it possible to make a native Mac UI with that brightness”.

In any case, thanks for the answer. I might be overthinking it, and there are probably lots of historical/legacy reasons for the way things are with HDR.

replies(5): >>41082551 #>>41082592 #>>41082701 #>>41083745 #>>41084278 #
dahart ◴[] No.41084278{3}[source]
One motivation for HDR is having absolute physical units, such as luminance in candelas per square meter. You can imagine that might be a floating point value and that 8 bits per channel might not be enough.
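A toy illustration of why fixed-point 8-bit codes don't fit absolute luminance; the nit values below are rough order-of-magnitude figures I'm assuming, not anything from the comment:

```python
# Real-world luminances span many orders of magnitude (rough values, in cd/m^2):
scene_nits = {
    "moonlit ground": 0.1,
    "indoor wall": 100.0,
    "overcast sky": 8000.0,
    "sun glint": 1.6e6,
}

# An 8-bit code with a fixed linear scale can't hold this span: if one code
# step is sized to resolve the moonlit ground (0.1 nit per step), then the
# top of the entire 8-bit range is only:
step = 0.1
top_of_range = 255 * step
print(top_of_range)  # 25.5 nits: the indoor wall already clips, never mind the sun
```

Floating point sidesteps this because its precision is relative, not absolute: a float can represent 0.1 and 1,600,000 in the same image with similar relative error, which is exactly the property physical luminance needs.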

The problem you’re describing is that pixel brightness is relative; but if you want physical units and you have a calibrated display, then adjusting your brightness is not allowed, because it would break the calibration.

Another reason for HDR is to allow you to change the “exposure” of an image. Imagine you take a photo of the sun with a camera. It clips to white. Most of the time even the whole sky clips to white, and clouds too. With a film camera, once the film is exposed, that’s it. You can’t see the sun or clouds because they got clamped to white. But what if you had a special camera that could see any color value, bright or dark, and you could decide later which parts are white and which are black. That’s what HDR gives you - lots of range, and it’s not necessarily all meant to be visible.
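A minimal sketch of that "choose the exposure later" workflow; the scene values, the stop counts, and the [0, 1] display range are invented for illustration:

```python
def expose(linear_value, stops):
    """Scale a linear scene value by 2^stops, then clip to the displayable [0, 1] range."""
    return min(max(linear_value * (2.0 ** stops), 0.0), 1.0)

# Relative linear scene luminances (made up): the sun, a cloud, some grass.
sun, cloud, grass = 50.0, 4.0, 0.18

# At a "normal" exposure the sun *and* the cloud both clip to white:
print([expose(v, 0) for v in (sun, cloud, grass)])   # [1.0, 1.0, 0.18]

# Stop down 3 stops: the cloud detail comes back, with no re-shoot (or
# re-render) needed, because the HDR data still holds the original 4.0:
print([expose(v, -3) for v in (sun, cloud, grass)])  # [1.0, 0.5, 0.0225]
```

The key point is that the clip happens only at display time; the stored data is never clamped, so any exposure can be produced from it after the fact.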

In computer graphics, this is useful for the same reason - if you render something with path tracing, you don’t want to expose it and throw away information that happens to get clamped to white. You want to save out the physical units and then simulate the exposure part later, so you don’t have to re-render.

So that’s all to say: the concept of HDR isn’t hacky at all, it’s closer to physics, but that can make it a bit harder to use and understand. Others have pointed out that productized HDR can be a confusing array of marketing mumbo jumbo, and that’s true, but not because HDR is messed up; that’s just a thing companies tend to do to consumers when dealing with science and technology.

I was introduced to HDR image formats in college while studying physically based rendering, and the first HDR image format I remember was Greg Ward’s .hdr format, which is clever: an 8-bit mantissa per channel plus a shared 8-bit exponent. The tradeoff of sharing the exponent is that if, say, green is way brighter than the other channels, you can’t see the dark detail in red & blue.
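A rough Python sketch of the shared-exponent idea; this follows the spirit of Ward's RGBE encoding but is not his exact code (the rounding in particular is simplified):

```python
import math

def float_to_rgbe(r, g, b):
    """Pack three linear floats into 8-bit mantissas plus a shared 8-bit exponent."""
    m = max(r, g, b)
    if m < 1e-32:
        return (0, 0, 0, 0)                  # conventional all-zero black pixel
    e = math.frexp(m)[1]                     # exponent such that m / 2^e is in [0.5, 1)
    scale = 256.0 / math.ldexp(1.0, e)       # 256 * 2^-e; brightest channel maps near 255
    return (int(r * scale), int(g * scale), int(b * scale), e + 128)

def rgbe_to_float(r, g, b, e):
    """Inverse: recover (approximate) linear floats from an RGBE pixel."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = math.ldexp(1.0, e - 128 - 8)         # 2^(e-128) / 256
    return (r * f, g * f, b * f)

# The brightest channel keeps its full 8 bits of precision; dim channels are
# forced to share its exponent, so their low bits are lost. That is exactly
# the "can't see the dark detail in red & blue" tradeoff described above.
pixel = float_to_rgbe(3.0, 0.1, 0.02)
print(rgbe_to_float(*pixel))
```

Four bytes per pixel buys roughly 76 orders of magnitude of dynamic range, which is why the format held up so well for storing physical radiance values.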