276 points by samwillis | 14 comments
1. radicality ◴[] No.41082131[source]
Kinda related, but does anyone have a good set of links to help understand what HDR actually is? Whenever I’ve tried in the past, I always got lost; none of it felt intuitive.

There are so many concepts: color spaces, transfer functions, HDR vs Apple’s XDR, HLG vs Dolby Vision, mastering displays, max brightness vs peak brightness, all the different HDR monitor certification levels, 8-bit vs 10-bit, “full” vs “video” levels when recording video, etc.

Example use case: I want to play iPhone-recorded videos in mpv on my MacBook. There are hundreds of knobs to set, and while I can muck around with them and get it looking close-ish to how the file looks when played in QuickTime/Finder, I still have no idea what any of these settings are doing.

replies(4): >>41082239 #>>41084195 #>>41084717 #>>41085674 #
2. wongarsu ◴[] No.41082239[source]
HDR is whatever marketing wants it to be.

Originally it's just about being able to show both really dark and really bright colors in the same image. That's easy if each pixel is an individual LED, but very hard in LCD monitors, where there's one big backlight and the pixels are just dimmable filters in front of it. On the capture side, it's the ability to record really bright and really dark spots in the same shot, something camera sensors are much worse at than our eyes, though you can pull some tricks.

Once you have that ability you notice that 8 bits of brightness information isn't much, so you go to 10 or 16 bits. Your transfer function ("gamma") also plays a role: it's the curve that maps linear light values to the nonlinear code values you actually store.

And of course the people who care about HDR overlap heavily with the people who care about color, so that's where color spaces, and certifying and calibrating monitors to match those color spaces, come in. It's really adjacent, but often rolled in for convenience.
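
To make the gamma point concrete, here's a minimal sketch of the sRGB transfer function in Python (the curve constants are from the sRGB spec; the printed examples are toy numbers):

    def srgb_encode(linear):
        # Piecewise sRGB transfer function: linear light in [0,1] -> signal in [0,1].
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * linear ** (1 / 2.4) - 0.055

    # Applying this curve before quantizing to 8 bits spends more of the
    # 256 codes on dark tones, where banding is most visible:
    print(round(srgb_encode(0.01) * 255))  # ~25 codes cover the bottom 1% of linear light
    print(255 * 0.01)                      # a linear encoding would give that range ~2.5 codes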

replies(2): >>41082411 #>>41088505 #
3. radicality ◴[] No.41082411[source]
More bits to store more color/brightness etc makes sense.

I think my main confusion has usually been that it all feels like some kind of a… hack? Suppose I set my MacBook screen to max brightness and open a normal “white” PNG. It looks fine, and you’d think “well, the display is at max brightness and the PNG is filled with white”, so a fair conclusion would be that’s the whitest/brightest the screen goes. But then you open another PNG with a more special “whiter white”, and suddenly you see your screen can actually go brighter! So you get thoughts like “why is this white brighter”, “how do I trigger it”, “what are the actual limits of my screen”, “is this all some separate hacky code path”, “how come I only see it in images/videos and not UI elements”, “is it possible to make a native Mac UI with that brightness”.

In any case, thanks for the answer. I might be overthinking it and there’s probably lots of historical/legacy reasons for the way things are with hdr.

replies(5): >>41082551 #>>41082592 #>>41082701 #>>41083745 #>>41084278 #
4. wongarsu ◴[] No.41082551{3}[source]
> there’s probably lots of historical/legacy reasons for the way things are with hdr

That's pretty much it. If you use an HDR TV it will usually work the way you describe: it displays the same white for a normal white PNG and an "even whiter" HDR PNG.

Apple's decision makes sense if you imagine SDR (i.e. not-HDR) images as HDR images clipped to some SDR range in the middle of the HDR range (leading to lots of over- and underexposure in the SDR image). If you then show them side by side, of course the whitest white in the HDR range is whiter than the whitest white in the SDR image. That's a crude simplification of how images work, but it makes for a great demo: HDR images really pop and look visually better. If you stretched everything to the same brightness range, the HDR images wouldn't be nearly as impressive: just more detail and less color banding. The marketing people wouldn't like that.
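
As a toy model of that clipping (made-up numbers, not what any OS actually does):

    # Scene luminance in arbitrary linear units: shadow ... diffuse white ... highlights.
    scene = [0.001, 0.2, 1.0, 4.0, 12.0]

    # "SDR" keeps only the middle of the range: everything above diffuse
    # white (1.0 here) clips to the same white.
    sdr = [min(v, 1.0) for v in scene]   # -> [0.001, 0.2, 1.0, 1.0, 1.0]

    # "HDR" keeps the highlights, so shown side by side its brightest
    # pixels render brighter than SDR's clipped white.
    hdr = scene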

5. crazygringo ◴[] No.41082592{3}[source]
It is all extremely hacky.

Because HDR allows us to encode brightnesses that virtually no consumer display can actually reproduce.

So deciding how to show those on any given display, on a given OS, in a given app, comes down to whatever "hacky", totally non-standardized tradeoffs that display+OS+app combination decides to make. And they're all different.

It's a complete mess. I'm strongly of the opinion that HDR made a fundamental mistake in trying to design for "ideal" hardware that nobody has, and then leaving "degraded" operation to be implementation-specific.

It's a complete design failure that playing HDR content on different apps/devices results in output that is often too dark and often has a telltale green tint. It's ironic that in practice, something meant to enable higher brightness and greater color accuracy has resulted in darker images and color that varies from slightly wrong to totally wrong.

6. duskwuff ◴[] No.41082701{3}[source]
> So you get thoughts like [...] “what are the actual limits of my screen” [...]

Some of the limitations, at least in Apple's displays, are thermal! The backlight cannot run at full brightness continuously across the full display; it can only hit its peak brightness (1600 nits) in a small area, or for a short time.

7. suzumer ◴[] No.41083745{3}[source]
While one commenter had it somewhat right that HDR has to do with how bright/dark an image can be, the main thing HDR images specify is how far ABOVE reference white you can display. With sRGB, 100 percent on all channels is 100 percent white (the brightness of a perfect Lambertian reflector). Rec. 2100 together with Rec. 2408 specifies modern HDR encoding, where 203 nits is 100 percent white, and anything above that is brighter content (light sources, specular reflections, etc.). So if a white image encoded in SDR looks dimmer than HDR for non-specular detail, that is probably an encoding or decoding error.
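
For the curious, the PQ curve (SMPTE ST 2084) that Rec. 2100 uses maps encoded signal values to absolute nits. A rough sketch, with constants from the spec:

    # PQ (ST 2084) EOTF: encoded signal e in [0,1] -> luminance in nits.
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    def pq_to_nits(e):
        p = e ** (1 / m2)
        return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

    print(pq_to_nits(0.58))  # ~203 nits: reference white sits near signal level 0.58
    print(pq_to_nits(1.0))   # 10000 nits: the ceiling of the format
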
8. ttoinou ◴[] No.41084195[source]
You can start by understanding the physics behind high dynamic range. Any real-world analog quantity can have a tremendous dynamic range; it's not just light: distances, sound, weight, time, frequencies, etc. We always need to reduce/compress/limit/saturate dynamic range when converting to digital values, and we always need to expand it back when reconverting to an analog signal.
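
A back-of-the-envelope illustration with round numbers (the 100,000:1 scene contrast is just a commonly cited ballpark):

    import math

    # A sunlit outdoor scene can span roughly 100,000:1 between highlight and shadow:
    print(math.log2(100_000))  # ~16.6 stops of scene dynamic range

    # 8 bits of *linear* code values only span 255:1 before hitting zero:
    print(math.log2(255))      # ~8 stops

    # Hence the compression (gamma, PQ, log curves) on the way into digital,
    # and the expansion back out on the way to the display.
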
9. dahart ◴[] No.41084278{3}[source]
One motivation for HDR is having absolute physical units, such as luminance in candelas per square meter. You can imagine that might be a floating point value and that 8 bits per channel might not be enough.

The problem you’re describing is that color brightness is relative, but if you want physical units and you have a calibrated display then adjusting your brightness is not allowed, because it would break the calibration.

Another reason for HDR is to allow you to change the “exposure” of an image. Imagine you take a photo of the sun with a camera. It clips to white. Most of the time even the whole sky clips to white, and clouds too. With a film camera, once the film is exposed, that’s it: you can’t see the sun or clouds, because they got clamped to white. But what if you had a special camera that could capture any value, bright or dark, and you could decide later which parts are white and which are black? That’s what HDR gives you - lots of range, and it’s not necessarily all meant to be visible.

In computer graphics, this is useful for the same reason - if you render something with path tracing, you don’t want to expose it and throw away information that happens to get clamped to white. You want to save out the physical units and then simulate the exposure part later, so you don’t have to re-render.
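
A minimal sketch of that “decide the exposure later” idea (the radiance values are made up):

    import numpy as np

    # Linear HDR pixel values in physical-ish units; nothing is clamped yet.
    radiance = np.array([0.02, 0.5, 3.0, 80.0, 5000.0])  # deep shadow ... the sun

    def expose(hdr, stops):
        # Simulated camera exposure: scale, then clamp to the displayable [0,1].
        return np.clip(hdr * 2.0 ** stops, 0.0, 1.0)

    print(expose(radiance, 0))    # the three brightest values all clip to 1.0 (white)
    print(expose(radiance, -13))  # stopped down 13 stops: the sun (~0.61) has detail again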

So that’s all to say- the concept of HDR isn’t hacky at all, it’s closer to physics, but that can make it a bit harder to use and understand. Others have pointed out that productized HDR can be a confusing array of marketing mumbo jumbo, and that’s true, but not because HDR is messed up, that’s just a thing companies tend to do to consumers when dealing with science and technology.

I was introduced to HDR image formats in college while studying physically based rendering. The first HDR image format I remember is Greg Ward’s .hdr format, which is clever: an 8-bit mantissa per channel plus an 8-bit shared exponent. The shared exponent works because if, say, green is way brighter than the other channels, you can’t see the dark detail in red & blue anyway.
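
Roughly the classic Radiance encode/decode, transcribed to Python from memory (treat it as a sketch):

    import math

    def float2rgbe(r, g, b):
        # One shared 8-bit exponent for all three channels (Radiance .hdr style).
        v = max(r, g, b)
        if v < 1e-32:
            return (0, 0, 0, 0)
        m, e = math.frexp(v)   # v == m * 2**e, with 0.5 <= m < 1
        scale = m * 256.0 / v
        return (int(r * scale), int(g * scale), int(b * scale), e + 128)

    def rgbe2float(r, g, b, e):
        if e == 0:
            return (0.0, 0.0, 0.0)
        f = math.ldexp(1.0, e - (128 + 8))
        return (r * f, g * f, b * f)

    # The trade-off mentioned above: a very bright green forces the shared
    # exponent up, crushing dark detail in the other channels.
    print(float2rgbe(0.01, 100.0, 0.02))  # -> (0, 200, 0, 135): red & blue detail lost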

10. ◴[] No.41084717[source]
11. jasomill ◴[] No.41085674[source]
A summary, with pictures:

https://cdn.theasc.com/curtis-clark-white-paper-on-hdr-asc.p...

To better understand the "knobs", consider opening up your iPhone videos in DaVinci Resolve[1] and playing around with the scopes and tools in the color panel.

[1] https://www.blackmagicdesign.com/products/davinciresolve/tra...

replies(1): >>41104234 #
12. bscphil ◴[] No.41088505[source]
> Originally it's just about being able to show both really dark and really bright colors.

Sort of. The primary significance of HDR is the ability to specify absolute luminance.

Good quality SDR displays were already very bright and already had high native dynamic range (a large ratio between the brightest white and darkest black). The issue was that the media specifications did not control the brightness in any way. So a 1.0 luminance pixel was just whatever the brightest value the display could show (usually tuned by a brightness setting). And a 0.0 luminance pixel was just the minimum brightness the display could show (unfortunately, usually also affected by the brightness setting thanks to backlighting).

What HDR fundamentally changes is not the brightness of displays or their dynamic range (some HDR displays are worse than some older SDR displays when it comes to dynamic range), but the fact that HDR media has absolute luminance. This means that creators can now make highlights (stars, explosions) close to the peak brightness of a bright display, while diffuse whites are now dimmer.
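
One way to picture the difference (a toy model, not any particular OS’s API):

    # SDR: media stores *relative* luminance; the display's brightness
    # setting decides what 1.0 means in nits.
    def sdr_pixel_nits(signal, display_max_nits):
        return signal * display_max_nits

    # HDR (PQ): media stores *absolute* luminance; reference white is 203
    # nits, and the display must cope with values it can't physically reach.
    def hdr_pixel_nits(requested_nits, display_peak_nits):
        return min(requested_nits, display_peak_nits)  # crude tone map: hard clip

    print(sdr_pixel_nits(1.0, 300))   # 300: what "white" means depends on the knob
    print(hdr_pixel_nits(1000, 600))  # 600: the display clips what it can't show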

Prior to HDR, a good bright display was just a calibrated display with its brightness incorrectly turned up too high, making everything bright. With HDR a good bright display is a display calibrated with correct brightness and the ability to show highlights and saturated colors with much more power than a normal white.

You're right about higher bit depths, though. Because HDR media describes a much wider dynamic range on a properly calibrated display (though not necessarily on a typical over-bright SDR display), it uses a different transfer curve to allocate bits more appropriately, and typically shows no banding with only 10 or 12 bits. https://en.wikipedia.org/wiki/Perceptual_quantizer

replies(1): >>41104124 #
13. radicality ◴[] No.41104124{3}[source]
Thanks for this, the part about “absolute luminance” is pretty helpful!

So what you’re saying is that with HDR, whoever creates the video encodes absolute brightness levels for some (all?) pixels, and an HDR-capable screen would be one with an API like ‘draw_pixel(x, y, color, absolute_luminance_in_nits)’, whereas an SDR screen would instead have ‘draw_pixel(x, y, color, relative_luminance)’ with relative_luminance in the range [0,1]?

I can imagine how the monitor brightness control would work in SDR, but what about HDR? If I lower the screen brightness and the screen is asked to draw a pixel at 900 nits, what happens? For a real use case, suppose I want to watch an HDR movie on my MacBook - should I switch the display mode to “HDR Video (P3-ST 2084)”, which is one of the presets that locks the display to a specific brightness?

14. radicality ◴[] No.41104234[source]
Thanks! Gonna read that doc.

This is actually how I ended up in this rabbit hole :) I wanted to learn Resolve to edit iPhone HDR videos, got DaVinci, played around, and realized the iPhone apparently uses Dolby Vision for HDR, for which you need to buy Studio, so I bought Studio… My first goal was to import a short video, have it look correct in DaVinci, then do a ‘no-op’ render and view the before/after videos side by side in QuickTime. Whatever I did, the render always looked either a bit different or completely different. But I’ll try again with all the new stuff I just learned about HDR, thanks!