
What is HDR, anyway?

(www.lux.camera)
791 points | _kush | 4 comments
dahart:
It seems like a mistake to lump HDR capture, HDR formats, and HDR display together; these are very different things. The claim that Ansel Adams used HDR is super likely to cause confusion, and isn't particularly accurate.

We've had HDR formats and HDR capture-and-edit workflows since long before HDR displays. The big benefit of HDR capture & formats is that your "negative" doesn't clip super bright colors and doesn't lose color resolution in super dark ones. As a photographer, with HDR you can re-expose the image when you display/print it, which previously wasn't possible: once you took a photo, if you over- or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one extra degree of freedom, letting them adjust exposure after the fact. Ansel Adams wasn't using HDR in the sense we're talking about; he was just really good at capturing the right exposure for his medium without needing to adjust it later. There is a very valid argument for doing the work up-front to capture what you're after, but ignoring that for a moment, it is simply not possible to re-expose Adams' negatives to reveal detail he didn't capture. That's why he wasn't using HDR, and why saying he was will only further muddy the water.
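
To make that extra degree of freedom concrete, here's a toy sketch (Python/NumPy; the scene values are made up for illustration):

    import numpy as np

    scene = np.array([0.01, 0.5, 2.0, 8.0])  # linear scene luminances, arbitrary units

    # SDR-style capture: everything above 1.0 is clipped at capture time.
    sdr_negative = np.clip(scene, 0.0, 1.0)

    # HDR-style capture: the full range survives in the "negative".
    hdr_negative = scene.copy()

    def re_expose(img, stops):
        """Adjust exposure after the fact: scale linear values by 2**stops."""
        return np.clip(img * 2.0 ** stops, 0.0, 1.0)  # clip only at display time

    print(re_expose(hdr_negative, -3))  # -> [0.00125 0.0625 0.25 1.]
    print(re_expose(sdr_negative, -3))  # -> [0.00125 0.0625 0.125 0.125]

Pulling the exposure down three stops recovers distinct highlight values from the HDR negative, while the two brightest values in the clipped negative are gone for good.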

arghwhat:
Arguably, even considering HDR a distinct thing is itself weird and inaccurate.

All media have a range, and the ranges have never all matched. Sometimes we've tried to calibrate things so they do match, but anyone watching SDR content over the past many years probably wasn't doing so on a color- and brightness-calibrated screen; a setup that calibrated wouldn't even leave room for a brightness slider.

HDR on monitors is about communicating content brightness and monitor capabilities, but that leaves the question of whether to clip the highlights or remap the whole range when the content is mastered for 4000 nits and your monitor manages only 1000-1500, and only in a small window.
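
As a toy illustration of those two options (Python; the knee placement and the linear rolloff are arbitrary choices here, real displays use smoother curves):

    MASTER_PEAK = 4000.0   # nits the content was mastered for
    DISPLAY_PEAK = 1000.0  # nits the monitor can actually produce

    def hard_clip(nits):
        # Option 1: reproduce what the display can, discard the rest.
        return min(nits, DISPLAY_PEAK)

    def soft_rolloff(nits, knee=0.75):
        # Option 2: leave values below the knee alone, then compress the
        # rest of the mastered range into the remaining display headroom.
        start = knee * DISPLAY_PEAK
        if nits <= start:
            return nits
        t = (nits - start) / (MASTER_PEAK - start)  # 0..1 over the compressed span
        return start + t * (DISPLAY_PEAK - start)

    for n in (200.0, 900.0, 2000.0, 4000.0):
        print(n, hard_clip(n), round(soft_rolloff(n), 1))

Clipping keeps mid-tones exact but crushes everything above 1000 nits to the same white; the rolloff keeps highlight detail distinguishable at the cost of dimming it.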

dahart:
This! Yes, I think you're absolutely right. The term "HDR" is partly an artifact of how digital image formats evolved, and it only really makes sense relative to a time when the most popular image formats and the most common displays were not very sophisticated about color.

That said, there's one important part that often gets lost: one of the ideas behind HDR, sometimes, is to capture absolute values in physical units rather than relative brightness. That's a capability film, paper, and TVs never had. Some new displays are getting absolute-brightness features, but historically most media have displayed relative color values.
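
PQ (SMPTE ST 2084) is the usual example of an absolute encoding: a code value means a specific luminance in nits. A quick sketch of the decode (Python; constants from the spec):

    # SMPTE ST 2084 (PQ) EOTF constants.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_to_nits(code):
        """Map a PQ code value in [0, 1] to absolute luminance in cd/m^2."""
        p = code ** (1 / m2)
        return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

    # Unlike gamma-encoded SDR, these numbers denote physical luminances:
    print(round(pq_to_nits(0.5), 1))  # ~92.2 nits
    print(round(pq_to_nits(1.0), 1))  # 10000.0 nits, the format's ceiling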

arghwhat:
Absolute is a funny notion too. From the perspective of human visual perception, an absolute brightness only matters if the entire viewing environment is controlled to the same absolute values. Visual perception is highly contextual, and the screen is not the only thing we see.

It's no fun being unable to watch dark scenes during the day in a living room, nor having your retinas vaporized because the ambient environment went dark in the meantime. People want a good viewing experience in the environment they have, one that is perceptually similar to what the content intended, and that is not always the same as reproducing the exact photons the director's mastering monitor sent toward their eyeballs at the time of production.

dahart:
Yep, absolutely! ;)

This brings up a bunch of good points, and it tracks with what I was trying to say about conflating HDR processing with HDR display. But do keep in mind that even when you have absolute-value images, that doesn't imply anything about how you display them. You can get large benefits from an HDR workflow even when your output or display is low dynamic range. Assume there will be some tone-mapping step, and that how you map tones depends on the display medium, its capabilities, and the context and environment of the display. Using the term "HDR" shouldn't imply any mismatch or disconnect in the viewing environment; it only did so in the article because the article wasn't very careful about its terms and definitions.
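
A toy sketch of that idea, where the mapping is a function of the display and the room rather than the image alone (Python; the curve and the ambient term are made-up stand-ins for real adaptive tone mappers such as the BT.2390 EETF):

    def tone_map(scene_nits, display_peak, ambient_nits):
        # Crude surround compensation: lift shadows a bit in bright rooms,
        # then compress toward the display's actual peak (Reinhard-style).
        lifted = scene_nits + 0.05 * ambient_nits
        return display_peak * lifted / (lifted + display_peak)

    highlight = 2000.0  # one pixel's mastered luminance, in nits
    print(tone_map(highlight, display_peak=1000.0, ambient_nits=5.0))    # dim room, bright panel
    print(tone_map(highlight, display_peak=300.0,  ambient_nits=500.0))  # bright room, dim panel

The same absolute-valued pixel comes out differently on each setup, which is the point: the image data doesn't change, the mapping does.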

tshaddox:
Indeed. For a movie scene depicting the sky with the Sun in it, you probably wouldn't want your TV to reach the Sun's actual brightness. You might want it to get significantly brighter than it is during the rest of the movie, to achieve something like the effect of the Sun catching your eye.

Of course, the same thing goes for audio in movies. You probably want a gunshot or explosion to sound loud and even be slightly shocking, but you probably don't want it to be as loud as a real gunshot or explosion would be from the depicted distance.

The difference is that for 3+ decades, ubiquitous audio formats (like 16-bit PCM on audio CDs and DVDs) have provided far more dynamic range than is comfortably usable in normal listening environments. So we're very familiar with audio being mastered with a much smaller dynamic range than the medium supports.
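
The arithmetic behind that comparison (Python; the listening-environment figures are rough rules of thumb):

    import math

    # Each bit of PCM resolution buys about 6 dB of dynamic range:
    print(20 * math.log10(2 ** 16))  # ~96.3 dB for 16-bit audio

    # A quiet living room sits around 30 dB SPL, and sustained listening
    # above ~100 dB SPL is unpleasant (and unsafe), so a comfortable home
    # setup has well under 96 dB of usable range between the noise floor
    # and the pain threshold. That's why film mixes compress it.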