
What is HDR, anyway?

(www.lux.camera)
789 points by _kush | 2 comments
dahart No.43986653
It seems like a mistake to lump HDR capture, HDR formats and HDR display together; these are very different things. The claim that Ansel Adams used HDR is super likely to cause confusion, and isn’t particularly accurate.

We’ve had HDR formats and HDR capture and edit workflows since long before HDR displays. The big benefit of HDR capture & formats is that your “negative” doesn’t clip super bright colors and doesn’t lose color resolution in super dark ones. As a photographer, with HDR you can re-expose the image when you display/print it, which previously wasn’t possible: once you took a photo, if you over- or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one extra degree of freedom, allowing them to adjust exposure after the fact. Ansel Adams wasn’t using HDR in the sense we’re talking about; he was just really good at capturing the right exposure for his medium without needing to adjust it later. There is a very valid argument to be made for doing the work up-front to capture what you’re after, but ignoring that for a moment, it is simply not possible to re-expose Adams’ negatives to reveal color detail he didn’t capture. That’s why he wasn’t using HDR, and why saying he was will only further muddy the water.
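To make that concrete, here is a minimal sketch (made-up sample values, numpy only for brevity) of what re-exposing float HDR data buys you over clipped LDR data:

    import numpy as np

    # Scene-linear radiance samples; 2.5 is brighter than display white.
    scene = np.array([0.02, 0.18, 0.75, 2.5])

    # An LDR capture bakes the exposure in: everything above 1.0 clips.
    ldr = np.clip(scene, 0.0, 1.0)

    def expose(linear, stops):
        # Re-expose after the fact, then quantize to the display range.
        return np.clip(linear * 2.0 ** stops, 0.0, 1.0)

    print(expose(scene, -2))  # the 2.5 highlight comes back as 0.625
    print(expose(ldr, -2))    # the clipped highlight is stuck at 0.25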

replies(10): >>43986960 #>>43986994 #>>43987319 #>>43987388 #>>43987923 #>>43988060 #>>43988406 #>>43990585 #>>43991525 #>>43992834 #
munificent No.43987388
> The claim that Ansel Adams used HDR is super likely to cause confusion

That isn't what the article claims. It says:

"Ansel Adams, one of the most revered photographers of the 20th century, was a master at capturing dramatic, high dynamic range scenes."

"Use HDR" (your term) is vague to the point of not meaning much of anything, but the article is clear that Adams was capturing scenes that had a high dynamic range, which is objectively true.

replies(2): >>43987469 #>>43988137 #
dahart No.43987469
Literally the sentence preceding the one you quoted is “What if I told you that analog photographers captured HDR as far back as 1857?”.
replies(2): >>43987485 #>>43988031 #
zymhan No.43987485
And that quote specifically does not "lump HDR capture, HDR formats and HDR display together".

It is directly addressing capture.

replies(1): >>43987587 #
dahart No.43987587
Correct. I didn’t say that sentence was the source of the conflation; I said it was the source of the Ansel Adams problem. There are other parts that mix together capture, formats, and display.

Edit: and btw, I am objecting to calling film capture “HDR”; I don’t think that helps define HDR, nor does it accurately reflect the history of the term.

replies(1): >>43988126 #
pavlov No.43988126
That’s a strange claim because the first digital HDR capture devices were film scanners (for example the Cineon equipment used by the motion picture industry in the 1990s).

Film provided a higher dynamic range than digital sensors, and professionals wanted to capture that for image editing.

Sure, it wasn’t terribly deep HDR by today’s standards. Cineon used 10 bits per channel with the white point at coding value 685 (and a log color space). That’s still a lot more range and superwhite latitude than you got with standard 8-bpc YUV video.
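For reference, a sketch of the commonly published Cineon 10-bit log decode (white at code 685, black at code 95, 0.002 negative density per code value, 0.6 gamma; exact constants and black handling varied between tools):

    # Decode a Cineon 10-bit code value to scene-linear light.
    # Real pipelines also subtract/normalize the black point (code 95).
    def cineon_to_linear(code):
        return 10.0 ** ((code - 685) * 0.002 / 0.6)

    print(cineon_to_linear(685))   # 1.0: reference white
    print(cineon_to_linear(1023))  # ~13.4: superwhite headroom
    # 8-bpc studio video pins white at code 235, leaving only a
    # sliver of superwhite latitude in codes 236..254.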

replies(1): >>43988238 #
dahart No.43988238
They didn’t call that “HDR” at the time, and it wasn’t based on the idea of recording radiance or other absolute physical units.

I’m certain physicists had high-dynamic-range digital cameras before Cineon, and they were working in absolute physical units. That would be a stronger example.

You bring up an important point that is completely lost in the HDR discussion: this is about color resolution at least as much as it’s about range, if not more so. I can use 10 bits for a [0..1] range just as easily as I can use 4 bits to represent quantized values from 0 to 10^9. Talking about the range of a scene captured leaves out most of the story, and all of the important parts. We’ve had outdoor photography, high quality films, and the ability to control exposure for a long time, and none of that explains what “HDR” is.
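The independence of the two is a two-liner:

    # 10 bits spread over [0, 1]: small range, fine quantization steps.
    print(1.0 / (2**10 - 1))   # ~0.00098 per step
    # 4 bits spread over [0, 1e9]: huge range, uselessly coarse steps.
    print(1e9 / (2**4 - 1))    # ~6.7e7 per step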

replies(2): >>43988400 #>>43988428 #
pavlov No.43988400
It certainly was called HDR when those Cineon files were processed in a linear light workflow. And film was the only capture source available that could provide sufficient dynamic range, so IMO that makes it “HDR”.

But I agree that the term is such a wide umbrella that almost anything qualifies. Fifteen years ago you could do a bit of superwhite glow and tone mapping on 8-bpc images and people called that look HDR.
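For the curious, the classic global Reinhard curve is one example of the kind of tone mapping meant here (not necessarily what any particular tool used); it compresses superwhites into [0, 1) so an ordinary 8-bpc image can fake the look:

    def reinhard(x):
        # Maps [0, inf) into [0, 1): compresses highlights, never clips.
        return x / (1.0 + x)

    for v in (0.18, 1.0, 4.0, 10.0):
        print(v, "->", round(reinhard(v), 3))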

replies(1): >>43988639 #
dahart No.43988639
Do you have any links from 1990ish that show use of “HDR”? I am interested in when “HDR” became a phrase people used. I believe I remember hearing it first around 1996 or 97, but it may have started earlier. It was certainly common by 2001. I don’t see it used as a term or an acronym in the Cineon docs from 1995, though they do talk about log and linear spaces and about limiting the dynamic range when converting. The Cineon scanner predates sRGB, and used gamma 1.7. https://dotcsw.com/doc/cineon1.pdf

This 10-bit scanner gave you headroom of about 30% above white. So yeah, it qualifies as a type of high dynamic range when compared to 8-bit/channel RGB, but on the other hand, a range of [0 .. 1.3] isn’t exactly in the spirit of what “HDR” stands for. The term implicitly means a lot more than 1.0, not just a little. And again, the people developing HDR, like Greg Ward and Paul Debevec, were arguing for absolute units such as luminance, which the Cineon scanner does not provide.
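The 30% figure is back-of-envelope arithmetic, assuming the codes decode to linear with that 1/1.7 power (a pure log decode, as described upthread, would give far more headroom):

    # Headroom above reference white under a gamma-1.7 reading of the codes.
    white, code_max = 685, 1023
    print((code_max / white) ** (1 / 1.7))  # ~1.27, roughly 30% above white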