
What is HDR, anyway?

(www.lux.camera)
789 points | _kush | 4 comments
mxfh ◴[] No.43984652[source]
Does anyone else find the hubris in the first paragraph's writing as off-putting as I do?

"we finally explain what HDR actually means"

Then it spends two-thirds of the article on a tone-mapping expedition, only to ignore the elephant in the room: the almost complete absence of predictable color management in consumer-grade digital environments.

UIs are hardly ever tested in HDR: I don't want my subtitles to burn out my eyes on an actual HDR display.

This is where you, the consumer, are as vulnerable to light in a properly dark movie-watching environment as when the curtains are thrown open on a bright summer morning. (That brightness abuse by content is actually discussed in the article.)

Dolby Vision and Apple have the lead here as closed platforms; on the web it's simply not predictably possible yet.

From my impression, the best hope is the efforts of the Color on the Web Community Group.

https://github.com/w3c/ColorWeb-CG

sandofsky ◴[] No.43985299[source]
> Does anyone else find the hubris in the first paragraph writing as off-putting as I do?

> "we finally explain what HDR actually means"

No. Because it's written for the many casual photographers we've spoken with who are confused and asked for an explainer.

> Then spends 2/3rds of the article on a tone mapping expedition, only to not address the elephant in the room, that is the almost complete absence of predictable color management in consumer-grade digital environments.

That's because this post is about HDR and not color management, which is a different topic.

mxfh ◴[] No.43986813[source]
Maybe my response was part of the broader HDR symptom—that the acronym is overloaded with different meanings depending on where you're coming from.

On the HN frontpage, people are likely thinking of one of at least three things:

HDR as display tech (hardware)

HDR as wide gamut data format (content)

HDR as tone mapping (processing)

...
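To make the "processing" sense concrete, here's a minimal sketch of a classic global tone-mapping curve, Reinhard's L/(1+L) operator; the function name and exposure parameter are my own illustration, not something from the article:

```python
import numpy as np

def reinhard_tonemap(radiance, exposure=1.0):
    """Compress linear HDR radiance (unbounded, >= 0) into [0, 1)
    using Reinhard's global operator: L_out = L / (1 + L)."""
    scaled = np.asarray(radiance, dtype=float) * exposure
    return scaled / (1.0 + scaled)

# Inputs four stops apart (0.5 vs 8.0) both land in displayable range:
# 0.5 -> 1/3, 8.0 -> 8/9
print(reinhard_tonemap([0.5, 8.0]))
```

This is the global (per-pixel) flavor; the heavily stylized "Photomatix look" discussed below comes from local operators that vary the compression across the image.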

So when the first paragraph says we finally explain what HDR actually means, it set me off on the wrong foot: it comes across pretty strongly for a term that's notoriously context-dependent, especially in a blog post that reads like a general explainer rather than a direct Q&A response when it isn't reaching readers through your app's channels.

Then the follow-up, "The first HDR is the 'HDR mode' introduced to the iPhone camera in 2010," caused me to write the comment.

For people over 35 with even the faintest interest in photography, the first exposure to the HDR acronym probably didn't arrive with the iPhone in 2010; for them, HDR IS equivalent to Photomatix-style tone mapping, which took off around 2005, as even mentioned later. The ambiguity of the term is a given now. I think it's futile to insist on or police one meaning over the other in non-scientific, informal communication; just use more specific terminology.

So the correlation between what HDR means, or what sentiment it evokes, and age group and self-assessed photography skill might be something worthwhile to explore.

The post gets a lot better after that. That said, I really did enjoy the depth: the dive into classic dodge-and-burn and the linked YouTube piece. One explainer at a time makes sense, and tone mapping is a good place to start. Even tone mapping is fine in moderation :)

ddingus ◴[] No.43987243[source]
I took the post about the same way, and thought it excellent because of its depth.

Often we don't get that, and given this topic, plus my relative ignorance of it, I welcomed the post as written.

1. mxfh ◴[] No.43987666[source]
Just out of curiosity, since your profile suggests you're from an older cohort: do you actively remember the Photomatix tone-mapping era, or were you already old enough to see it as a passing fad, or was it a more niche thing than I remember?

Now I even remember that the 2005 HDR demo for HL2, Lost Coast, was a thing 20 years ago: https://bit-tech.net/previews/gaming/pc/hl2_hdr_overview/1/

2. ddingus ◴[] No.43998048[source]
I was old enough to see it as the passing fad it was.

Niche, style points first kind of thing for sure.

Meta: old enough that getting either an unintended new color, or an additional one visible on screen, while keeping the machine able to perform, was a big deal.

3. mxfh ◴[] No.43998929[source]
I missed the MDA/EGA/CGA/Hercules era and jumped right into glorious VGA. Only the startup options of some DOS games informed you of that drama in the mid-90s; otherwise I had no idea what any of it meant.
4. ddingus ◴[] No.44000341{3}[source]
It is a fun era NOW. I love the pre-VGA PC and earlier systems, like the Apple II and Atari. I'm building a CGA system with a K2 CPU to go hacking on, to see what was possible. I have, as many do, unfinished business :)

Back then it was fun at times, but it was also limiting in ways that are sometimes hard to fathom.

Things are crazy good now, BTW. Almost anything is a few clicks away. CRTs are old news, and panels are so damn good.