
What is HDR, anyway?

(www.lux.camera)
791 points | _kush | 1 comment
dahart ◴[] No.43986653[source]
It seems like a mistake to lump HDR capture, HDR formats and HDR display together; these are very different things. The claim that Ansel Adams used HDR is very likely to cause confusion, and isn’t particularly accurate.

We’ve had HDR formats and HDR capture and edit workflows since long before HDR displays. The big benefit of HDR capture & formats is that your “negative” doesn’t clip super bright colors and doesn’t lose color resolution in super dark ones. As a photographer, with HDR you can re-expose the image when you display/print it, which previously wasn’t possible: once you took a photo, if you over-exposed or under-exposed it, you were stuck with what you got. Capturing HDR gives the photographer one extra degree of freedom, allowing them to adjust exposure after the fact.

Ansel Adams wasn’t using HDR in the sense we’re talking about; he was just really good at capturing the right exposure for his medium without needing to adjust it later. There is a very valid argument for doing the work up-front to capture what you’re after, but ignoring that for a moment, it is simply not possible to re-expose Adams’ negatives to reveal detail he didn’t capture. That’s why he wasn’t using HDR, and why saying he was will only further muddy the water.
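A rough sketch of the re-exposure freedom being described, with made-up luminance values (nothing here comes from the article): an SDR-style file bakes in the exposure and clips highlights, while an HDR-style float file keeps them, so only the latter can be pushed down later to recover highlight separation.

```python
import numpy as np

# Hypothetical linear scene luminances, arbitrary units (illustrative only).
scene = np.array([0.02, 0.5, 1.0, 4.0, 16.0])

# SDR-style "negative": exposure is baked in, highlights clip at 1.0.
sdr_negative = np.clip(scene, 0.0, 1.0)

# HDR-style "negative": unclipped floating-point values are kept.
hdr_negative = scene.copy()

# Re-expose four stops down after the fact (divide by 2**4 = 16),
# then clip to the output range as any display/print eventually must.
sdr_reexposed = np.clip(sdr_negative / 16.0, 0.0, 1.0)
hdr_reexposed = np.clip(hdr_negative / 16.0, 0.0, 1.0)

# SDR lost the highlights at capture: 4.0 and 16.0 both became 1.0,
# so they are indistinguishable after re-exposure.
# The HDR negative still separates them (0.25 vs 1.0).
```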

replies(10): >>43986960 #>>43986994 #>>43987319 #>>43987388 #>>43987923 #>>43988060 #>>43988406 #>>43990585 #>>43991525 #>>43992834 #
arghwhat ◴[] No.43987319[source]
Arguably, even considering HDR a distinct thing is itself weird and inaccurate.

All mediums have a range, and they've never all matched. Sometimes we've tried to calibrate things to match, but anyone watching SDR content for the past many years probably didn't do so on a color- and brightness-calibrated screen - a calibrated screen wouldn't even allow you to have a brightness slider.

HDR on monitors is about communicating content brightness and monitor capabilities, but then you face the question of whether to clip the highlights or remap the range when the content is mastered for 4000 nits but your monitor manages 1000-1500, and only in a small window.
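The two options mentioned can be sketched as follows. This is an illustrative toy, not any standard's actual tone-mapping curve; the knee position and the Reinhard-style shoulder are assumptions I've picked for the example.

```python
def clip_highlights(nits, peak=1000.0):
    # Option 1: hard clip. Everything mastered above the panel's
    # peak collapses to the same value, losing highlight detail.
    return min(nits, peak)

def soft_rolloff(nits, peak=1000.0, knee=0.6):
    # Option 2: pass values below a knee through unchanged, then
    # compress the rest with a Reinhard-style shoulder that
    # asymptotically approaches `peak`, so a 4000-nit master still
    # keeps distinct (if dimmer) highlights on a 1000-nit panel.
    k = knee * peak
    if nits <= k:
        return nits
    excess = nits - k
    headroom = peak - k
    return k + headroom * excess / (excess + headroom)
```

With these assumed numbers, clipping maps 2000 and 4000 nits to the same 1000, while the rolloff keeps them distinguishable below the peak.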

replies(4): >>43987784 #>>43989470 #>>43989947 #>>43992895 #
theshackleford ◴[] No.43989947[source]
> but your monitor manages 1000-1500 and only in a small window.

Owning a display that can do 1300+ nits sustained across a 100% window has been the biggest display upgrade I think I have ever had. It's given me a tolerance for LCD, a technology I've hated since the death of CRTs, and turned me away from OLED.

There was a time I would have said I'd never own a non-OLED display again. But a capable HDR display changed that logic in a big way.

Too bad the motion resolution on it, especially compared to OLED, is meh. Again, at one point motion was the most important aspect to me (it's why I still own CRTs), but this level of HDR is transformative, for lack of a better word.

replies(2): >>43990654 #>>43992760 #
arghwhat ◴[] No.43992760[source]
Motion resolution? Do you mean the pixel response time?

CRTs technically have quite a few artifacts in this area, but as content displayed on CRTs tends to be built for CRTs, this is less of an issue, and in many cases even required. The input is expecting specific distortions and effects from scanlines and phosphor, which a "perfect" display wouldn't exhibit...

The aggressive OLED ABL (automatic brightness limiting) is simply a thermal issue. It can be mitigated with thermal design in smaller devices, and anything that increases efficiency (be it micro lens arrays, stacked "tandem" panels, quantum dots, alternative emitter technology) will lower the thermal load and increase the max full panel brightness.

(LCD with zone dimming would also be able to pull this trick to get even brighter zones, but because the base brightness is high enough it doesn't bother.)
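The thermal argument can be made concrete with a toy power-budget model. All numbers here are invented for illustration (not real panel specs): the lit window's brightness is capped either by the emitters themselves or by the panel's total power/thermal budget, and improving efficiency raises the full-window cap directly.

```python
def abl_limit_nits(window_fraction, power_budget_w=150.0,
                   panel_area_m2=0.6, w_per_nit_m2=0.25,
                   emitter_peak_nits=1500.0):
    # Illustrative ABL model: the smaller the lit window, the less
    # power it draws at a given brightness, so the thermal cap rises.
    lit_area = panel_area_m2 * window_fraction
    thermal_cap = power_budget_w / (lit_area * w_per_nit_m2)
    return min(emitter_peak_nits, thermal_cap)

# Full-screen white is thermally limited (~1000 nits with these numbers);
# a 10% window hits the emitter limit (1500 nits) instead. Halving the
# watts-per-nit (e.g. a more efficient emitter stack) lets even the full
# window reach the emitter limit - which is the point being made above.
```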

replies(1): >>43998984 #
theshackleford ◴[] No.43998984[source]
> Motion resolution? Do you mean the pixel response time?

I indeed meant motion resolution, which pixel response time only partially affects. It's about how clearly a display shows motion, unlike static resolution, which realistically only describes a still image. Even with fast pixels, sample-and-hold displays blur motion unless the framerate and refresh rate are high, or BFI/strobing is used. This blur immediately lowers perceived resolution the moment anything moves on screen.
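The sample-and-hold blur described here follows from simple arithmetic: the eye tracks the moving object while each frame sits still, so the image smears across the retina by (speed × hold time) pixels. A minimal sketch, with example speeds I've chosen for illustration:

```python
def hold_blur_px(speed_px_per_s, refresh_hz, persistence=1.0):
    # Smear width on a sample-and-hold display: the frame is lit for
    # persistence/refresh_hz seconds while the eye tracks the motion.
    # persistence < 1.0 models BFI/strobing (frame lit only part of
    # the refresh), which is why strobing restores motion resolution.
    return speed_px_per_s * persistence / refresh_hz

# For a 960 px/s pan: 16 px of smear at 60 Hz full persistence,
# 4 px at 240 Hz, and also 4 px at 60 Hz with 25% persistence -
# higher refresh and strobing attack the same blur.
```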

> The input is expecting specific distortions and effects from scanlines and phosphor, which a "perfect" display wouldn't exhibit...

That's true for many CRT purists, but it's not a huge deal for me personally. My focus is motion performance. If LCD/OLED matched CRT motion at the same refresh rate, I'd drop CRT in a heartbeat, slap on a CRT shader, and call it a day. Heresy to many CRT enthusiasts.

Ironically, this is an area in which I feel we are getting CLOSE enough with the new higher-refresh OLEDs for non-HDR retro content, in combination with: https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks... (which will hopefully continue to improve.)

> The aggressive OLED ABL is simply a thermal issue.

Theoretically, yes, and there's been progress, but it's still unsolved in practice. If someone shipped an OLED twice as thick and full of fans and heatsinks, I'd buy it tomorrow. But that's not what the market wants, so obviously it's not what they make.

> It can be mitigated with thermal design in smaller devices, and anything that increases efficiency (be it micro lens arrays, stacked "tandem" panels, quantum dots, alternative emitter technology) will lower the thermal load and increase the max full panel brightness.

Sure, in theory. But so far the improvements (like QD-OLED or MLA) haven't gone far enough; I already own panels using these. Beyond that, much of the tech isn't in the display types I care about, or isn't ready yet. Which is a pity, because the tandem-based displays I have seen in use are really decent.

That said, the latest G5 WOLEDs are the first I'd call acceptable for HDR at high APL for my preferences, with very decent real-scene brightness, at least in film. Sadly, I doubt we'll see comparable performance in PC monitors until many years down the track, and monitors are my preference.