
What is HDR, anyway?

(www.lux.camera)
791 points _kush | 1 comments
aidenn0 ◴[] No.43985366[source]
> A big problem is that it costs the TV, Film, and Photography industries billions of dollars (and a bajillion hours of work) to upgrade their infrastructure. For context, it took well over a decade for HDTV to reach critical mass.

This is also true for consumers. I don't own a single 4k or HDR display. I probably won't own an HDR display until my TV dies, and I probably won't own a 4k display until I replace my work screen, at which point I'll also replace one of my home screens so I can remote into it without scaling.

replies(7): >>43985419 #>>43985522 #>>43985991 #>>43986618 #>>43986876 #>>43990252 #>>43994476 #
gwbas1c ◴[] No.43985991[source]
> I don't own a single 4k or HDR display

Don't feel like you have to. I bought a giant fancy TV with it, and even though it's impressive, it's kinda like ultra-hi-fi audio. I don't miss it when I watch the same show on one of my older TVs.

If you ever do get it, I suggest going for a TV that you watch with your full attention, and watching TV / movies in the dark. It's not very useful on a TV you might turn on while doing housework, but very useful when you're actively giving it your full attention.

replies(2): >>43986133 #>>43986713 #
anon7000 ◴[] No.43986713[source]
I totally love HDR on my OLED TV, and definitely miss it on others.

Like a lot of things, it’s weird how some people are more sensitive to visual changes. For example:

- At this point, I need 120hz displays. I can easily notice when my wife’s phone is in power saver mode at 60hz.

- 4k vs 1080p. This is certainly more subtle, but I definitely miss detail in lower res content.

- High bitrate. This is way more important than 4k vs 1080p or even HDR. It's so easy to tell when YouTube lowers the quality setting on me, or when a TV show is streaming at a crappy bitrate.

- HDR is tricky, because it relies completely on the content creator to do a good job producing HDR video. When done well, the image basically sparkles, water actually looks wet, parts of the image seem to glow… it looks so good.

I 100% miss this HDR when watching equivalent content on other displays. The problem is that a lot of content isn't produced to take advantage of it very well. The HDR 4k Blu-ray of several Harry Potter movies, for example, has extremely muted colors and dark scenes… so how is the image going to pop? I'm glad we're seeing more movies rely on bright colors and rich, contrasty color grading. There are so many old film restorations that look excellent in HDR because the original color grade had rich, detailed, contrasty colors.

On top of that, budget HDR implementations, ESPECIALLY in PC monitors, just don't get very bright, which makes their HDR basically useless. It's impossible to replicate the “shiny, wet look” of really good HDR water if the screen can't get bright enough to make it look shiny. Plus, the display needs to be selective about what gets bright, and cheap TVs don't have enough backlighting zones to do that very well.
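To make the zone point concrete, here's a toy model (my own sketch, not any manufacturer's actual algorithm) of full-array local dimming: each backlight zone is driven by the brightest pixel behind it, and the LCD layer compensates. With only a few zones, one small highlight drags a big chunk of the backlight up with it, so the panel can't be selective about what glows:

    import numpy as np

    def local_dimming(luma, zones_y, zones_x, peak_nits=1000.0):
        """luma: HxW array of target luminance in nits (toy model)."""
        h, w = luma.shape
        backlight = np.zeros_like(luma)
        for zy in range(zones_y):
            for zx in range(zones_x):
                ys = slice(zy * h // zones_y, (zy + 1) * h // zones_y)
                xs = slice(zx * w // zones_x, (zx + 1) * w // zones_x)
                # Drive the whole zone bright enough for its brightest pixel.
                backlight[ys, xs] = min(luma[ys, xs].max(), peak_nits)
        # LCD transmittance (0..1) needed to hit the target through that backlight.
        lcd = np.divide(luma, backlight, out=np.zeros_like(luma), where=backlight > 0)
        return backlight, np.clip(lcd, 0.0, 1.0)

    frame = np.full((60, 80), 0.5)   # a dark frame: 0.5-nit shadows...
    frame[10:14, 20:24] = 1000.0     # ...with one small 1000-nit highlight
    for zones in (2, 24):            # cheap panel vs. lots of zones
        bl, _ = local_dimming(frame, zones, zones)
        print(f"{zones}x{zones} zones: {(bl > 1.0).mean():.0%} of the backlight gets raised")

With 2x2 zones that single highlight lights up a quarter of the screen and washes out the shadows around it; with 24x24 zones it's under 1%. OLED sidesteps the problem entirely because every pixel is effectively its own zone, which is a big part of why HDR looks so good there.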

So whereas I can plug in a 4k 120hz monitor and immediately see the benefit in everything I do for normal PC stuff, you can’t get that with HDR unless you have good source material and a decent display.

replies(1): >>43990184 #
gwbas1c ◴[] No.43990184[source]
> At this point, I need 120hz displays. I can easily notice when my wife’s phone is in power saver mode at 60hz.

Yeah, the judder is a lot more noticeable on older TVs now that I have a 120hz TV. IMO, CRTs handled this the best, but I'm not going back.
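For anyone curious why the judder shows up at 60hz specifically: 24fps doesn't divide into 60, so the TV has to hold alternate frames for 3 and 2 refreshes (3:2 pulldown), while 120 is an exact multiple of 24 and every frame gets held for the same amount of time. A quick back-of-the-envelope sketch (not any particular TV's logic):

    def hold_times_ms(panel_hz, content_fps, frames=8):
        """How long each source frame stays on screen, in milliseconds."""
        cycle_ms = 1000.0 / panel_hz
        shown, times = 0, []
        for i in range(1, frames + 1):
            total = i * panel_hz // content_fps   # refreshes elapsed by the end of frame i
            times.append(round((total - shown) * cycle_ms, 1))
            shown = total
        return times

    print(hold_times_ms(60, 24))    # [33.3, 50.0, 33.3, 50.0, ...] -- uneven holds = judder
    print(hold_times_ms(120, 24))   # [41.7, 41.7, 41.7, 41.7, ...] -- even cadence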