HDR‑Infused Emoji

(sharpletters.net)
274 points tabletcorry | 6 comments
ionwake ◴[] No.43718808[source]
Sorry for the noob question, but I think finally someone in this thread can answer this for me. Sometimes when I see a YouTube Shorts video, it looks like its HDR is whacked up by 500%, as in the image on this page, but I'm confused how this could be done. Is video processing before upload somehow giving it some sort of encoding which Chrome just whacks up? Or is it the hardware doing it and encoding it a certain way?

I am not talking about a slight brightness increase. I am talking: I'll be scrolling YouTube and suddenly a video is like a portal into another dimension, it's so bright.

Can anyone explain how it's done?

replies(3): >>43718878 #>>43718922 #>>43719470 #
1. harrall ◴[] No.43718878[source]
Screens often can't do full brightness on the whole screen, so if you come across a video or image that is supposed to have a higher contrast ratio, the system will darken everything and then brighten up the pixels that are supposed to be brighter.

Yes, there are formats that are able to store a higher contrast ratio, which is why it doesn't happen on non-HDR content, but the actual brightening of a portal on your screen isn't because of the format. It's because your hardware (and software) chooses to interpret the format that way.

For a more practical example: if you had an 8-bit HDR image, 255 on the red channel (after running this number through a math function like HLG[1] to "extract" a brightness number) might mean "make this pixel really bright red," whereas 255 in an SDR format would mean "just regular red." Each red channel is still a number between 0 and 255 in both formats, but your hardware decided to make it brighter for the HDR format.

(Although in reality, HDR formats are often 10-bit or higher, because 256 values is not enough range to store both color and brightness, so you would see banding[2]. Also, I have been using RGB for my example, but you can store color/brightness numbers many other ways, such as with chroma subsampling[3], especially once you realize human eyes are more sensitive to some colors than others, so you can "devote fewer bits" to some colors.)

[1] https://en.wikipedia.org/wiki/Hybrid_log%E2%80%93gamma

[2] https://en.wikipedia.org/wiki/Colour_banding

[3] https://en.wikipedia.org/wiki/Chroma_subsampling
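The "math function like HLG" idea above can be sketched in a few lines. This is a rough illustration, not production color-management code: it maps a stored 8-bit code value back to relative scene light using the HLG inverse OETF, with the constants taken from ITU-R BT.2100. The display would then scale that relative value by its own peak brightness.

```python
import math

# HLG constants defined in ITU-R BT.2100
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_inverse_oetf(code, bit_depth=8):
    """Map a stored code value (e.g. 0-255 for 8-bit) back to
    relative scene linear light in [0, 1] via the HLG inverse OETF."""
    e = code / (2 ** bit_depth - 1)  # normalized signal E'
    if e <= 0.5:
        return (e * e) / 3.0         # square-root segment, inverted
    return (math.exp((e - C) / A) + B) / 12.0  # logarithmic segment, inverted

# The same code value means very different light levels depending on
# where it falls on the curve: the top half of the signal range covers
# most of the brightness range.
print(hlg_inverse_oetf(128))  # mid signal -> only ~8% of peak light
print(hlg_inverse_oetf(255))  # max signal -> ~100% of peak light
```

Note how nonlinear the curve is: code 128 (half the signal range) decodes to under a tenth of the light of code 255, which is exactly the "devote the bits where the eye needs them" trade-off the comment describes.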

replies(3): >>43718990 #>>43719313 #>>43720127 #
2. ionwake ◴[] No.43718990[source]
Thank you so much for your reply - I will look into it!
3. kllrnohj ◴[] No.43719313[source]
> Screens can't often do full brightness on the whole screen so if you come across a video or image that is supposed to have a higher contrast ratio, the system will darken everything and then brighten up the pixels that are supposed to be brighter.

There's no system that does that. The only thing that's kinda similar is at the display level, where there's a concept known as "window size," since many displays cannot show peak brightness across the entire panel. If you've ever seen brightness quoted for a "5%" or "10%" window, this is what it means: the brightness the display can achieve when only 5% of the screen is max-white and the rest is black.

But outside of fullscreen content this doesn't tend to be much of an issue in practice, and it depends on the display.

replies(1): >>43723349 #
4. dr_kiszonka ◴[] No.43720127[source]
I use YouTube on my rather inexpensive TV. When a thumbnail of an HDR video starts playing, the whole screen brightens up significantly. I don't know as much about HDR as you do, so maybe they are using some other perceptual trick. It might also not be "full brightness." BTW, why can't screens do full brightness on the whole screen?
5. drodgers ◴[] No.43723349[source]
> There's no system that does that.

You mean the darkening of everything else to highlight bright HDR areas? All recent Macs do that, including the one I'm typing on right now. It's a little disconcerting the first time it happens, but the effect is actually great!

replies(1): >>43724610 #
6. kllrnohj ◴[] No.43724610{3}[source]
Apple doesn't darken SDR to amplify HDR. They keep SDR at the same brightness it was before HDR showed up. It appears as if SDR gets dimmer because of your perception of contrast, but within a small margin of error it isn't actually changing.