Mini: Tonemaps (2023)

(mini.gmshaders.com)
44 points by bpierre | 1 comment
kllrnohj No.45308848
This does seem to be very specific to Game Maker though. Saying 0.0-1.0 is 8-bit unorm is incorrect; it's just a unorm, and no bit depth is implied. Similarly, HDR doesn't mean exceeding 1.0. It can mean that, but if you're going to a display directly you'll want something more like a 101010X2 unorm in PQ, which is still 0.0-1.0 in the fragment shader.
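
For reference, PQ's inverse EOTF (SMPTE ST 2084) is what maps absolute luminance onto that 0.0-1.0 signal range, so a fragment shader writing to a PQ swapchain still emits plain unorm values. A minimal sketch in Python (the constants are from ST 2084; the function name is just for illustration):

    # SMPTE ST 2084 (PQ) inverse EOTF: absolute nits -> 0..1 signal.
    M1 = 2610 / 16384        # 0.1593017578125
    M2 = 2523 / 4096 * 128   # 78.84375
    C1 = 3424 / 4096         # 0.8359375
    C2 = 2413 / 4096 * 32    # 18.8515625
    C3 = 2392 / 4096 * 32    # 18.6875

    def pq_encode(nits: float) -> float:
        """Encode absolute luminance (0..10,000 nits) as a 0..1 PQ signal."""
        y = min(max(nits / 10000.0, 0.0), 1.0)  # normalize; 1.0 == 10,000 nits
        p = y ** M1
        return ((C1 + C2 * p) / (1.0 + C3 * p)) ** M2

    print(pq_encode(100.0))    # ~0.508: SDR reference white lands mid-range
    print(pq_encode(10000.0))  # 1.0: the top of the encoding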
replies(1): >>45310208
dahart No.45310208
> This does seem to be very specific to Game Maker though.

Good point. I was going to point out that, in general, comparing 6-bit color to float color should not change brightness, tonemapping, or anything else, other than maybe some visible quantizing in the 6-bit version. That first example makes sense to me if it's about what Game Maker does with 6bpp vs float, rather than a conceptual comparison of bit depths for shaders in any environment. So maybe GM doesn't use the same color pipeline for LDR and HDR colors; perhaps HDR is tonemapped and LDR isn't? The first paragraph implies this but doesn't say it explicitly. Or is tonemapping in GM a user choice, not determined by color format?
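
To make that concrete, here's a toy round-to-nearest quantizer (a hypothetical illustration, not Game Maker's actual pipeline): reducing bit depth adds at most half a quantization step of error, but it doesn't brighten, darken, or tonemap anything.

    def quantize(x: float, bits: int) -> float:
        """Round a 0..1 unorm value to the nearest representable level."""
        levels = (1 << bits) - 1
        return round(x * levels) / levels

    x = 0.3721
    for bits in (6, 8, 16):
        print(bits, quantize(x, bits))
    # 6 -> ~0.3651, 8 -> ~0.3725, 16 -> ~0.3721
    # The value only wobbles within half a quantization step;
    # nothing about the curve or the overall brightness changes.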

> Similarly HDR doesn’t mean exceeding 1.0

True, but it’s harder to see the point of HDR if you never have to deal with colors brighter than the display can handle.

There are lots of different ideas & suggested definitions floating around for what HDR means, but I like to think the important one is storing colors in absolute physical units, as opposed to relative units where 1.0 means the brightest color the display is capable of. You can, of course, do that at any bit depth, but HDR tends to pair well with higher bit depths.
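
A toy sketch of the distinction (the 600-nit panel and the function name are made up for illustration): under a relative encoding, the meaning of a stored 1.0 floats with the hardware; under an absolute encoding, the stored number is the luminance.

    DISPLAY_PEAK_NITS = 600.0  # hypothetical panel

    def relative_to_nits(v: float) -> float:
        """Relative convention: 1.0 means 'whatever this display maxes out at'."""
        return v * DISPLAY_PEAK_NITS

    # The same relative 1.0 shows at 600 nits here and at 1000 nits on a
    # brighter panel -- the image's meaning floats with the hardware.
    # Stored as an absolute 600.0 nits, it means 600 nits everywhere,
    # and anything the display can't reach has to be tonemapped down.
    print(relative_to_nits(1.0))  # 600.0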

replies(1): >>45310978
kllrnohj No.45310978
> True, but it’s harder to see the point of HDR without having to deal with colors that are brighter than the display can handle.

1.0 is a completely arbitrary value that has no meaning on its own. When used with sRGB, 1.0 means "the user's brightness setting." When used with PQ, 1.0 means 10,000 nits. When used with HLG, 1.0 means "idk, maybe 1000 nits? nobody knows!"

You need the colorspace to be defined to know what 1.0 means and whether it represents an SDR or an HDR value, or whether it's neither and maps directly to the current display range.
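
A rough illustration of that point (a toy lookup, not any real API; the HLG line assumes the common 1,000-nit nominal peak, which in practice depends on the display):

    def meaning_of_one(transfer: str, display_peak_nits: float) -> str:
        """What a stored 1.0 works out to under a few transfer functions."""
        if transfer == "sRGB":
            # Relative: tracks the panel and the user's brightness setting.
            return f"display max (~{display_peak_nits:.0f} nits on this panel)"
        if transfer == "PQ":
            return "10,000 nits, absolute (SMPTE ST 2084)"
        if transfer == "HLG":
            # Scene-referred; the display maps it to its own nominal peak.
            return "nominal peak (often assumed ~1,000 nits)"
        raise ValueError(transfer)

    for t in ("sRGB", "PQ", "HLG"):
        print(t, "->", meaning_of_one(t, 600.0))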

Presumably Game Maker has fixed opinions here, but that gets back to "this is very Game Maker specific."

> but I like to think the important one is the idea of storing colors in absolute physical units,

That is not only the least important aspect of PQ HDR, it's downright wrong, and the industry is fixing that bug :)

replies(1): >>45313926
dahart No.45313926
Wait, I’m confused by that. What is wrong with the idea of using physical units? Nits are an absolute physical unit. The fact that 1.0 in PQ = 10k nits, and that 1.0 represents an absolute physical brightness rather than a relative one, is a critically important and defining feature of that color space. Are you saying PQ is “downright wrong”? Why? How is the TV industry fixing it?

Using nits as the basis for PQ’s mapping is the entire reason PQ can stop at 1.0: setting the maximum possible brightness at 10k nits is considered enough to cover what the vast majority of TVs will be able to display for the time being. If someone makes an HDR TV that handles 100k nits, PQ won’t work anymore. Using a colorspace based on an absolute physical unit like nits was the core idea behind the invention of HDR for people like Greg Ward, for photographers who capture the sun in their shots, and for scientists who use digital imaging.
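
The decode direction makes that ceiling explicit (same ST 2084 constants as above; a sketch, not production code):

    # SMPTE ST 2084 (PQ) EOTF: 0..1 signal -> absolute luminance in nits.
    M1, M2 = 2610 / 16384, 2523 / 4096 * 128
    C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_decode(signal: float) -> float:
        e = max(min(signal, 1.0), 0.0) ** (1.0 / M2)
        return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

    print(pq_decode(1.0))  # 10000.0: there is no code value for 100k nits;
                           # a brighter source has to clip or be remapped.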

PQ is a color space intended for TV display and distribution, but it’s not the HDR space that production people work in, not good for HDR photography, and not really intended for storage of HDR source material. If you use PQ, capture the sun in your shot, and then want to re-expose it, you’re probably out of luck: if you increase brightness you lose color resolution in the dark colors, and if you decrease it you’ll clamp the sun to less than 10k nits. Either way, you end up with the exact same problems as re-exposing LDR images. Besides physical units, another often-important feature of HDR for production CG work is linearity; PQ’s limited bit depth and nonlinear transfer function get in the way and make it not the color space of choice for production pipelines.
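
A sketch of that failure mode (PQ helpers as above; the +2-stop figure is arbitrary): re-exposing in PQ means decode, scale, re-encode, and anything pushed past 10,000 nits clips, exactly like brightening an LDR image clips at 1.0.

    M1, M2 = 2610 / 16384, 2523 / 4096 * 128
    C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_encode(nits):
        y = min(max(nits / 10000.0, 0.0), 1.0)
        p = y ** M1
        return ((C1 + C2 * p) / (1.0 + C3 * p)) ** M2

    def pq_decode(signal):
        e = max(min(signal, 1.0), 0.0) ** (1.0 / M2)
        return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

    def re_expose_pq(signal, stops):
        # Decode to nits, scale by 2**stops, re-encode. The encode clamps
        # at 10,000 nits, so highlight detail above the ceiling is gone.
        return pq_encode(pq_decode(signal) * 2.0 ** stops)

    sun = pq_encode(9000.0)                 # near the top of the range
    print(pq_decode(re_expose_pq(sun, 2)))  # 10000.0: 36,000 nits, clipped
    # In scene-linear float, the same +2 stops is just `value * 4.0`,
    # with nothing lost.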

This is to say that PQ, while an interesting example worthy of discussion and a good choice for some goals, does not represent HDR as a whole, nor does it serve all HDR needs. It’s not my first choice for a definition of HDR, nor the canonical example of why HDR exists. Like many color spaces, PQ comes with tradeoffs and is only intended for a subset of color workflows.

> You need the colorspace to be defined to know what 1.0 means and whether or not it represents an SDR or HDR value.

Absolutely! I was really referring to the LDR color spaces where 1.0 is relative to the max brightness, which is why I qualified my statement with “relative units”. Aside from PQ, nearly all the old color spaces that stop at 1.0 are relative, not absolute: all the non-physical, non-perceptual color spaces like RGB, CMYK, HSL, and HSV, and even some perceptual spaces like CIE’s XYZ, xyY, and LAB, are relative, with 1.0 meaning max brightness. Color spaces for print are naturally relative, which is why a value greater than 100% never makes sense there.