Tuesday, February 7, 2012

HDR dithering

So I was busy implementing HDR in my little experimental renderer: I added support for loading Wavefront .obj/.mtl files and started using the Crytek Sponza scene. While playing around with the lighting a bit, I noticed this:

Original HDR screenshot (contrast increased in image to illustrate banding)
Loads of ugly bands in poorly lit areas. Since I'm using really high precision buffers everywhere (16-bit floats), my first thought was: how can I possibly be getting precision artifacts here? So I took a screenshot and started comparing pixels; it turns out there's only a difference of 1 between the bands, so it's not a precision problem; my monitor is simply poorly calibrated! (I really thought I had calibrated it properly! Interestingly enough, it looks perfectly smooth on my other monitor.) My next thought was: well, most people out there will have poorly calibrated monitors, so is there a way to improve on this? Yes there is: dithering! I remembered that in the good old days of 4-, 16- and 256-color video modes, people used dithering to simulate smooth transitions from one color to another, and the higher the resolution, the better it worked. Long story short, I added some dithering to my shader and it worked like a charm:

HDR with dithering (contrast increased in image)
And finally, here's the same screenshot without its contrast increased:

HDR with dithering (original contrast)
Now when I turn contrast and brightness all the way to the highest levels on my monitor, it still looks nice and smooth.
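The post doesn't show the actual shader, but the core trick can be sketched on the CPU: add a per-pixel threshold offset (here a standard 4x4 Bayer matrix, chosen for illustration) before quantizing to 8 bits, so a value sitting between two output levels lands on either level in the right proportion instead of snapping to one band.

```python
# Illustrative sketch, not the post's shader code: quantizing a dark,
# slowly varying gradient to 8 bits, with and without ordered dithering.

# Standard 4x4 Bayer matrix, values in [0, 16)
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def quantize_plain(value):
    """Conventional quantization of a [0,1] value: add 0.5 and truncate."""
    return min(255, int(value * 255.0 + 0.5))

def quantize_dithered(value, x, y):
    """Add a per-pixel threshold in (0,1) before truncating, so values
    between two levels alternate between them in the right proportion."""
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
    return min(255, int(value * 255.0 + threshold))

# A faint gradient that plain rounding collapses into one flat band,
# while dithering mixes the two neighboring levels across the row.
plain = [quantize_plain(0.01 + x * 0.00005) for x in range(64)]
row = [quantize_dithered(0.01 + x * 0.00005, x, 0) for x in range(64)]
```

Viewed from a distance (or at high resolution), the mixed levels average out to the intermediate shade, which is exactly why the bands disappear.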

Update

@renderwonk (Naty Hoffman) told me on Twitter:
Pixar dither when quantizing. See "Mastering" section of "Color Pipelines..." course here. Instead of adding 0.5 and truncating to quantize, they add rand(0,1) and truncate.
So I just tried using random noise instead of ordered dithering, and it looks better, although obviously noisier. (No screenshot, I'm afraid.) When the noise changes over time, it visually blends together. It makes the color transitions much smoother, and all kinds of faint details that were simply a single color before end up being more visible.
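The Pixar scheme quoted above is simple to sketch: replace the constant +0.5 rounding bias with uniform random noise in [0,1). A value sitting, say, 30% of the way between two levels then rounds up 30% of the time, so the average over pixels (or frames) reproduces the original value exactly. A minimal illustration:

```python
import random

def quantize_round(value):
    """Conventional quantization: add 0.5 and truncate."""
    return min(255, int(value * 255.0 + 0.5))

def quantize_random(value, rng=random):
    """Pixar-style quantization: add rand(0,1) and truncate. The
    fractional part of value*255 becomes the probability of rounding
    up, so the result is unbiased on average."""
    return min(255, int(value * 255.0 + rng.random()))

# A value between levels 2 and 3: plain rounding always gives 2,
# while the average of the randomly dithered result approaches 2.3.
v = 2.3 / 255.0
avg = sum(quantize_random(v) for _ in range(20000)) / 20000.0
```

Note that plain rounding is biased here (it always picks the nearer level), while the random version trades that bias for variance, which is what shows up as the faint noise described above.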

Unlike Pixar, I made the noise scale somewhat with the amount of light. This made the noise a bit more visible (especially on a monitor with very high contrast and brightness, but not too much), but it also smoothed out the brighter transitions. Interestingly enough, the noise looks similar to what you'd see in a poorly lit environment. Although the noise/dithering won't extend the actual contrast of your display device, it does extend the perceived color precision.
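The post doesn't show how the noise is scaled, so the following is only a guess at one possible variant: growing the dither amplitude with the pixel's brightness, which spreads bright values across more quantization levels (smoother transitions) at the cost of slightly more visible noise.

```python
import random

def quantize_scaled_noise(value, scale=1.0, rng=random):
    """Hypothetical variant of the author's tweak: the dither amplitude
    grows with the input brightness instead of staying fixed at 1 level."""
    amplitude = 1.0 + scale * value          # brighter pixel -> stronger noise
    noise = rng.random() * amplitude         # uniform in [0, amplitude)
    return max(0, min(255, int(value * 255.0 + noise)))

# A bright value now dithers across several neighboring levels.
samples = [quantize_scaled_noise(0.9) for _ in range(1000)]
```

With an amplitude above 1 the result is no longer unbiased (it leans slightly upward), so in practice you would likely want to keep the scaling modest, as the post itself suggests.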