Your translucency is fine

Posted by Linguica


Hi, I think a lot of Doom-engine things do translucency incorrectly.

Here's the standard way people do translucency in pseudocode:

newred   = (redforeground   * alpha) + (redbackground   * (1 - alpha))
newgreen = (greenforeground * alpha) + (greenbackground * (1 - alpha))
newblue  = (blueforeground  * alpha) + (bluebackground  * (1 - alpha))

This makes intuitive sense - if you are doing a 50% translucent black surface with white behind it, you get (0 * 0.5) + (255 * 0.5) = a medium gray of 127.
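
As a concrete sketch of the above (my own illustration, not any port's actual code; blend_naive is a hypothetical name), in C it comes out to:

#include <stdint.h>

/* Naive translucency: blend two gamma-encoded 8-bit channel values
   directly, per the pseudocode above. */
static uint8_t blend_naive(uint8_t fg, uint8_t bg, double alpha)
{
    /* e.g. 50% black over white: 0 * 0.5 + 255 * 0.5 = 127.5, truncated to 127 */
    return (uint8_t)(fg * alpha + bg * (1.0 - alpha));
}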

The problem, though, is that a 50% mix of black and white is NOT 127. Below are three squares. On the left is a square of RGB 127. On the right is a square of RGB 187. And in the middle is a GIF of alternating black and white squares shrinking until they form a grayish surface. Surely, if you squint your eyes, medium gray should be the same shade as alternating black and white pixels, right?



But unfortunately, medium gray is NOT the same as RGB 127. If your monitor is well-calibrated, the middle GIF should become basically indistinguishable from the right square of RGB 187.

Why is this? Well, in short, because your monitor is calibrated for a gamma of 2.2. Back in the days of CRTs, the computer would send a voltage signal to the CRT saying, "hey, light this next pixel at a power of 50%." And the CRT would happily oblige and drive the pixel at 50% of its max power. The problem is that CRT phosphors didn't have a linear response to this - a CRT pushing 50% of its power into a pixel would only illuminate it to around 22% of its maximum brightness. To compensate, engineers realized they had to change the values they were sending - if you wanted a pixel to illuminate at 50% brightness, you had to say, "hey CRT, illuminate this next pixel at 73% power!" and the CRT would do so, and 73% power just happened to make the phosphor glow at 50% brightness. This was accomplished by raising the intended output intensity to a power of (1 / 2.2) or so; the display's exponent of 2.2 is conventionally written with the Greek letter gamma, hence "gamma correction."
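
To put numbers on this, here's a quick sketch (assuming a pure 2.2 power curve; the helper names are mine) of converting between gamma-encoded 8-bit values and linear light:

#include <math.h>
#include <stdio.h>

/* Decode a gamma-2.2 encoded 8-bit value to linear light in [0, 1]. */
static double to_linear(int v) { return pow(v / 255.0, 2.2); }

/* Encode linear light back to a gamma-2.2 8-bit value, with rounding. */
static int to_gamma(double lin) { return (int)(pow(lin, 1.0 / 2.2) * 255.0 + 0.5); }

int main(void)
{
    printf("%.3f\n", to_linear(127)); /* ~0.216: RGB 127 emits only ~22% of max light */
    printf("%.3f\n", to_linear(187)); /* ~0.505: RGB 187 emits ~50% of max light */
    printf("%d\n", to_gamma(0.5));    /* ~186: encoded value for true 50% brightness */
    return 0;
}

So RGB 187 (give or take a rounding step), not 127, is the value that emits half the light of white, which is why the dithered GIF matches the right-hand square.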



(Of course, when we switched from analog CRTs to LCD panels, there was no reason to keep using this hacky workaround, but by then it was far too late to switch.)

Both Heretic's and Hexen's TINTTAB lumps do naive blending instead of taking gamma into account. I made some quick alternate TINTTABs to show the difference:


http://www.doomworld.com/linguica/colormaps/heretint.wad


http://www.doomworld.com/linguica/colormaps/hextint.wad

The HTRANMAP lump included in boomedit.wad is also incorrect:



Some, if not all, Doom source ports also generate translucency incorrectly: Eternity and Doom Retro both blend in gamma color space. I wasn't able to find the translucency code in PrBoom, and ZDoom uses some alternate method that I couldn't easily figure out, but I wouldn't be surprised if it was in gamma space too. I also have no idea how hardware ports do it.

A fix for this is relatively simple - when building a lookup table, before blending the two colors together, convert their RGB components to a fraction between 0 and 1 and raise them to a power of 2.2. Blend the colors normally at this point, and then convert back by raising the result to the power of 1/2.2 (about 0.45) and scaling to 0-255.
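
As a sketch of that fix in C (again my own illustration; it assumes a pure 2.2 power curve rather than the exact sRGB function, and a real table builder would precompute the pow calls for speed):

#include <math.h>
#include <stdint.h>

/* Gamma-correct translucency: decode both channel values to linear
   light, blend there, and re-encode to gamma space. */
static uint8_t blend_linear(uint8_t fg, uint8_t bg, double alpha)
{
    double fg_lin = pow(fg / 255.0, 2.2);  /* gamma -> linear */
    double bg_lin = pow(bg / 255.0, 2.2);
    double mixed  = fg_lin * alpha + bg_lin * (1.0 - alpha);
    return (uint8_t)(pow(mixed, 1.0 / 2.2) * 255.0 + 0.5);  /* linear -> gamma */
}

With this, 50% black over white comes out around 186 instead of 127, matching the squares above; to build a TINTTAB-style lump you'd run it per channel for each pair of palette colors and then map the result back to the nearest palette entry.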

Isn't this making assumptions about the nature of the display? It's fast and convenient to treat RGB as if 0 is pure black, 255 is pure white, and the scale in between is linear with respect to human perception, even if that is not true. But who knows what the next display technology to come along will do to the colors of today's software? It's already a zoo out there: the same scene can look different between source ports due to the APIs in use, or decisions by the authors that change things such as distance fading or gamma correction levels.

I think changing it is just asking for trouble.

I think basically everyone assumes a gamma of 2.2; see, e.g., http://http.developer.nvidia.com/GPUGems3/gpugems3_ch24.html

Carmack also came around to gamma-correcting colors: http://fabiensanglard.net/quake2/quake2_software_renderer.php (search for "gamma").

In the future, yes, we might have 8K displays with 10-bit or 12-bit color, which https://en.wikipedia.org/wiki/Rec._2020 is starting to address. Engineers are recommending standardizing on a gamma of 2.4 for future displays, but that still involves gamma correction, not a linear scale.

Linguica said:

If your monitor is well-calibrated, the middle GIF should become basically indistinguishable from the right square of RGB 187.

Why is this? Well, in short, because your monitor is calibrated for a gamma of 2.2.

Stands out quite clearly on my monitor, which was recently calibrated for gamma 2.2 using a Spyder5 colorimeter.


Uh, OK? Gray 187 being 50% white is, like, a mathematical outcome of having a color space with gamma 2.2, so I don't know what to tell you.


One of life's little mysteries. At the moment there's too much light in the room to check the calibration, but I could try doing a display analysis, which might just tell me to buy a better monitor. :D


EDIT - My Samsung SyncMaster SA450 scores as follows -

Gamut - 4.5/5
Tone Response - 1/5 (gamma presets are too dark! Mode 2 came closest to the 2.2 reference curve)
White Point - 3/5 (default colour temperature is 7500 K)
Contrast - 5/5
Luminance Uniformity - 2.5/5
Colour Uniformity - 3.5/5
Colour Accuracy - 3.5/5
Overall Rating - 3.5/5


Might be buying myself a new monitor for Xmas.


At this point I think the Doom engine's treatment of RGB as linear is much like its 1:1.2 pixel aspect ratio: you can't change it without making it look wrong.


I did more research, and it turns out that although blending in a linear color space is the "correct" way to do it, that doesn't necessarily mean it's always the "best" way to do it. I don't really feel like writing up another whole thing, so here's a dump of various links I found interesting:

http://s3.artbeats.com/articles/pdf/linear_light.pdf

http://prolost.com/blog/2006/6/4/know-when-to-log-em-know-when-to-lin-em.html

http://www.keenlive.com/renderbreak/2013/10/premiere-pro-linear-compositing-and-single-source-cross-dissolves/

http://www.provideocoalition.com/gamma_intro

http://ssp.impulsetrain.com/translucency.html

http://hacksoflife.blogspot.com/2012/11/deferred-weirdness-when-not-to-be-linear.html


From that last link:

Traditional alpha blending is an easy way for artists to create effects that would be too complex to simulate directly on the GPU. For example, the windows of an airplane are a witches' brew of specularity, reflection, refraction, and absorption, and the BRDF changes based on the amount of grime on the various parts of the glass. Or you can just let your artist make a translucent texture and blend it.

But this 'photo composited' style of blending that provides such a useful way to simulate effects also requires sRGB blending, not linear blending. When an 'overlay' texture is blended linearly, the results are too bright, and it is hard to tell which layer is "on top". The effect looks a little bit like transparency that varies with brightness, but the amount of the effect depends on the background. It's not what the art guys want.

That's a good explanation of what I felt when looking at the alternating BOOMEDIT screenshot.

