jval

Sector lightlevels in 32 bit color rendering

Sector lightlevels in Doom can vary from 0 to 255, but the vanilla renderer shifts the value down by 4 bits (LIGHTSEGSHIFT):

    light = (sector->lightlevel >> LIGHTSEGSHIFT)+extralight;

    if (light >= LIGHTLEVELS)
        light = LIGHTLEVELS-1;

    if (light < 0)
        light = 0;

This way, e.g., a sector lightlevel of 160 produces the same visual output as a sector lightlevel of 175. Only 16 (==LIGHTLEVELS) distinct "lightlevels" can actually be used.

Should a 32 bit color renderer (software or not) remove this limitation and use the precise lightlevel, or must it, for compatibility or other reasons, continue to replicate the above limitation?
Or should this be left to the end user to decide?


In Mocha Doom, when implementing the extended colormap system, I also alter the value of that constant, which, as you can see, is interdependent with several others:

    /** Bits representing color levels. 5 for 32. */
    public static final int LBITS;

    ...

    LIGHTLEVELS = 1 << LBITS;
    MAXLIGHTZ = Math.max(LIGHTLEVELS * 4, 256);
    LIGHTBRIGHT = 2;
    LIGHTSEGSHIFT = 8 - LBITS;
    NUMCOLORMAPS = LIGHTLEVELS;
    MAXLIGHTSCALE = 3 * LIGHTLEVELS / 2;
    LIGHTSCALESHIFT = 17 - LBITS;
    LIGHTZSHIFT = 25 - LBITS;
These formulas are empirical, and for LBITS=5 they give the default values found in the linuxdoom source code. IMO, this is not really something that should be left up to the user to decide. It's just as internal/intimate to the engine as, e.g., the zig-zag path used by the monsters or the exact pixel pattern used by the pinkies' blur effect, only more critical visually.


I see, so by changing just the LBITS constant in your source you can increase or decrease the accuracy of the light levels.

I have the impression that a value of LBITS=4 will give the vanilla defaults, while 5 will give what the following comment from r_main.h is asking for, or do I miss something?

// Now why not 32 levels here?


Yeah, that is a well-known inconsistency between "actual vanilla Doom" and the released source code. It should be LBITS=5 if one wants all 32 colormaps to be used.


DoomLegacy has 32 bit drawing, and all that light code got rewritten.
There are light calculations that can be moved out of the inner loops.
I think the only reason for the shift down is so they could index a limited number of light tables. Drawing in 32 bit should use every light bit there is; there are no effects that would break compatibility, you just get better rendering. It would only make a difference if you cannot throw the full light range into your 32 bit RGB calculations without getting overflow problems.
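The overflow concern can be avoided by widening the per-channel products before shifting back down. The following is a hypothetical sketch of full-range lighting on a packed 0xRRGGBB pixel, not DoomLegacy's actual code:

```c
#include <stdint.h>

/* Scale a packed 0xRRGGBB pixel by a full-range light value (0..255).
   Each channel product fits in 16 bits, so 32-bit math never overflows. */
static uint32_t apply_light(uint32_t rgb, uint8_t light)
{
    uint32_t r = (rgb >> 16) & 0xffu;
    uint32_t g = (rgb >> 8)  & 0xffu;
    uint32_t b = rgb & 0xffu;

    /* (c * (light + 1)) >> 8 makes light==255 an exact identity
       and light==0 map every channel to 0. */
    r = (r * (light + 1u)) >> 8;
    g = (g * (light + 1u)) >> 8;
    b = (b * (light + 1u)) >> 8;

    return (r << 16) | (g << 8) | b;
}
```

Using light+1 rather than light keeps full brightness lossless while staying a cheap shift instead of a divide by 255.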


It may or may not be a little late for me to chime in on this topic, but there really isn't a whole lot of difference, visually, between 16/32 levels of light and 256, unless you're one of those obsessive mappers who tries to create lighting gradients. Keep in mind that the original 256-color palette tables in DOS were limited to 64 shades per color channel to begin with.

Therefore, I see no real reason for true-color source ports to limit the amount of sector shading that is available.

