_bruce_

Doom in true colour [ testers for exe welcome/src snapshot/links updated ]


Lüt said:

Wow, you just fixed the biggest problem I've had with Doom since I first played it (and singlehandedly saved E3M3 in the process). Smooth looks especially great.


Have you ever tried one of the many OpenGL ports? :p


Smooth looks really fantastic; I'd be interested in merging it with my Choco fork, so long as it isn't too much of a pain to work with. :)

_bruce_ said:

The zombie man is fullbright now when shooting... added this due to an email from the league of neglected Doom sprites.


Perhaps consider making the arachnotron not fullbright in its initial aiming frame. Was always weird how it'd light up before it even fired.

Gez said:

Have you ever tried one of the many OpenGL ports? :p

Yeah, awkward sprite handling and bad blur-management until recent years. Traded one issue for another.

Lüt said:

Yeah, awkward sprite handling and bad blur-management until recent years. Traded one issue for another.

Also, even in GZDoom the fake contrast and depth fog are almost unnoticeable. There are an awful lot of levels that look dull under any GL port.


_Bruce_: Have you thought about adding reverse gamma correction? On a lot of LCD screens Doom looks washed out because of how bright they are. In ZDoom I often have to set gamma to 0.8 to make everything look "normal".
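
A minimal sketch of how such a reverse gamma table could be built, assuming a 256-entry LUT applied to the palette the same way Doom's own (brightening) gammatable is; the function name, table name and the five exponent levels here are illustrative, not from any port:

#include <math.h>
#include <stdint.h>

static uint8_t igammatable[5][256];

void I_BuildInverseGamma(void)
{
    /* gamma < 1.0 darkens when applied as out = 255 * (in/255)^(1/gamma),
       the opposite of Doom's usual brightening levels */
    static const double gammas[5] = { 1.0, 0.95, 0.9, 0.85, 0.8 };

    for (int level = 0; level < 5; level++)
        for (int i = 0; i < 256; i++)
            igammatable[level][i] =
                (uint8_t)(255.0 * pow(i / 255.0, 1.0 / gammas[level]) + 0.5);
}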

Wagi said:

Also, even in GZDoom the fake contrast and depth fog are almost unnoticeable. There are an awful lot of levels that look dull under any GL port.



I never had issues with the fake contrast. Yes, it's only half as strong as with the software renderer, but that was done because it'd look exaggerated and unnatural if done at full strength.

As for the depth fog, no, it's not supposed to replace the light fading in the software renderer. Actually, it can't, because it's something completely different.

The brightening around the player (which is not depth fog, btw) is indeed weaker, but again that was done because doing it at the same strength as the software renderer would make it look like the player was a strong light source. Again, something that just does not work.

Doom's lighting is weird. It makes no sense in any technical way, and once you try to do it with technology that's supposed to be closer to reality, it just shows its flaws.


That's exactly what I was saying, though. GL ports can't emulate those things without looking weird, no matter how good a job is done with them. It goes without saying that levels designed around Doom's unnatural lighting are going to look better under a software renderer than under the more realistic OpenGL, and vice versa.

Hell, it's for this very reason that some levels will probably lose a lot of their luster when put into truecolor: Doom's tan color range actually gains saturation as it fades into darkness, so some levels depend on 8-bit color to look vibrant.


Settled on 3 shading modes - classic, classic fine, calculated fine.
Palettes are now fully blended and work at the same time - thx to Wagi for the idea. Still have to fine-tune the look.

I hate Doom64's artwork for the most part, but some of the things got to me, so I will add the nightmare spectre.

One additional powerup sphere and a green key/skull.

I will have to make a config for Doombuilder 1/2 to test the sector lighting and the additional monster/s.

The niche of this Doom version is a "playboy" version of the original.

Speed is at about 70% of Choco. If the OpenGL mini scaler is used it's faster, but not as accurate as Fraggle's scaler.


Megamur -
will add it to the setup... don't want it in the options menu.



Wagi -
excellent observation on Doom's visual characteristics. I-Gamma included - 5 levels... excellent idea!



Lüt -
Red bricks... red hot.



Jaguar mode as of now is a gag, but I'm thinking of uploading the textures which were used in the Jaguar version in addition to the normal ones - if you choose low detail, you get jag'ed.



Ralphis -
due to the high contrast there's a "compressed" look to the visuals.
contrast3 via normal Doom - 108 unique colours
contrast3 via Doom** - 751 unique colours



Colored lighting would be nice... maybe I can squeeze it in... if so, something like Quake's .lit format, but stored in the WAD itself so that a normal exe doesn't get annoyed.
http://i.imgur.com/SFqyf.png


Skulls says... enough for today.


Ok, colored lighting is in. Changes to the source base are pretty minimal - I was amazed how smoothly it went.
It uses the HSV color space to do its deed and needs some more thought because the conversion is costly as of now.
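
A minimal sketch of where that cost comes from, assuming a textbook float RGB-to-HSV conversion rather than this port's actual code: every pixel needs min/max comparisons, divides and branches, plus the inverse conversion on the way back out, 64000 times per frame at 320x200.

#include <math.h>

typedef struct { float h, s, v; } hsv_t;

/* Standard RGB->HSV for inputs in 0..1; h in degrees, s and v in 0..1. */
static hsv_t rgb_to_hsv(float r, float g, float b)
{
    float max = fmaxf(r, fmaxf(g, b));
    float min = fminf(r, fminf(g, b));
    float d = max - min;
    hsv_t out;

    out.v = max;
    out.s = (max > 0.0f) ? d / max : 0.0f;
    out.h = 0.0f;
    if (d > 0.0f)
    {
        if (max == r)      out.h = fmodf((g - b) / d, 6.0f);
        else if (max == g) out.h = (b - r) / d + 2.0f;
        else               out.h = (r - g) / d + 4.0f;
        out.h *= 60.0f;
        if (out.h < 0.0f) out.h += 360.0f;  /* wrap negative hues */
    }
    return out;
}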

Hint - skip to pic unless you're in for a cup of coffee.
Render modes (raw, fine, calcfine + color) are switched via keypad '+', since there were no more free F keys.
Colored lighting is only supported in calculated shade mode, so you can still enjoy the map without it if you get a visual headache - colored lighting is one of those features that invites abuse.

Color info and light level info are kept separate - this better embodies the classic nature of the project.
Visplane limits have to be raised because sector color ups the visplane count.

Added some snide remarks to the end messages and cleaned up some stuff, added CPU detection with a nice little visual hint. The automap background is now game-dependent... just little things here and there.
As of now I have no tool to properly define the sector colors so the shot is just random sector color.





Me too! I hope that the changes can easily be translated to something like Odamex one day :)

Ralphis said:

Me too! I hope that the changes can easily be translated to something like Odamex one day :)


They should be easy to translate to any software-rendered port - and in fact, this is not the first of its kind (cf. DelphiDoom, at least the older non-OpenGL versions).

The catch is that this kind of per-pixel processing is best described as a "processing luxury": once you accept that every pixel is to have its own proper colorspace processing overhead, you automatically throw one of the major advantages of Doom's software renderer out of the window: the ability to quickly and efficiently apply lighting & coloring effects just by manipulating indexes in a palette. The next step would be to go full ray-tracing, which takes this to another level (and in fact, there was/is a fork of Chocolate Doom that did just that).

What's the problem, you say? Getting decent framerates out of such processing extensions is very challenging (DelphiDoom struggled to reach 30-something framerates, if I recall correctly), especially if you go through more than one color space transformation or a more complex lighting scheme. Perhaps this is one place where a multithreaded renderer would help, as the ratio of per-pixel processing to memory bandwidth will have definitely increased, while the power that a single core can churn out has practically hit its limit years ago.

That's why I'd like to try an extension of the indexed rendering model up to 16 bits, hoping that there's a "sweet spot" somewhere between 8 and 16 bits where the renderer will look smooth enough, tricks like colored lighting/transparency won't look like ass, and it will still be possible to get decent framerates without resorting to full colorspace processing.


I don't think that will ever work. The lookup tables would just be too large with any indexed >8 bit format.

The only reason it works with 8 bit is that the lookup tables are small. For example, a translucency lookup table for a single alpha value is 64k (256*256 one-byte entries). With an indexed 16 bit format it would be 65536*65536 entries: about 8GB at two bytes per entry. You'd never be able to handle that.


That's why I mentioned a hypothetical "sweet spot" between 8 and 16 bits. It might be 9 or 10 bits for all we know (even at 10 bits, a pair table is only 1024*1024 = 1M entries), but unless it's actually tried out, we'll never know.


Most color operations can be done with a single 64k lookup table and only a few math operations. If you have a LUT where lut[i][j] == (i*j)/255, you can use that to do some rather quick effects.

For example, "additive" translucency would be lut[r1^0xff][r2^0xff]^0xff, and so on for each channel. "Subtractive" translucency, i.e. the kind where Magenta + Yellow = Red, is even simpler.
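
A minimal sketch of that multiply-LUT, following the formulas above; the table and function names are illustrative, not from any port:

#include <stdint.h>

static uint8_t mul_lut[256][256];   /* mul_lut[i][j] == (i*j)/255 */

void init_mul_lut(void)
{
    for (int i = 0; i < 256; i++)
        for (int j = 0; j < 256; j++)
            mul_lut[i][j] = (uint8_t)((i * j) / 255);
}

/* The "additive" blend per channel: XOR with 0xff is 255-x for a byte,
   so this computes 255 - ((255-a)*(255-b))/255 through the table. */
static inline uint8_t blend_additive(uint8_t a, uint8_t b)
{
    return mul_lut[a ^ 0xff][b ^ 0xff] ^ 0xff;
}

/* The "subtractive" blend (Magenta * Yellow = Red) is a straight multiply. */
static inline uint8_t blend_subtractive(uint8_t a, uint8_t b)
{
    return mul_lut[a][b];
}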

Wagi said:

Most color operations can be done with a single 64k lookup table and only a few math operations.


Depends on which operations, and with what color model.

With traditional 256 fixed-color renderers, sure, you can make a 256*256 translucency or other such table which maps pairs of colors to another color, but determining which color in particular in order to obtain a particular effect is not really trivial: Boom derivatives have a whole function dedicated to pre-computing this table (or pre-loading it, since computing it is really time-consuming), and it involves taking any two palette colors, converting them to RGB, mixing them in RGB, and then checking which one of the EXISTING palette colors best matches their mixture. And then the table will only be 100% valid for one palette (e.g. the green radsuit palette would need its own table, but this is often not done).

And you have to do this 65K times (or 65K/2, at least, since the blend is symmetric), with a running time of O(n^3) where n is the number of colors, and a space requirement of O(n^2).
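
In rough code, that precompute looks something like the following; a structural sketch of the description above only, assuming a 256-entry RGB palette, not Boom's actual implementation:

#include <limits.h>
#include <stdint.h>

typedef struct { uint8_t r, g, b; } rgb_t;

/* O(n) nearest-color search; this inner loop is what makes the build O(n^3). */
static uint8_t best_match(const rgb_t pal[256], int r, int g, int b)
{
    long best = LONG_MAX;
    uint8_t idx = 0;

    for (int i = 0; i < 256; i++)
    {
        long dr = pal[i].r - r, dg = pal[i].g - g, db = pal[i].b - b;
        long dist = dr * dr + dg * dg + db * db;
        if (dist < best) { best = dist; idx = (uint8_t)i; }
    }
    return idx;
}

/* O(n^2) space, valid only for the palette it was built against. */
uint8_t tranmap[256][256];

void build_tranmap(const rgb_t pal[256])
{
    for (int x = 0; x < 256; x++)
        for (int y = 0; y < 256; y++)
            tranmap[x][y] = best_match(pal,
                (pal[x].r + pal[y].r) / 2,   /* mix the pair 50/50 in RGB */
                (pal[x].g + pal[y].g) / 2,
                (pal[x].b + pal[y].b) / 2);
}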

Stuff like DelphiDoom or _bruce_'s port just does those operations in real time in non-indexed color spaces. Once you deem the performance overhead "acceptable", sure, you no longer have to use palettes or LUTs for everything (though a mixed mode is possible, e.g. applying colorization/transparency in "full processing mode" but still using LUTs for certain things).

Maes said:

And then the table will only be 100% valid for one palette (e.g. the green radsuit palette would need its own table, but this is often not done).

Why? I think that's correct. It's meant as an after-effect, so you need to calculate the resultant color with the usual palette and then apply the green tint. That's what the ports are doing.

Maes said:

And you have to do this 65K times (or 65K/2, at least, since the blend is symmetric), with a running time of O(n^3) where n is the number of colors, and a space requirement of O(n^2).

The naive color-matching algorithm is slow, yes, but I think table generation can be sped up to O(n^(2+eps)).


Such tables can be useful for putting guard bits into the RGB representation, so that math operations can be performed directly on the packed value. (This is a known technique; I have not looked at their code to see if they use it.)
If the texture is stored as an 8- or 9-bit color index, look up a 10-bit-per-channel RGB value (2 guard bits per channel) from a 32-bit table.
You can then add colors without bleeding Green into Red.
A mask operation cleans up, and you can shift left and right.
You can multiply by a constant (5,5,5) in the same 10-bit format.
You can multiply by a tint (8,3,1) in the same 10-bit format.
The most expensive operation is then converting (10,10,10) to (8,8,8), (5,6,5) or (5,5,5) for display. I have seen tables used for that too (each channel applied individually), so that the math saturates at max and min instead of overflowing.
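
A minimal sketch of that guard-bit packing, assuming three 10-bit fields in a 32-bit word; all names here are illustrative:

#include <stdint.h>

/* Pack 8-bit channels into 10-bit fields: 2 guard bits above each channel. */
static inline uint32_t pack_rgb(uint32_t r, uint32_t g, uint32_t b)
{
    return (r << 20) | (g << 10) | b;
}

/* Per-channel saturation: if the guard bits caught an overflow, clamp to 255. */
static inline uint32_t clamp_packed(uint32_t c)
{
    uint32_t r = (c >> 20) & 0x3FF;
    uint32_t g = (c >> 10) & 0x3FF;
    uint32_t b = c & 0x3FF;
    if (r > 255) r = 255;
    if (g > 255) g = 255;
    if (b > 255) b = 255;
    return (r << 20) | (g << 10) | b;
}

/* Adding two packed colors cannot bleed Green into Red: each field holds at
   most 255+255 = 510, which still fits in its 10 bits. */
uint32_t add_packed(uint32_t a, uint32_t b)
{
    return clamp_packed(a + b);
}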

Such a table is not so large that you cannot have several of them with precomputed results too, especially for the tint and the conversion to the output format.

With the table-used-for-math idea, it is not necessary to convert back to the one palette. There can be multiple palettes, some larger than others. You just have to keep track of which palette a table's result indexes into, and which other tables can be applied to it.
A 10-bit palette index is space-wasteful, but a table that yields an 8-bit green-palette index takes less space, and the green-palette lookup table (to 16-bit color) takes no more space than if it were a sub-section within a 10-bit palette table. The palette id can be a variable carried through the color translation too, identifying which output palette the final 16-bit color should be taken from. This generates many smaller tables, some of which can be eliminated by testing that palette-id byte for degenerate cases. Having one large 10-bit final table may share colors between operations, but really, how many operations upon 256 colors are going to end up with the same color out of 16M? Have the intensity modulation tables output the same palette that they take as input and most of that problem will be minimal.

Maes said:

Depends on which operations, and with what color model.

I think you misunderstood what I said. I'm not talking about an 8-bit TRANMAP. I'm talking about a lookup table that can, with perfect accuracy, be used to assist color calculations in an RGBA colorspace. As long as each channel (red, green, blue, alpha) has one byte dedicated to it, it will work, and quickly (at least quickly for something that's not hardware accelerated). You're right, though, in that such a lookup table would be useless in an HSV colorspace.

The "different palettes" problem can be solved by simply drawing with the palette being used. This is what my truecolor port does. For example, it would pull from a palette that already has the radsuit effect applied, and then blend it with the "black" for that palette (which would in this case be a very dark green).

tempun said:

Why? I think that's correct. It's meant as an after-effect, so you need to calculate the resultant color with usual palette and then apply the green tint.


Not if we're talking about precalculated indexed (paletted) tables here. Color indexes #X and #Y may give color index #Z for palette 0, but may give a completely different color for a green-tinted palette (which is, for all intents and purposes, a different palette). Imagine the above scenario with one of the almost all-red pain palettes ;-)

Of course, if we are talking about full colorspace processing and accepting any resulting overheads as inevitable, the above point is moot, like most of the indexed limitations.

tempun said:

That's what the ports are doing. The naive color-matching algorithm is slow, yes, but I think table generation can be sped up to O(n^(2+eps)).


Well, the O(n^2) part will still be dominant for a table of rank n, no matter how quickly you can produce the matching for any two given colors, even if it's O(1) per pair.

Wagi said:

I think you misunderstood what I said. I'm not talking about an 8-bit TRANMAP. I'm talking about a lookup table that can, with perfect accuracy, be used to assist color calculations in an RGBA colorspace. As long as each channel (red, green, blue, alpha) has one byte dedicated to it, it will work, and quickly (at least quickly for something that's not hardware accelerated). You're right, though, in that such a lookup table would be useless in an HSV colorspace.


Hmm... that's actually a good workaround, at least for the RGB color space, but only if the three color channels are truly independent. E.g. in the case of simple average transparency T between two pixels X and Y, akin to what TRANMAP is doing, the Red output only depends on the Red channel of X and the Red channel of Y, i.e. Tr = f(Xr, Yr), therefore you could build a LUT only for the red channel, and so on for the others. So a "truecolour transparency LUT" that performs an accurate RGB truecolour average would "only" require 3x64K tables, which is indeed not bad, considering that with a single 64K table you only cover the indexed case of a specific palette.
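
A minimal sketch of that averaging case, assuming 0xRRGGBB pixels; one 64K table per channel as described (for a plain average all three come out identical, which is exactly the point Wagi makes a few posts below), with all names illustrative:

#include <stdint.h>

/* One table per channel: lut_r[x][y] == (x+y)/2, and likewise for G and B. */
static uint8_t lut_r[256][256], lut_g[256][256], lut_b[256][256];

void init_avg_luts(void)
{
    for (int i = 0; i < 256; i++)
        for (int j = 0; j < 256; j++)
            lut_r[i][j] = lut_g[i][j] = lut_b[i][j] = (uint8_t)((i + j) / 2);
}

/* 50/50 blend of two 0xRRGGBB pixels; the channels never interact. */
uint32_t blend_avg(uint32_t x, uint32_t y)
{
    uint8_t r = lut_r[(x >> 16) & 0xff][(y >> 16) & 0xff];
    uint8_t g = lut_g[(x >>  8) & 0xff][(y >>  8) & 0xff];
    uint8_t b = lut_b[ x        & 0xff][ y        & 0xff];
    return ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;
}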

The problems start if you consider effects where e.g. Tr is NOT independent of Xg, Xb, Yg and Yb. In that case, you will need 9 such tables (every possible combination Xi*Yj where i,j = {r,g,b}), plus an additional weight matrix.

The simple transparency table can be thought of as a special case where only the terms Xi*Yj with i == j are retained (a diagonal system), and where the weight matrix is unitary.

Edit: heh I was about to draw some matrices and formulas here, but nm ;-)

LUTs are OK if the effects you are using them for are between completely orthogonal channels. However, if you tried to make an effect that's based on the values of all three channels (e.g. their overall luminosity) with RGB LUTs, then you'd be in deep shit, and would need a heavyweight full 3x3 system of matrices plus weights.

However, you could construct a lightweight "sparse" system using similar tables for HSV, where it would be easier to implement certain effects like global hue or luminosity shifts. OTOH, transparency would be awkward in such a color system.

Heh Doom: Photoshop D.I.P. edition? ;-)


"only" require 3x64K tables

Nah, you can reuse the same lookup table for each channel.

Wagi said:

Nah, you can reuse the same lookup table for each channel.


Heh, good point, but only for this very specific effect of isotropic, symmetric transparency ;-) So you would have a further simplified (diagonal, unitarily-weighted AND value-aliased) system ;-)


Well, the kinds of transparency where the channels are co-dependent on each other are extremely limited in use anyway. I doubt anybody will be using an effect that shifts the hue (like the lost soul in the screenshot) in an actual level, unless they're making a Doom level where you're on LSD. Except for when the "desaturation" value is set in Sector_SetColor, even ZDoom sticks exclusively to "isotropic, symmetric transparency". Like I said, even the "Additive" draw mode can be replicated with the aforementioned LUT.

_bruce_ said:

As of now I have no tool to properly define the sector colors so the shot is just random sector color.


If it weren't a Chocolate Doom hack, I'd suggest porting over the ColorSetting thing from ZDoom, but you can't set args in the Doom format.

You'll pretty much have to keep it random, write up some heuristics based on the textures used, or create some additional control lump external to the map, like SECTINFO or EE's ExtraData. (Or implement UDMF support. Or port your work to ZDoom. :p)

_bruce_ said:

As of now I have no tool to properly define the sector colors so the shot is just random sector color.


You can take a lesson from ports like cDoom or RoRDoom, which use indirect ways to encode extra information about a sector without changing the map format or requiring extra lumps, while remaining compatible with editors and non-extended ports (to a degree): in order to define its "Room over Room" sectors, cDoom maps contain dummy sectors outside the play area and link their linedefs to the desired sector via its sector tag. Said linedefs have special values not normally used by the Doom engine (or by Boom, but that's not an absolute).

E.g. you could encode the color information for a sector using three new linedef actions with base numbers 0x100, 0x200 and 0x400 for R, G and B, and encode the channel value in the low byte of the action number, e.g. white would be 0x1FF, 0x2FF, 0x4FF. Of course, that means reserving a good deal of linedef action values.

You would make a dummy triangular sector outside the map, containing those three linedefs with said actions, and pointing to the sector you want the effect applied to via its tag. At load time, you search for such special linedefs and dummy sectors, decode them, and apply the effects as appropriate.
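
A load-time decode of that hypothetical scheme might look like this sketch; the struct fields are simplified, and the 0x100/0x200/0x400 ranges and all names belong to the hypothetical encoding above, not to cDoom or any other port:

#include <stdint.h>

typedef struct { int special; int tag; } line_t;        /* simplified */
typedef struct { int tag; uint8_t r, g, b; } sector_t;  /* simplified */

void P_DecodeSectorColors(line_t *lines, int numlines,
                          sector_t *sectors, int numsectors)
{
    for (int i = 0; i < numlines; i++)
    {
        int base = lines[i].special & 0xF00;  /* which channel: R, G or B */
        int val  = lines[i].special & 0x0FF;  /* 0..255 channel value     */

        /* Assumes these action ranges are otherwise unused by the engine. */
        if (base != 0x100 && base != 0x200 && base != 0x400)
            continue;

        /* Apply to every sector sharing the dummy linedef's tag. */
        for (int s = 0; s < numsectors; s++)
        {
            if (sectors[s].tag != lines[i].tag)
                continue;
            if (base == 0x100)      sectors[s].r = (uint8_t)val;
            else if (base == 0x200) sectors[s].g = (uint8_t)val;
            else                    sectors[s].b = (uint8_t)val;
        }
    }
}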

It's easier to see than to explain, so try opening a cDoom map (e.g. multi.wad, multi2.wad) and see how the "room over room" effects are achieved. You can use the same trick for encoding RGB sector colors (you could also encode this or even additional information in the height of the dummy sectors, for instance).


Boom used the texture name field for the color blend in its colormap specials, so you only need one special. That seems a lot safer than grabbing entire ranges of action numbers and splitting the RGB value across three different lines.


Yeah, that's a much better solution. Doomsday allows doing something similar with XG, in fact (encoding values used for other purposes into otherwise unused members of the map data structures).

