Chocolate Doom


You don't need a modern video card or modern anything to run legacy software. In fact, this glitch just goes to show that newer = worse, and many people acknowledge this fact, deliberately keeping their hardware "outdated" to retain compatibility/usability.

It's never a good thing to generalize on stuff like this.


One can also say that 'keeping old hardware == bad', especially if the software you are concerned about is still being developed. You are not doing the developer any favors that way.


Dying? I thought they were effectively buried years ago on the home desktop (discounting specialist video chipsets which less than 1% of the potential user base will own).

The only benefit to using an 8bit colour mode is the implicit "palettization". These days the same can probably be achieved more efficiently in a GL renderer in low resolution 24bit colour and using a fragment shader to do the palette quantization (especially if you're pixel-doubling up the final render) - but is this even desirable?

What most people in the DOOM community actually want from 8bit colour mode (at least in a GL renderer) is a shared texture palette - which is still supported in OpenGL through other means (giving a healthy video memory saving to boot). In fact, this is actually the best way to handle DOOM's global PLAYPAL in a GL renderer. The problems with this method only surface when you want to mix paletted and "true colour" textures in the same scene. This is why we don't use paletted textures in Doomsday by default (it's a command-line option).
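
To illustrate the "other means": store the texels as 8-bit indices and resolve them against a small palette texture in a fragment shader. Something along these lines - only a rough sketch with made-up names, not Doomsday's actual code:

/* Shared-palette lookup done in a fragment shader.  The texture atlas
 * stores 8-bit palette indices in its red channel (sampled with
 * GL_NEAREST), and "playpal" is a 256x1 RGB texture built from PLAYPAL.
 * Illustrative only. */
static const char *palette_frag_src =
    "uniform sampler2D indexTex;  /* 8-bit indices, one per texel */\n"
    "uniform sampler2D playpal;   /* 256x1 palette texture        */\n"
    "void main(void)\n"
    "{\n"
    "    float index = texture2D(indexTex, gl_TexCoord[0].st).r;\n"
    "    /* Remap 0..1 to the centre of one of the 256 palette cells\n"
    "       so adjacent entries never bleed into each other. */\n"
    "    vec2 uv = vec2((index * 255.0 + 0.5) / 256.0, 0.5);\n"
    "    gl_FragColor = texture2D(playpal, uv);\n"
    "}\n";

Swapping the global palette is then just a 256-texel re-upload, and the indexed textures take a quarter of the video memory of RGBA8.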

Porsche Monty said:

You don't need a modern video card or modern anything to run legacy software. In fact, this glitch just goes to show that newer = worse, and many people acknowledge this fact, deliberately keeping their hardware "outdated" to retain compatibility/usability.


Forget DirectX 11 and OpenGL 4.1: this graphics card/operating system is worse because it has a corner-case issue with a 15+ year old video mode that nobody really uses anymore? Really?

Besides, forcing an 8-bit palette when you're fullscreen just makes things look ugly when you're alt-tabbing out. I actually like whatever solution ZDoom does in software mode, though from what I understand it bypasses SDL for native Windows drawing functions, so maybe it's an SDL issue.

Graf Zahl said:

One can also say that 'keeping old hardware == bad', especially if the software you are concerned about is still being developed. You are not doing the developer any favors that way.


It really depends on the complexity of the project and the hardware/os the vast majority of people actually use.

In any case, mentioning "gl" and "shaders" in a Chocolate Doom discussion is stepping straight into the bollocks zone.

AlexMax said:

Besides, forcing an 8-bit palette when you're fullscreen just makes things look ugly when you're alt-tabbing out


I don't get any of that. Must be your crap computer.

Porsche Monty said:

In any case, mentioning "gl" and "shaders" in a Chocolate Doom discussion is stepping straight into the bollocks zone.

I don't see how. Zdoom uses DirectDraw when doing software rendering ;) I see no reason why screen-space fragment shaders cannot be utilized over soft-rendered scene components.

Porsche Monty said:

In any case, mentioning "gl" and "shaders" in a Chocolate Doom discussion is stepping straight into the bollocks zone.


Why do you care what is driving the display as long as the end result looks the same? Using OpenGL does not automatically mean you're on a one way trip to looking like GZDoom.

Porsche Monty said:

I don't get any of that. Must be your crap computer.


Pretty much every source port I've ever used that has actually used a true 8-bit mode for fullscreen has done this, on any Windows computer I've ever owned. Linux and MacOS have not, but I suspect they upscale the window to 32-bit colors on their own (which might explain why, when randy originally ported ZDoom to Linux an age and a half ago, he got such poor performance).

DaniJ said:

I don't see how. Zdoom uses DirectDraw when doing software rendering ;) I see no reason why screen-space fragment shaders cannot be utilized over soft-rendered scene components.


You realize how inefficiently shaders perform on legacy hardware (e.g. GeForce 6), don't you?

DaniJ said:

I don't see how. Zdoom uses DirectDraw when doing software rendering ;) I see no reason why screen-space fragment shaders cannot be utilized over soft-rendered scene components.


Actually, it uses D3D and a shader to render the scene as a paletted 8 bit texture. DirectDraw is only used as a fallback on old hardware.

Porsche Monty said:

You realize how inefficiently shaders perform on legacy hardware (e.g. GeForce 6), don't you?


Which begs the question: how many people still stick to such old cards? I think the ones not wanting to upgrade use even older hardware. BTW, ZDoom's shader-based renderer still works with acceptable performance on my old system, dated 2004, which has a GF6800. There are far worse bottlenecks on that system that slow things down more than the slight performance loss from using a shader.

Porsche Monty said:

You realize how inefficiently shaders perform on legacy hardware (e.g. GeForce 6), don't you?


Then you can use the fallback rendering methods.

EDIT: That is, if the performance hit is anything more than negligible, which apparently even a Geforce 6 series can handle well enough.

AlexMax said:

Why do you care what is driving the display as long as the end result looks the same? Using OpenGL does not automatically mean you're on a one way trip to looking like GZDoom.


It would have to be pixel-perfect and at least as fast as it is now on my system, plus screenshots would have to remain paletted pcx's. No functionality or compatibility compromises whatsoever. If that can be dealt with, then I wouldn't care.


Depends on where the screenshot is grabbed from. As long as it's from the software buffer, of course it's still paletted. Unless you do some postprocessing after rendering it to the screen that's not a problem. (Using PCX though, that is a problem these days.)

Graf Zahl said:

Depends on where the screenshot is grabbed from. As long as it's from the software buffer, of course it's still paletted. Unless you do some postprocessing after rendering it to the screen that's not a problem. (Using PCX though, that is a problem these days.)


Why is it an issue? Couldn't it be converted to PCX for the grognards who don't like 8-bit PNG files for whatever reason?
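
For what it's worth, writing the paletted screenshot out as an 8-bit PNG is only a handful of libpng calls. A rough sketch, kept deliberately minimal and not taken from any port's actual screenshot code:

/* Dump an 8-bit indexed framebuffer as a paletted PNG.
 * "screen" holds width*height palette indices, "pal" is 256 RGB triples.
 * Illustrative only; error handling is minimal. */
#include <stdio.h>
#include <png.h>

int WritePalettedPNG(const char *path, const unsigned char *screen,
                     int width, int height, const unsigned char *pal)
{
    FILE *fp = fopen(path, "wb");
    png_structp png;
    png_infop info;
    png_color palette[256];
    int i, y;

    if (fp == NULL)
        return 0;

    png = png_create_write_struct(PNG_LIBPNG_VER_STRING, NULL, NULL, NULL);
    info = png_create_info_struct(png);

    if (setjmp(png_jmpbuf(png)))        /* libpng reports errors here */
    {
        png_destroy_write_struct(&png, &info);
        fclose(fp);
        return 0;
    }

    png_init_io(png, fp);
    png_set_IHDR(png, info, width, height, 8, PNG_COLOR_TYPE_PALETTE,
                 PNG_INTERLACE_NONE, PNG_COMPRESSION_TYPE_DEFAULT,
                 PNG_FILTER_TYPE_DEFAULT);

    for (i = 0; i < 256; ++i)           /* copy the game palette across */
    {
        palette[i].red   = pal[i * 3 + 0];
        palette[i].green = pal[i * 3 + 1];
        palette[i].blue  = pal[i * 3 + 2];
    }
    png_set_PLTE(png, info, palette, 256);
    png_write_info(png, info);

    for (y = 0; y < height; ++y)        /* one row of indices per call */
        png_write_row(png, (png_bytep)(screen + y * width));

    png_write_end(png, NULL);
    png_destroy_write_struct(&png, &info);
    fclose(fp);
    return 1;
}

The file stays genuinely paletted, so converting it back to PCX afterwards would be lossless anyway.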

AlexMax said:

Then you can use the fallback rendering methods.

EDIT: That is, if the performance hit is anything more than negligible, which apparently even a Geforce 6 series can handle well enough.


I can tell you that pretty much anything shader-based on a 6200 sends performance down the gutter; even a 6600 will suffer. Only the most expensive 6800s can handle it better.

Graf Zahl said:

Actually, it uses D3D and a shader to render the scene as a paletted 8 bit texture. DirectDraw is only used as a fallback on old hardware.

Ah, OK. I knew it was something along those lines; it's been a long time since I've looked at ZDoom's source. Thanks for the clarification.

DaniJ said:

What most people in the DOOM community actually want from 8bit colour mode (at least in a GL renderer) is a shared texture palette - which is still supported in OpenGL through other means (giving a healthy video memory saving to boot)

Do you mean GL_EXT_paletted_texture? It has not worked for about 10 years. At all. GLBoom still has support for it, but it is not supported even on 8-10 year old hardware.

DaniJ said:

What most people in the DOOM community actually want from 8bit colour mode

8-bit color mode is still 1.5-2x faster than 32-bit mode. So if you get 60 fps on sunder.wad in 8-bit, you will get 40 in 32-bit mode (software).


Actually, GL_EXT_paletted_texture is properly supported on GeForce FX cards (barely 7 years old), allegedly through emulation. A driver older than the latest available for the card is still required, though.
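
For anyone wondering what using the extension actually involves on a driver that still exposes it, it is roughly the following - a sketch with made-up function names, not lifted from GLBoom-Plus or any other port:

/* Minimal GL_EXT_paletted_texture usage on Windows.  Illustrative only. */
#include <string.h>
#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>

static PFNGLCOLORTABLEEXTPROC glColorTableEXT_p;

/* Returns 1 if 8-bit indexed textures can be uploaded directly. */
int InitPalettedTextures(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);

    if (ext == NULL || strstr(ext, "GL_EXT_paletted_texture") == NULL)
        return 0;

    glColorTableEXT_p =
        (PFNGLCOLORTABLEEXTPROC)wglGetProcAddress("glColorTableEXT");
    return glColorTableEXT_p != NULL;
}

/* Upload an 8-bit indexed texture along with its 256-entry RGB palette.
 * (With GL_EXT_shared_texture_palette the first call would instead use
 * GL_SHARED_TEXTURE_PALETTE_EXT, giving one palette for all textures.) */
void UploadIndexedTexture(const unsigned char *pixels, int w, int h,
                          const unsigned char *playpal /* 256*3 bytes */)
{
    glColorTableEXT_p(GL_TEXTURE_2D, GL_RGB8, 256, GL_RGB,
                      GL_UNSIGNED_BYTE, playpal);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, w, h, 0,
                 GL_COLOR_INDEX, GL_UNSIGNED_BYTE, pixels);
}

Whether the driver then does the lookup in hardware or quietly expands everything to RGB behind your back is exactly the emulation question being argued about here.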


That can't be right, entryway; that would practically predate the Doomsday project entirely. I know for a fact it was working up to a couple of years ago, as I had to re-implement support myself due to engine architecture changes.

Obviously there is the performance aspect to consider but that does not alter the validity of my assertion. Most users do not associate 8bit colour with performance.

DaniJ said:

That can't be right, entryway; that would practically predate the Doomsday project entirely. I know for a fact it was working up to a couple of years ago, as I had to re-implement support myself due to engine architecture changes.

It does not work on my 4-year-old 8800GTS, and it did not work on my previous Radeon 9800PRO. At least not with the drivers I used.

From opengl.org:
Selected NVIDIA GPUs: NV1x (GeForce 256, GeForce2, GeForce4 MX, GeForce4 Go, Quadro, Quadro2), NV2x (GeForce3, GeForce4 Ti, Quadro DCC, Quadro4 XGL), and NV3x (GeForce FX 5xxxx, Quadro FX 1000/2000/3000).

NV3 (Riva 128) and NV4 (TNT, TNT2) GPUs and NV4x GPUs do NOT support this functionality (no hardware support). Future NVIDIA GPU designs will no longer support paletted textures.

S3 ProSavage, Savage 2000. 3Dfx Voodoo3, Voodoo5. 3Dlabs GLINT.

Yeah, S3 ProSavage and 3Dfx Voodoo3 support that shit. Cool! GLBoom-Plus was born after it had already stopped working.


I don't understand how you can say "it doesn't work". Surely you mean it hasn't worked for you on the hardware you've tested it with.

EDIT:

Future NVIDIA GPU designs will no longer support paletted textures

Explicitly? Perhaps. However paletted texture functionality can be replicated using other features that won't be disappearing any time soon.

DaniJ said:

I don't understand how you can say "it doesn't work". Surely you mean it hasn't worked for you on the hardware you've tested it with.

Yes, it has not been supported by any hardware I have used for the last 7 or 8 years. It worked only for a 'short' period of GeForce4 MX-like GPUs (the Quake 3 era). It did not work before (TNT), and it does not work after.

DaniJ said:

EDIT:
Explicitly? Perhaps. However paletted texture functionality can be replicated using other features that won't be disappearing any time soon.



I think that's what the entire shader discussion was about. Anyway, doing it with shaders is fine if all you want to do is non-filtered screen blitting. For any real use the added work is just too much to even consider. Any semi-recent graphics card has enough RAM to load the textures in full 32-bit.


It seems to me that GPUs supporting paletted textures really aren't relevant; that alone doesn't solve the core problem. It's already been mentioned that expanding to 32-bit will give you the same results unless you're shading. That's the real issue anyway: how to do shading (varying light levels) using the palette and colormap, so that the shading gradients are 8-bit and authentic. If you generate a 2D texture from your palette/colormap, you can then index it with the palette index from the texture and the light level to get the final 8-bit color value. This requires one dependent texture read per pixel, from a low-resolution texture. Modern cards can do this easily, and even pixel shader 1 cards can do it reasonably fast at low to moderate resolutions.
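
Expressed as a fragment shader it looks roughly like this - a sketch only, with illustrative texture names and row layout, not taken from an existing port:

/* The 2D lookup texture described above: 256 columns (palette index) by
 * NUMCOLORMAPS rows (light level), each cell holding the final RGB of
 * COLORMAP[row][column] resolved through PLAYPAL.  Illustrative only. */
static const char *lighttable_frag_src =
    "uniform sampler2D indexTex;    /* texels hold 8-bit palette indices */\n"
    "uniform sampler2D lightTable;  /* 256 x NUMCOLORMAPS RGB lookup     */\n"
    "uniform float lightLevel;      /* 0.0 = brightest row               */\n"
    "void main(void)\n"
    "{\n"
    "    float index = texture2D(indexTex, gl_TexCoord[0].st).r;\n"
    "    /* Single dependent read: (palette index, light level) selects\n"
    "       a precomputed, authentically quantized 8-bit shade. */\n"
    "    vec2 uv = vec2((index * 255.0 + 0.5) / 256.0, lightLevel);\n"
    "    gl_FragColor = texture2D(lightTable, uv);\n"
    "}\n";

With lightTable sampled using GL_NEAREST, the banding of the software renderer is reproduced exactly; switch to GL_LINEAR along the light axis and you get smoother attenuation instead.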

Of course, as soon as you mix in true-color textures or use filtering, things become more complicated, so it's understandable that many source ports don't worry about this (or make it optional). I only bring it up because some people seem to be barking up the wrong tree (explicit GPU support for paletted textures).


Thanks for all the comments, guys. It definitely seems like it would be a good idea to add an 8in32 mode like Eternity has. It should then be straightforward to choose different defaults for Windows Vista/7.
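
At its core an 8in32 mode is just an index-to-truecolour expansion before the blit, roughly along these lines - a sketch only, with made-up names, not Eternity's or Chocolate Doom's actual implementation:

/* Keep rendering into the usual 8-bit indexed buffer, but hand the OS a
 * 32-bit surface so no palettized display mode is ever set.
 * Illustrative only. */
#include <stdint.h>

static uint32_t rgb_table[256];   /* rebuilt whenever the palette changes */

void SetPalette8in32(const uint8_t *playpal /* 256*3 bytes */)
{
    int i;

    for (i = 0; i < 256; ++i)
    {
        rgb_table[i] = ((uint32_t)playpal[i * 3 + 0] << 16)   /* red   */
                     | ((uint32_t)playpal[i * 3 + 1] << 8)    /* green */
                     |  (uint32_t)playpal[i * 3 + 2];         /* blue  */
    }
}

void Blit8in32(const uint8_t *src, uint32_t *dest, int num_pixels)
{
    int i;

    /* Expand each palette index to packed XRGB; the result goes to SDL
     * (or whatever backs the window) as an ordinary 32-bit surface. */
    for (i = 0; i < num_pixels; ++i)
        dest[i] = rgb_table[src[i]];
}

Palette changes become a 256-entry table rebuild instead of a hardware palette update, which also sidesteps the alt-tab ugliness AlexMax mentioned.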

Porsche Monty said:

You don't need a modern video card or modern anything to run legacy software. In fact, this glitch just goes to show that newer = worse, and many people acknowledge this fact, deliberately keeping their hardware "outdated" to retain compatibility/usability.

Porsche Monty said:

I don't get any of that. Must be your crap computer.

Comments like this really aren't helpful at all, and appear to miss the fact that the entire purpose of Chocolate Doom is to run on modern computers.


I've recently attempted to use the latest raven-branch win32 release to play Hexen, but I've encountered a serious issue that renders the port unplayable. Specifically, if I attempt to play in windib mode, I run into the annoying palette-delay bug that bothered so many users until SDL incorporated a fix. While the patch resolved the delay for me when using directx, it still occurs with windib. This wouldn't be a problem; however, directx mode also exhibits an issue of its own.

When playing in directx, the game suffers from severe stuttering and slowdown. This isn't exclusive to Chocolate Doom, though, as I've also encountered it in PrBoom-Plus. In that port, if I enable vsync while using the directx renderer with the screen multiply function, it also produces severe stuttering. But this is resolved if vsync is disabled. I would therefore assume that something directly related to the way these two SDL-based ports stretch the game image to the desired resolution conflicts with vsync. As far as I can tell, Chocolate Doom lacks any specific settings present in the cfgs to toggle vsync. Is vsync forced in Chocolate Doom when using the directx renderer, and has anyone else encountered this problem? Any help would be appreciated. I'm running XP32, by the way. Thanks!

lucius said:

It seems to me that GPUs supporting paletted textures really aren't relevant; that alone doesn't solve the core problem. It's already been mentioned that expanding to 32-bit will give you the same results unless you're shading. That's the real issue anyway: how to do shading (varying light levels) using the palette and colormap, so that the shading gradients are 8-bit and authentic. If you generate a 2D texture from your palette/colormap, you can then index it with the palette index from the texture and the light level to get the final 8-bit color value. This requires one dependent texture read per pixel, from a low-resolution texture. Modern cards can do this easily, and even pixel shader 1 cards can do it reasonably fast at low to moderate resolutions.

Whoah, slow down there, lucius. We weren't even discussing the modelling of DOOM's light attenuation on the GPU. That is why it seems irrelevant and the wrong tree - it is, if that's what you thought we were talking about.

Remember that most of the ports mentioned in this discussion have been around long before the pseudo-solution you mention was even an option.

DaniJ said:

Whoah, slow down there, lucius. We weren't even discussing the modelling of DOOM's light attenuation on the GPU. That is why it seems irrelevant and the wrong tree - it is, if that's what you thought we were talking about.

Remember that most of the ports mentioned in this discussion have been around long before the pseudo-solution you mention was even an option.

Fair enough, I must have misinterpreted some of the discussion then - my bad. Part of the issue, though, was seeing people discussing emulating paletted textures on the GPU, which seems pretty useless to me if you're not trying to model the light attenuation. I suppose you can save some memory, but it just doesn't seem worth the cost of a dependent texture read just for that.

I guess the exception could be ZDoom, if the software renderer outputs color index values into a buffer and the pixel shader is then applied - but I got the impression that people were talking about using emulated (or explicit) paletted textures as replacements for 32-bit textures in the general case, where it didn't seem like a good idea by itself.

Anyway, sorry for the misunderstanding. :)

This topic is now closed to further replies.