ultdoomer

ZDoom vs. GZDoom


Does the wraparound sky not work in Legacy? What happens if you set that in the FS header?

Graf Zahl said:

The original, definitely. However, I have almost completely rewritten that part so the version in GZDoom doesn't have any memory management issues anymore.

Good to know.

To me the biggest issue with the existing code base is that expression evaluation does not know real operator precedence but I fear if I fixed that some Legacy WADs wouldn't run anymore.

It should handle operator precedence. Doesn't the order of the list in t_oper.c determine the precedence ordering?
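If the table does drive precedence, the usual pattern is precedence climbing over it. Here's a minimal C sketch; the table values, names, and structure are illustrative, not the actual t_oper.c code:

```c
#include <assert.h>
#include <ctype.h>

/* Illustrative precedence table, standing in for the operator list
 * in t_oper.c (names and values are hypothetical). */
static int prec(char op) {
    switch (op) {
    case '+': case '-': return 1;
    case '*': case '/': return 2;
    default:            return 0;   /* not a binary operator */
    }
}

static const char *cur;  /* cursor into the expression string */

static int parse_number(void) {
    int v = 0;
    while (isdigit((unsigned char)*cur))
        v = v * 10 + (*cur++ - '0');
    return v;
}

/* Precedence climbing: consume operators that bind at least as
 * tightly as min_prec; recurse with a higher floor for the right
 * operand so operators end up left-associative. */
static int eval_expr(int min_prec) {
    int lhs = parse_number();
    while (prec(*cur) >= min_prec && prec(*cur) > 0) {
        char op = *cur++;
        int rhs = eval_expr(prec(op) + 1);
        switch (op) {
        case '+': lhs += rhs; break;
        case '-': lhs -= rhs; break;
        case '*': lhs *= rhs; break;
        case '/': lhs /= rhs; break;
        }
    }
    return lhs;
}

int eval(const char *expr) {
    cur = expr;
    return eval_expr(1);
}
```

With a table like this, 2+3*4 evaluates to 14 rather than 20, which is exactly the kind of result that changes if an evaluator currently works strictly left to right.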

Graf Zahl said:

Does the wraparound sky not work in Legacy? What happens if you set that in the FS header?

It works in hardware rendering mode, but not in software, where it gives tutti-frutti effects.
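For context, one common cause of tutti frutti in software renderers (whether or not it's Legacy's actual bug here, which I haven't checked) is wrapping the texture row with a power-of-two bitmask. A sketch of the two wrap strategies:

```c
#include <assert.h>

/* Classic Doom-style column renderers wrap the texture row with a
 * bitmask; this assumes a power-of-two texture height. For other
 * heights it silently samples the wrong row, scrambling the column. */
int wrap_pow2(int row, int height) {
    return row & (height - 1);  /* only correct for power-of-two heights */
}

/* A modulo wrap is correct for any height, at some per-pixel cost. */
int wrap_modulo(int row, int height) {
    return ((row % height) + height) % height;
}
```

For a 128-tall sky both give the same row (row 250 wraps to 122 either way), but for, say, a 200-tall texture the mask version picks row 194 where the correct wrap is row 50: the scrambled sampling is what shows up on screen as tutti-frutti garbage.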

kristus said:

It works in hardware rendering mode, but not in software, where it gives tutti-frutti effects.


Well, normally I'd say 'screw this piece of sh*t' and focus on the ports that are popular and widely used instead of the most bug-ridden infestation with a diminishing user base - but obviously it's not my decision.

However, since I don't want GZDoom to be compromised I added a new key 'ignore' to the FS level info parser. Put 'ignore = 1' in the first line of your level info and all upcoming GZDoom versions won't read its contents.
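As a rough sketch of what honoring such a key can look like (the function name is hypothetical, not GZDoom's actual parser code; it expects the key in the "ignore = 1" form described above):

```c
#include <stdio.h>
#include <string.h>

/* Return 1 if the level info block opts out of parsing. Per the
 * convention above, "ignore = 1" must be on the first line; any
 * other first line leaves the contents readable as before. */
int levelinfo_ignored(const char *text) {
    char key[32];
    int value = 0;
    /* %31s grabs the first token, then " = <int>" must follow. */
    if (sscanf(text, "%31s = %d", key, &value) == 2)
        return strcmp(key, "ignore") == 0 && value == 1;
    return 0;
}
```

A check like this can run before the real parser ever touches the rest of the block, so the opt-out costs nothing for WADs that don't use it.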


Another option could be to redefine SKY4 in a TEXTURES lump. Legacy would ignore that and presumably it would override the TEXTUREx entry.
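For reference, such an override might look like this in a TEXTURES lump; the dimensions and patch name below are placeholders, not SKY4's real ones:

```
// Hypothetical TEXTURES entry; substitute the texture's real size and patch.
Texture SKY4, 1024, 128
{
    Patch SKY4PAT, 0, 0
}
```

Since Legacy doesn't read TEXTURES, it would keep using the TEXTUREx entry while TEXTURES-aware ports pick up the redefinition.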

Graf Zahl said:

However, since I don't want GZDoom to be compromised I added a new key 'ignore' to the FS level info parser. Put 'ignore = 1' in the first line of your level info and all upcoming GZDoom versions won't read its contents.

Thank you, much appreciated. Besides, you forget that ReMooD also uses level info, as does Eternity actually, though TBH I hope Quasar will ditch that since he already ditched FS anyway.

Juiche said:

I hate GZDooM graphics. Best is skulltag

Knowing where you come from, I think it's safe to say that we should ignore this post.

I like GZDoom because it's up to date while providing OpenGL features; however, with recent versions I've noticed some agonizing lag in parts that have never lagged before. I find it strange, but I put up with it.

Juiche said:

I hate GZDooM graphics. Best is skulltag

Very amusing, considering that Skulltag's OpenGL renderer is taken from GZDoom... The only difference in looks is that Skulltag autoloads the brightmaps and dynamic light definitions, by embedding them in its main pk3, while they're optional in GZDoom.

Breadrobber said:

I just use zDoom because it doesn't have all those annoying lighting effects =6.

What do you mean by that? It can't be the dynamic lights, because it's an OPTION that is OPTIONAL and OFF BY DEFAULT. To see them, you have to 1) enable dynamic lights and 2) load the dynlight definitions, which isn't done by default.

So since it's not that... What is it?

Breadrobber said:

What does the "G" in "GZDoom" stand for anyways?

I assumed it meant GL ZDoom, but then I thought a little harder and decided it probably meant Graf Zahl Doom.

Csonicgo said:

And there soon won't be any shaders, unless you buy the latest and greatest Grafixxx cards! :(



If you insist on staying behind by 3 or more generations you really don't deserve better. Shall I limit the feature set just for the few stubborn individuals who refuse to upgrade?

Or said differently, the current code base has become close to unmaintainable due to the vastly different hardware it has to support. The result is various bugs in the shader code that are hard to find and even harder to fix. Sorry, pal, but starting with a clean slate is the only option I see to move on. 'Clean slate' meaning in this case to concentrate on the hardware which can do what I need to develop the rendering code without fallback cases for older cards that only have limited shader support. I don't want to mess up the code again.

Graf Zahl said:

If you insist on staying behind by 3 or more generations you really don't deserve better. Shall I limit the feature set just for the few stubborn individuals who refuse to upgrade?

Or said differently, the current code base has become close to unmaintainable due to the vastly different hardware it has to support. The result is various bugs in the shader code that are hard to find and even harder to fix. Sorry, pal, but starting with a clean slate is the only option I see to move on. 'Clean slate' meaning in this case to concentrate on the hardware which can do what I need to develop the rendering code without fallback cases for older cards that only have limited shader support. I don't want to mess up the code again.

The amount of money to upgrade a video card to something with decent shader support is trivial. Any video card that you would want to be using to play current gen games is more than enough I would imagine. If somebody's hardware is too much older than that, then I don't imagine they really care about newer games or fancy hardware accelerated effects in doom anyway.

By the way, Graf Zahl, for this major overhaul you are planning, what kind of requirements are needed? Shader model 2.0 or greater or something along those lines?


Something that was posted on the drdteam forums, but I'm quoting it because with the move they might be down for some time:

That's where the old renderer comes in. It will continue to work on older cards.

But I absolutely see no point in investing time in something that has to compromise all the way just to work on old hardware. The new renderer is meant to better exploit the capabilities of modern hardware. It uses shaders for everything and completely circumvents the old fixed function pipeline which is the root of all the problems that have crept into the rendering code. The new renderer uses a much leaner system interface because it doesn't have to bother with all the cruft that is needed to render both with shaders and with the hard coded functionality of old cards. This alone should make it faster.

On the other hand, if I had to code everything with compatibility fallbacks in mind it'd go nowhere. I'd rather keep the old code around, stripped of all shader support, so that the end result will essentially be 2 rendering paths: the current one for old hardware and the new one for new hardware.

So the system requirements are:

- OpenGL 2.1
- full GLSL support
- full vertex buffer and texture buffer object support.

If I didn't set these minimum requirements, the entire rewrite would be an exercise in pointlessness.

That means Geforce 8xxx series or better and modern ATI cards only.
Everything else will fall back to the existing rendering code - but will also obviously miss out on future enhancements.
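A runtime gate for those requirements could be sketched like this. In real code the two strings would come from glGetString(GL_VERSION) and glGetString(GL_EXTENSIONS); the ARB extension names are my assumption for the buffer-object features mentioned:

```c
#include <stdio.h>
#include <string.h>

/* True if the GL version string ("major.minor[.release] vendor...")
 * reports at least major.minor. */
int gl_version_at_least(const char *version, int major, int minor) {
    int maj = 0, min = 0;
    if (sscanf(version, "%d.%d", &maj, &min) != 2)
        return 0;
    return maj > major || (maj == major && min >= minor);
}

/* True if name appears as a whole token in the space-separated
 * extension list; substring hits must not count. */
int has_extension(const char *extensions, const char *name) {
    size_t n = strlen(name);
    const char *s = strstr(extensions, name);
    while (s) {
        if ((s == extensions || s[-1] == ' ') &&
            (s[n] == ' ' || s[n] == '\0'))
            return 1;
        s = strstr(s + 1, name);
    }
    return 0;
}

/* The gate implied by the post: GL 2.1 plus vertex buffer and
 * texture buffer object support. */
int new_renderer_usable(const char *version, const char *extensions) {
    return gl_version_at_least(version, 2, 1)
        && has_extension(extensions, "GL_ARB_vertex_buffer_object")
        && has_extension(extensions, "GL_ARB_texture_buffer_object");
}
```

Cards failing the check would simply take the old rendering path, matching the fallback behavior described above.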

Mike.Reiner said:

By the way, Graf Zahl, for this major overhaul you are planning, what kind of requirements are needed? Shader model 2.0 or greater or something along those lines?



I can't really define it in shader models, but SM2 most certainly won't be anywhere close. The cards I am aiming at all have at least SM4.

With NVidia it will definitely be Geforce 8xxx and up. The older cards have such major deficiencies in pixel shader performance that the shaders I have written would perform very badly on them. I could only work around that by maintaining a mess of micro-shaders, each optimized for one special case, but that's precisely the one thing I don't want to do, as it is what makes the shader support in the current renderer so problematic.
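To illustrate the kind of construct at issue (hypothetical GLSL, not GZDoom's actual shader code): on pre-GF8 hardware a per-fragment branch like the first form below can be far slower than the flattened arithmetic of the second, and hand-tuning a flattened variant per feature combination is exactly the special-casing treadmill described above.

```
// Branchy form: cheap on SM4-class cards, painful on GF6/GF7.
if (lightfactor > 0.0)
    color.rgb *= lightcolor;

// Flattened form: no branch, but every feature variant needs hand-tuning.
color.rgb = mix(color.rgb, color.rgb * lightcolor, step(0.001, lightfactor));
```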

For ATI I can't say for sure what the low end will be. I'm certain that the Radeon 2000 series will work fine though. But it will require recent drivers for some features on ATI because they only implemented one extension I plan to use (texture buffer objects) in one of the most recent Catalyst versions.


The amount of money to upgrade a video card to something with decent shader support is trivial. Any video card that you would want to be using to play current gen games is more than enough I would imagine. If somebody's hardware is too much older than that, then I don't imagine they really care about newer games or fancy hardware accelerated effects in doom anyway.


I can't think of many modern games that can't run at 60fps on a 7800GT, and that card was released, what, four years ago?

Upgrading still made sense about one or two years ago if you specifically wanted to run the latest games on the highest graphics settings and/or at a super high resolution; with the current cross-platform gaming environment, it's pointless to upgrade, as the vast majority of games are developed for consoles released 3-4 years ago.

If anything, the trend is to go for cheaper CPUs/GPUs with better energy efficiency.

Don't get me wrong, I'm all for what Graf Zahl is planning to do there, and I have 2 rigs with GeForce 8XXX so I'll get to see the improvements too. His port, his choices.

It's just that the "if you're more than 3 generations behind, you're obviously on outdated hardware and need to upgrade" line of reasoning is completely stupid when that "outdated" hardware can run 99% of the games on the market flawlessly, and is also much better than the current console generation.

Again, the plan is fine, but it's just stupid to wrap it up with bullshit excuses. It's not like there's a need to justify these choices.

Phml said:

Again, the plan is fine, but it's just stupid to wrap it up with bullshit excuses. It's not like there's a need to justify these choices.



The 'bullshit excuse' is that the cards can't do what I want to do. It's either limiting the new renderer to newer cards or not doing it at all - because then I'd end up with the same mess I already have - quite pointless if you ask me.

FYI, the shader support in the current renderer has serious issues in certain situations - and I don't know how to fix them. The code has to deal with so many special cases that it's impossible to see where things go wrong.


Well, I for one can't wait to not be able to play any new GZDoom mods on my 2007 laptop that runs Doom 3, Half-Life 2, and Portal flawlessly. Bring it on.


Those newer games were built with new graphics cards in mind and, if I read what was quoted above properly, you could play the new mods anyway, just not with all the eye-candy.

DOOM might be old but it doesn't always transfer too efficiently to newer tech features, due to a structure that isn't optimized for them. That's why people like Carmack write their new engines pretty much from scratch. Various modern games also have smoother online connectivity than online Doom source ports. Similarly, DOS games have pretty high system requirements these days because something like DOSBox is required to run them.

DOOM might be old, but that doesn't mean community developers need to refrain from playing with state-of-the-art tech features to enhance it in ways that would not be possible or really effective otherwise.

esselfortium said:

Well, I for one can't wait to not be able to play any new GZDoom mods on my 2007 laptop that runs Doom 3, Half-Life 2, and Portal flawlessly. Bring it on.



Sigh...

Why do people always assume the worst? The old renderer will continue to be there - albeit stripped of the shader code which I can't fix anyway. So yes, older systems will miss out on some enhanced features - but your system wouldn't be able to handle them anyway so in the end you won't miss much.

Renderer rewrite or not - I will not invest any time in shaders optimized for old hardware anymore. The code just gets too messy too quickly.


I'm aware you're planning to keep the old renderer, but when its capabilities are adopted by new mods I won't be able to play them, hence my post. I'm sorry, but it just makes no sense at all that I won't be able to play new GZDoom mods on a computer that can run more powerful 3D engines with no problems whatsoever.

Graf Zahl said:

For ATI I can't say for sure what the low end will be. I'm certain that the Radeon 2000 series will work fine though. But it will require recent drivers for some features on ATI because they only implemented one extension I plan to use (texture buffer objects) in one of the most recent Catalyst versions.


Oh man. Those are absolutely pathetic requirements you're setting there. I'm already barely able to play GZDoom on my Radeon X1300, and I can't imagine what it'll be like when you remake the renderer. You need to optimize the engine, seriously. I'm able to play both Doom 3 and Far Cry at 1024x768 with everything on high and get between 20 and 40 frames per second. I'm not buying another video card just so that I'd be able to play one program.

esselfortium said:

I'm aware you're planning to keep the old renderer, but when its capabilities are adopted by new mods I won't be able to play them, hence my post. I'm sorry, but it just makes no sense at all that I won't be able to play new GZDoom mods on a computer that can run more powerful 3D engines with no problems whatsoever.



So what? Even if I added the new features to the current renderer you wouldn't be able to use them. The shaders still wouldn't run fast enough! The problem is not that I'm using features that don't exist on your hardware. It's that the older cards (especially NVidia up to Geforce 7xxx) have major performance issues with some shader constructs using conditional expressions in pixel shaders. I was able to test this on a GF6800, and the way I have to mutilate the shader code to make it work semi-decently on it just makes any shader-based feature unattractive to implement.

Whoo said:

Oh man. Those are absolutely pathetic requirements you're setting there. I'm already barely able to play GZDoom on my Radeon X1300, and I can't imagine what it'll be like when you remake the renderer.


If you can't play decently on that hardware, I'd say you have a serious problem somewhere. It should be more than enough to handle GZDoom - unless, of course, you are switching on all the features meant for more modern cards. The renderer in its basic form was developed on a GF3 Ti and still runs fine on it (albeit a bit slowly compared to what my GF8600 can do).


You need to optimize the engine, seriously.


You need a reality check, seriously! Have you even read Myk's post above about Doom and hardware rendering? Doom will never *ever* be able to get the same performance out of modern hardware as newer games. Modern games optimize their data so that it can be handled with relatively small overhead before being passed to the hardware. Doom can't do that: a lot of stuff has to be recomputed every single frame, and there's very little that can be done about it.


I'm able to play both Doom 3 and Far Cry at 1024x768, with everything on high and get between 20 and 40 frames per second. I'm not buying another video card just so that I'd be able to play one program.


Honestly, not my problem.

Graf Zahl said:

Doom will never *ever* be able to get the same performance out of modern hardware as newer games.


You might not be able to get it as optimized, but you certainly could do a hell of a lot better. Take a look:

Column A is GZDoom, column B is GLBoom


What is this? Looks like random numbers to me.

If you want to make comparisons the least you can do is specify the exact settings you used. And even then, comparing 2 engines with different design goals is like comparing apples to oranges. GZDoom inevitably needs more overhead for processing due to ZDoom features.

