Impie

gzdoom lock-ups


I'm really tired of GZDoom freezing when I play mods. I don't know what's causing it, but I even tried the newest GZDoom (1.4.8) and the problems persist in the same areas. Maybe some of you have had the same issues with these mods, and maybe it's the mods themselves. Or maybe it's a Windows 7 thing. I dunno.

So far GZDOOM freezes and has to be closed with ctrl-alt-del when:

- I use Doom Vercetti (seems unpredictable, but didn't use to freeze; only started doing it recently)
- I open the first door of map 2 in Unloved
- I open the little office door in City of the Damned 2 after dropping through the hole in the second floor room (the one that's accessible from the first hell hound area).

If anyone has experienced this and found a fix for it I'd be very grateful. I'm tired of only being able to play awesome mods for about eight seconds.


I can't remember how to find out. All I can find under graphics properties is "intel graphics media accelerator driver" and I dunno if that's it or something else. And googling for answers is stupidly unhelpful.

I hope I don't have to look in the Control Panel, cos that stupid thing always freezes the Windows browser. Apparently it does that to everyone else, too.

Impie said:

"intel graphics media accelerator driver"



Even though Intel's graphics drivers are infamous for their rather shitty OpenGL support, constant lockups sound more like there's something wrong with your hardware.

Graf Zahl said:

Even though Intel's graphics drivers are infamous for their rather shitty OpenGL support, constant lockups sound more like there's something wrong with your hardware.


Even if the lock-ups are consistent only with certain areas of certain mods?


GZDOOM froze on me again. Any suggestions for the best graphics card to use GZDOOM with? If I ever get the money for it I'll be replacing the piece of crap my machine came with.


For GZDoom, the safest bet is something from Nvidia. Intel chipsets are crap and should be avoided; and while ATI chipsets are fine, their OpenGL drivers often have severe problems.


And unless you have an interest in modern games, a GTX 400-series card won't give any advantage over a GTX 260 or 280. With cards that fast, the engine is limited by the CPU only.


Even though Intel's graphics drivers are infamous for their rather shitty OpenGL support, constant lockups sound more like there's something wrong with your hardware.


Oh please. Are you still telling people this crap? The fact that GZDoom is unable to run with so many graphics cards has absolutely nothing to do with hardware errors; it is a result of your poor programming skills. My older computer, which had an ATI chipset, was able to run every single OpenGL game from Quake 2 to UT2K4 without any issue, yet every time I tried to run GZDoom I'd get kernel panics. And furthermore, if the hardware were the problem, then the OP would have this same issue with a lot of applications.

Even if the lock-ups are consistent only with certain areas of certain mods?


Yet more evidence that the problem is an error in the code and not an issue with the hardware ...

Madgunner said:

Oh please. Are you still telling people this crap? The fact that GZDoom is unable to run with so many graphics cards has absolutely nothing to do with hardware errors; it is a result of your poor programming skills. My older computer, which had an ATI chipset, was able to run every single OpenGL game from Quake 2 to UT2K4 without any issue, yet every time I tried to run GZDoom I'd get kernel panics. And furthermore, if the hardware were the problem, then the OP would have this same issue with a lot of applications.



Yet more evidence that the problem is an error in the code and not an issue with the hardware ...


My ATI card works fine with GZDoom. Did you get error logs or file bug reports for the errors?

And I have seen problems with Intel cards; they just aren't optimized for much, especially not demanding games or GZDoom mapsets.


Yes, I did file many errors; however, they were all dismissed by Graf, and he continually insisted that I was doing something wrong. (He did the exact same thing with the bug reports that I filed, only with a more condescending attitude.) But yeah, this happened many years ago, and I eventually quit trying to help improve GZDoom as I got tired of the aforementioned condescending attitude.

And yes, you are right about Intel drivers sucking and being poorly optimized; however, this would not result in the program locking up at specific places in mods every single time. The fact of the matter is that GZDoom is notorious for kernel panics and other errors with a wide variety of graphics cards, and instead of trying to investigate and fix the problem, Graf simply dismisses the issues and blames them on the graphics card or the drivers, as exemplified in this very thread.

phi108 said:

Did you get error logs or file bug reports for the errors?


I did not, in fact. It freezes and I have to boot out of the program with Ctrl-Alt-Del, so I get no error reports that I'm aware of. I'd like to think it has to do with the mod coding, but one of the mods that freezes is Unloved, and if that were the mod's fault, EVERYONE who played Unloved would be reporting it. So there's something up with my card, or with GZDoom (again, I think more people would report this issue if that was the case), or both somehow.


I did not, in fact. It freezes and I have to boot out of the program with Ctrl-Alt-Del, so I get no error reports that I'm aware of. I'd like to think it has to do with the mod coding, but one of the mods that freezes is Unloved, and if that were the mod's fault, EVERYONE who played Unloved would be reporting it. So there's something up with my card, or with GZDoom (again, I think more people would report this issue if that was the case), or both somehow.


I can say with certainty that it isn't the mod. If the mod had programming errors, it would either produce weird behavior or simply crash GZDoom; it wouldn't cause the program to lock up. The most likely culprit is buggy programming in GZDoom, as it has issues with a wide variety of graphics cards. You can try updating your driver, but I highly doubt it'll make a difference, as it's highly unlikely to be the problem unless you get freezes and crashes with other OpenGL applications.


My GF6200 is older than GZDoom itself and I have no problems whatsoever besides poor performance, and that's because this particular card has less-than-optimal shader support and unfortunately GZDoom does everything with shaders.

Madgunner said:

I can say with certainty that it isn't the mod. If the mod had programming errors, it would either produce weird behavior or simply crash GZDoom; it wouldn't cause the program to lock up.

Not entirely true; there's such a thing as a runaway loop - if something gets stuck in a loop with no delay whatsoever between iterations, the entire program grinds to a halt. So it could be the mod's fault in certain specific circumstances.
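For what it's worth, ZDoom-family ports do defend against exactly this: a script that executes too many instructions without ever yielding gets killed as a "runaway script" instead of hanging the engine. A minimal sketch of the idea in C++ (all names and the limit value here are hypothetical, not GZDoom's actual code):

```cpp
#include <cstdint>
#include <stdexcept>

// Cap on opcodes a script may execute in one tic before it is treated
// as a runaway loop and terminated (illustrative value, not GZDoom's).
constexpr uint32_t kRunawayLimit = 500000;

struct Script {
    uint32_t counter = 0;     // stand-in for the script's remaining work
    bool     yielded = false; // set once the script hits a delay opcode
};

// Executes one tic's worth of a script; returns how many opcodes ran.
// A well-behaved script yields; a runaway one trips the limit instead
// of freezing the whole program.
uint32_t RunScriptTic(Script& s) {
    uint32_t executed = 0;
    while (!s.yielded) {
        // ... interpret one opcode here; this toy version just counts down ...
        if (s.counter > 0) --s.counter;
        else s.yielded = true; // reached a delay instruction
        if (++executed > kRunawayLimit)
            throw std::runtime_error("runaway script terminated");
    }
    return executed;
}
```

So an endless script loop typically gets the script killed (with a console message) rather than freezing the game, which is why a genuine hard lock usually points lower in the stack.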

Porsche Monty said:

and unfortunately GZDoom does everything with shaders.



Not on that card, it doesn't. Shaders-for-everything is only enabled on cards supporting Shader Model 4, which in NVidia terms translates to GF8xxx and newer.

The card is slow because it is slow. I still have an old computer with a GF 6800, and it also has major performance issues with larger maps.

And the 6200 is certainly slower than that one.

Madgunner said:

Oh please. Are you still telling people this crap? The fact that GZDoom is unable to run with so many graphics cards has absolutely nothing to do with hardware errors; it is a result of your poor programming skills.



I'll be so frank as to call you a clueless idiot, then, because it's quite obvious that you have no idea what's going on.

Madgunner said:

My older computer, which had an ATI chipset, was able to run every single OpenGL game from Quake 2 to UT2K4 without any issue, yet every time I tried to run GZDoom I'd get kernel panics.



I can't say much about UT2K4 (is it really OpenGL or only D3D?) but Quake 2 is certainly far less demanding of the graphics driver than GZDoom. (And no, GPU load is not equivalent to use of graphics features!)

Madgunner said:

And furthermore, if the hardware were the problem, then the OP would have this same issue with a lot of applications.



Not necessarily. Hardware lockups are mostly caused by putting too much load on the hardware, be it CPU or GPU.

But why do I even try to tell this to a smartass like you? You already seem to have made up your mind anyway.

Madgunner said:

Yet more evidence that the problem is an error in the code and not an issue with the hardware ...



Yet more evidence that you should be ignored.

Bottom line: If a significant percentage of users had such problems I'd get far, far more such reports than I actually get.
But since there's only a handful a year, I'd rather speculate that something in these systems is not working properly, especially if you get, as you put it, 'kernel panics'. They are always symptoms of something more serious. A regular application doesn't even have the privilege to cause such issues unless it uses a driver that does. Drivers are the only code going low enough in the system to do it.

So before throwing insults at other programmers, better clean up in front of your own door first, pal! GZDoom uses a lot of components of your hardware. Instead of making blanket statements about bad programming, you'd better try to narrow down what's causing your problems. Is it the graphics hardware, maybe even the sound chip, or what? There are enough things in GZDoom you can change to lower the hardware load and see if any of them has an effect.


Not entirely true; there's such a thing as a runaway loop - if something gets stuck in a loop with no delay whatsoever between iterations, the entire program grinds to a halt. So it could be the mod's fault in certain specific circumstances.


Ah yes, I stand corrected. Completely forgot about that :). However, according to the OP this isn't the problem, as the mods in question work for other people without issue.

I can't say much about UT2K4 (is it really OpenGL or only D3D?) but Quake 2 is certainly far less demanding of the graphics driver than GZDoom. (And no, GPU load is not equivalent to use of graphics features!)


Graf, using your own logic: if UT2K4 is far less demanding on the graphics driver than GZDoom, then what does that say about your port's optimization?

Graf Zahl said:

Yet more evidence that you should be ignored.

Bottom line: If a significant percentage of users had such problems I'd get far, far more such reports than I actually get.


Gee, I wonder why? I quit filing bug reports for GZDoom many years ago after having them all dismissed and having to deal with your condescending attitude, and I'm sure I'm not the only one to do this.

But since there's only a handful a year, I'd rather speculate that something in these systems is not working properly, especially if you get, as you put it, 'kernel panics'. They are always symptoms of something more serious. A regular application doesn't even have the privilege to cause such issues unless it uses a driver that does. Drivers are the only code going low enough in the system to do it.


OK Graf, continue to deny everything and blame it on the drivers; this is what you almost always do whenever someone points out something wrong with GZDoom. The fact of the matter is that GZDoom has issues running with a wide variety of graphics cards. In fact, not too long ago it didn't run with virtually any ATI cards at all. This should make it clear that there is something wrong with GZDoom and not with the graphics card itself. By the way, as a little hint for you: perhaps you should release dynamically allocated arrays correctly by using delete[] rather than delete.
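For anyone following along, the delete[] hint refers to a real C++ rule: memory allocated with new[] must be released with delete[]; calling plain delete on it is undefined behavior, which can silently corrupt the heap and surface much later as crashes or lockups. A minimal illustration (hypothetical function, not GZDoom code):

```cpp
#include <cstddef>

// new[] must be paired with delete[]; writing `delete values;` here
// would be undefined behavior -- it might appear to work on one
// compiler and corrupt the heap on another.
int SumFirstN(std::size_t n) {
    int* values = new int[n];                 // array form of new
    for (std::size_t i = 0; i < n; ++i)
        values[i] = static_cast<int>(i) + 1;  // 1, 2, ..., n
    int sum = 0;
    for (std::size_t i = 0; i < n; ++i)
        sum += values[i];
    delete[] values;                          // array form of delete
    return sum;
}
```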

Madgunner said:

OK Graf, continue to deny everything and blame it on the drivers; this is what you almost always do whenever someone points out something wrong with GZDoom. The fact of the matter is that GZDoom has issues running with a wide variety of graphics cards.



Do you have any proof? If so, give it to me. Strangely enough, people like you seem to generalize from a handful of problem reports. If things were as bad as you make them out to be, this and other forums would be flooded with problem reports. Where are they?

Madgunner said:

In fact, not too long ago it didn't run with virtually any ATI cards at all. This should make it clear that there is something wrong with GZDoom and not with the graphics card itself.


Or maybe it's just proof that there was a bug in ATI's drivers that broke code with absolutely no relation to graphics and required some absolutely nonsensical workaround to get fixed?

Madgunner said:

By the way as a little hint for you, perhaps you should release dynamically allocated arrays correctly by using delete[] rather than delete.


I somehow remember that statement. Are you by any coincidence related to that jerk VortexCortex over at Skulltag? He had the same offensive attitude with the exact same 'issues'.


If it's worth saying, my old (~2003) computer never had any trouble running GZDoom, even considering its ancientness and ATI card.


Unless you run Win98 or older, an application running in user space like GZDoom simply cannot provoke a system crash, BSOD or kernel panic. It is simply impossible. The worst it can do is either become unresponsive (and you can kill it by using Ctrl-Alt-Del to summon the Task Manager), crash to desktop abruptly (with or without a "this application ceased functioning" dialog box from the OS that wonders what happened), or, in the best case, crash cleanly (with the full-fledged crash report window).

The basic reason is the separation between user space and kernel space. If code or data becomes aberrant in user space, the OS can just terminate the concerned application. But if the error happens in kernel space, then there is no guarantee that it can recover safely, so the system panics to avoid corruption.

If the system crashes, it cannot be directly because of problems in an application. The crash will be induced by faults in the hardware, the kernel, or the driver. This may be triggered only by a specific application, but then it'll just mean that this application is the only one that uses the glitched component (hardware, kernel or driver).


Unless you run Win98 or older, an application running in user space like GZDoom simply cannot provoke a system crash, BSOD or kernel panic. It is simply impossible. The worst it can do is either become unresponsive (and you can kill it by using Ctrl-Alt-Del to summon the Task Manager), crash to desktop abruptly (with or without a "this application ceased functioning" dialog box from the OS that wonders what happened), or, in the best case, crash cleanly (with the full-fledged crash report window).


You are correct, and I should have been a little clearer. The application itself cannot cause a BSOD; however, if an application uses OpenGL or DirectX incorrectly, it can cause the drivers to crash, resulting in a BSOD. Some drivers may provide protection against this (most likely the case with most Nvidia drivers), but not all of them do.

I somehow remember that statement. Are you by any coincidence related to that jerk VortexCortex over at Skulltag? He had the same offensive attitude with the exact same 'issues'.


Looks like I'm not the only person who recognizes issues with your code. Too bad you let your ego get the best of you and almost never listen to anyone's suggestions.

Do you have any proof? If so, give it to me. Strangely enough, people like you seem to generalize from a handful of problem reports. If things were as bad as you make them out to be, this and other forums would be flooded with problem reports. Where are they?



You're right, I don't have concrete proof. I'd have to go through your entire codebase and run numerous tests to get it, which I do not feel like doing due to numerous prior experiences I've had with you. However, there are many likely culprits. For example, I've seen a couple of places where you call certain OpenGL extensions without first checking whether the graphics card supports them. This will definitely cause issues and could be the reason for the BSODs on certain graphics cards.
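The guard being described is cheap to add. Under GL 2.x the driver reports its extensions as one space-separated string from glGetString(GL_EXTENSIONS), and you search it for the exact token before resolving or calling any of the extension's entry points. A sketch of that lookup (shown taking the string as a parameter, since the actual query needs a live GL context; a bare strstr would be wrong because it also matches substrings):

```cpp
#include <cstring>

// Exact-token search in a space-separated GL extension string.
// Plain strstr is not enough: a name can occur as a prefix of a
// longer extension name, so check the token boundaries too.
bool HasExtension(const char* extensions, const char* name) {
    if (!extensions || !name || !*name) return false;
    const std::size_t len = std::strlen(name);
    const char* p = extensions;
    while ((p = std::strstr(p, name)) != nullptr) {
        const bool startsToken = (p == extensions) || (p[-1] == ' ');
        const bool endsToken   = (p[len] == ' ') || (p[len] == '\0');
        if (startsToken && endsToken) return true;
        p += len; // skip past this partial match and keep looking
    }
    return false;
}

// Typical use (requires a current GL context):
//   const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
//   if (HasExtension(ext, "GL_ARB_vertex_buffer_object")) { /* safe to call */ }
```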

By the way, OP, I suggest trying to run these mods in Vavoom. I haven't ever used it before, but it claims to be 100% compatible with GZDoom maps, and perhaps it's more optimized than GZDoom.

Edit: Here's another hint for you. Quit using OpenGL's ancient immediate mode; it's no longer 1996, get with the times. That mode was superseded a long time ago by vertex arrays (1997, I believe), which have since been replaced by vertex buffer objects. Vertex buffer objects are WAY faster than immediate mode, and only coding noobs still use OpenGL's ancient immediate mode for their applications in the 21st century.


What's funny about this whole thread is that it has turned into a bitter flamewar, while all this time the OP hasn't specified:

  • Whether he has had similar problems with other OpenGL or Direct3D games (or whether he plays anything else at all), or if everything else runs flawlessly.
  • Whether he can run something equally demanding in terms of shaders, effects, etc. Being able to run 3DMark05 or even '03 would be a good indication.
  • Whether he has run a memory test. Intel GMA usually means we're talking about a laptop or a cheap mobo, and the video memory is usually shared with main RAM, with the actual amount available being another sore point.
  • What version of Intel GMA hardware and drivers he's using. For all we know, it could be a lowly 915 or the latest GMA HD, and those have wildly variable OpenGL support (from almost none to "3.0" on paper).
  • How much video memory is available to GZDoom. Intel GMA tends to have this crappy "dynamic memory management system" which allocates video RAM dynamically but often fails miserably, and on most BIOSes you can't fix it to a specific amount. Some games won't start at all, and some may crash mid-game, or after a specific action or effect is triggered, as a result of insufficient memory.

For all its crappiness, at least Intel GMA has one thing going for it: it has so few pipelines and so little hardware that it's about as powerful as an overclocked S3 from 1996, and most stuff is done "nice and slow" in software, including most OpenGL rendering (at least in older GMA chipsets). It would indicate a serious hw/sw fuckup if THAT managed to crash due to hardware stress/overheating.

So if you get crashes, it might just be a particular hardware function or memory location being activated.


Madgunner at least does his name justice; he appears to be quite mad and incoherent, blabbering a lot of nonsense - and I have no more desire to counter this unfounded shit.

I still see no proof for all these accusations, just some unbalanced hothead who apparently can't stomach that his own computer is failing him.

So:

If you have concrete issues to report (both hardware lockups and errors in the source), there's a forum for that. (This goes especially for coding errors you noticed; I see no point in hunting them down again when someone else already has.) BTW, you are in good company with your attitude. The few other people who spilled similar accusations followed the same pattern: rude insults in some discussion thread, but not the slightest hint of willingness to report known issues.


If you have concrete issues to report (both hardware lockups and errors in the source), there's a forum for that. (This goes especially for coding errors you noticed; I see no point in hunting them down again when someone else already has.) BTW, you are in good company with your attitude. The few other people who spilled similar accusations followed the same pattern: rude insults in some discussion thread, but not the slightest hint of willingness to report known issues.


Actually, no, I have no desire to help you any more. I reported many bugs and errors years ago, and you continuously ignored them, insisted I was doing something wrong, or just plain snapped at me, and as you've demonstrated in this thread, you haven't changed at all. I'm more than willing to help people out; however, I refuse to help someone who has such a terrible attitude. Looking back at all of this, I shouldn't have even given you the suggestions I posted on this forum.

Anyway, I'm not going to post in this thread again. I'll just let the "OpenGL programming god" Graf handle the issue with the original poster.

I somehow remember that statement. Are you by any coincidence related to that jerk VortexCortex over at Skulltag? He had the same offensive attitude with the exact same 'issues'.


Yet more proof that you don't listen to people's advice and suggestions. He was even kind enough to give you examples yet you still did nothing and raged.

Madgunner said:

You are correct, and I should have been a little clearer. The application itself cannot cause a BSOD; however, if an application uses OpenGL or DirectX incorrectly, it can cause the drivers to crash, resulting in a BSOD. Some drivers may provide protection against this (most likely the case with most Nvidia drivers), but not all of them do.

Personally, I'd consider it a fault of the drivers if they can cause a crash for any reason. They should perform sanity checks on all potentially dangerous operations they perform, because, well, they run in kernel space.

Madgunner said:

By the way, OP I suggest trying to run these mods in Vavoom. I haven't ever used it before, but it claims to be 100% compatible with GZDoom maps and perhaps it's more optimized than GZDoom.

My own experience with Vavoom was that it was way more unstable than GZDoom. As for its level of compatibility, it is steadily decreasing, given the relentless update rate of ZDoom and GZDoom. It should work with relatively old maps, though.

Madgunner said:

Edit: Here's another hint for you. Quit using OpenGL's ancient immediate mode; it's no longer 1996, get with the times. That mode was superseded a long time ago by vertex arrays (1997, I believe), which have since been replaced by vertex buffer objects. Vertex buffer objects are WAY faster than immediate mode, and only coding noobs still use OpenGL's ancient immediate mode for their applications in the 21st century.

GZDoom basically uses the same principles as GLBoom+ for its OpenGL architecture (relying mostly on immediate mode). GLBoom+ is the fastest OpenGL Doom port available, and GZDoom the second fastest.

And GZDoom does use VBOs for flats (though it's optional). The performance improvement isn't really noticeable, though. There was a renderer rewrite that was going to use VBOs for everything, and the disappointing results it got meant that it was scrapped and immediate mode retained. You can read some old stuff about it here. The big problem with using VBOs for Doom is that Doom isn't a modern game, and VBOs are tailored to the needs of modern games. The low complexity of most Doom levels greatly reduces the gains from using VBOs instead of immediate mode, and the way Doom manages its level geometry adds a lot of overhead to VBOs when you need to update vertices because a sector or a polyobject is moving. It is no longer 1996, sure, but Doom was developed in 1993, after all.

Graf Zahl said:

Not on that card, it doesn't. Shaders-for-everything is only enabled on cards supporting Shader Model 4, which in NVidia terms translates to GF8xxx and newer.

The card is slow because it is slow. I still have an old computer with a GF 6800, and it also has major performance issues with larger maps.


That's not the whole story. I'm not talking about performance issues with large maps (which I seldom experience under my rather conservative config); I'm talking about special effects in regular vanilla-sized maps causing slowdowns.

Brightmaps, glowing flats and other shader-only stuff do kill the FPS big time, and yes, I can enable these on my GF6. Anyways, I normally run GZDoom with all the fanciness turned off so it's not really a problem.

Madgunner said:

Actually, no, I have no desire to help you any more. I reported many bugs and errors years ago, and you continuously ignored them, insisted I was doing something wrong, or just plain snapped at me, and as you've demonstrated in this thread, you haven't changed at all. I'm more than willing to help people out; however, I refuse to help someone who has such a terrible attitude. Looking back at all of this, I shouldn't have even given you the suggestions I posted on this forum.



I repeat myself: proof! Since your username cannot be found anywhere on DRDTeam, I have no way to verify that statement and must assume that you are a liar unless you can point me to something I supposedly ignored.


Porsche Monty said:

Brightmaps, glowing flats and other shader-only stuff do kill the FPS big time, and yes, I can enable these on my GF6. Anyways, I normally run GZDoom with all the fanciness turned off so it's not really a problem.



Not really surprising. But that's why these features are off by default and optional on such old cards. Their shaders are just not fast enough.

Gez said:

The big problem with using VBOs for Doom is that Doom isn't a modern game, and VBOs are tailored to the needs of modern games. The low complexity of most Doom levels greatly reduces the gains from using VBOs instead of immediate mode, and the way Doom manages its level geometry adds a lot of overhead to VBOs when you need to update vertices because a sector or a polyobject is moving. It is no longer 1996, sure, but Doom was developed in 1993, after all.


To elaborate on this: VBOs are good if you have to make draw calls with lots and lots of vertices in them. Doom isn't like that. The average wall is 4-10 vertices, the average subsector 3-10. Sure, subsectors belonging to the same sector can be combined, but the main problem remains:

The vertex count per primitive is too low to make up for the overhead of using VBOs. The end result of the renderer rewrite was that when I put all necessary attributes into the VBO, the whole thing was 10-20% slower than immediate mode, and pulling the static attributes out of the VBO and setting them as uniforms provided approximately the same speed as immediate mode. It became quite obvious that the bottleneck is not feeding the geometry data to GL via immediate mode, but the internal processing of it.

And that was for flats only without any checks for sector movement taken into account!

Wall rendering with VBOs would be even messier and incur a lot more overhead for even smaller gains, due to the way walls need to be set up, so I didn't even start on that.

The current VBO code provides a very mild speedup on NVidia and a noticeable slowdown on ATI, btw. Apparently ATI can't handle mixing VBOs and immediate mode well. Pity.

Now, where VBOs shine is when you want to render a large number of primitives with a single draw call; nothing beats them there. That's why voxel models use VBOs. Here they are about 5-10x faster than immediate mode, but this is also a perfect case for using them, because the average cubified voxel model will have hundreds of primitives and vertices.
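The vertex-count arithmetic above can be made concrete. A Doom wall is just a textured quad: four vertices, each carrying a position and a texture coordinate, 20 floats in total; that's the entire payload a per-wall draw call would upload. A hypothetical sketch of building that interleaved array (layout and names are illustrative, not GZDoom's actual renderer):

```cpp
#include <vector>

// One wall quad: 4 vertices * (x, y, z, u, v) = 20 floats. This tiny
// payload is the point made above -- per draw call there is so little
// data that VBO management overhead can outweigh the transfer savings.
struct WallVertex { float x, y, z, u, v; };

std::vector<float> BuildWallQuad(const WallVertex (&corners)[4]) {
    std::vector<float> buf;
    buf.reserve(4 * 5);
    for (const WallVertex& c : corners) {
        buf.push_back(c.x); buf.push_back(c.y); buf.push_back(c.z);
        buf.push_back(c.u); buf.push_back(c.v);
    }
    // VBO path: upload once, then draw from GPU memory:
    //   glBufferData(GL_ARRAY_BUFFER, buf.size() * sizeof(float),
    //                buf.data(), GL_STATIC_DRAW);
    // Immediate mode instead streams the same 20 floats each frame via
    // glTexCoord2f/glVertex3f -- trivial for the driver at this size.
    return buf;
}
```

A voxel model, by contrast, would fill one such buffer with hundreds of quads and draw them all in a single call, which is where the VBO overhead amortizes away.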


How about transforming texels into VBOs? That should crank the texel count up ;-)

Or at least force-splitting whole textures into subpartitions (e.g. instead of one 128x128 texture, split it into four 64x64 ones) while boosting the vertex count. It's counterintuitive, but could it make using VBOs worthwhile?

