::sigh:: I hate to further this derailment, but it's breaking my heart to see some folks still praising "the emperor's new clothes".
I didn't want or expect Graf to take things this way, and I'm almost sorry I ever said anything at all, but someone had to. Given the way he bashed folks, I expected him to take my bashing in stride, and perhaps try one of the options I mentioned for testing his code... (perhaps he did, and found out what I knew: BUGS!)
The fact is that no one else really had any experience with OpenGL coding, so no one could stand up to Graf's statements (or cared to). The proof is in the pudding -- on many cards where other GL games run fine, GZDoom doesn't. Graf used his singular knowledge as leverage for an argument from authority, claiming that ATI drivers / hardware suck (they most certainly ROCK just as hard as, if not harder than, Nvidia / Intel), and that Doom is SO different from all other games that it can't be done any better than his way...
Seriously, do you think a company stays profitable SELLING top-end graphics hardware that is utterly broken?
Even other games that do things very similarly to the way Graf's engine does them (like glboom+, which Spleen mentioned) work fine on hardware that GZDoom has difficulty running on.
All I was really trying to do was to get Graf to admit that it's not really just ATI or Intel graphics hardware that is to blame for most of his bugs on those platforms.
Fact is: GZDoom was never tested against the OpenGL Sample Implementation (Reference Renderer): http://oss.sgi.com/projects/ogl-sample/
This software-only OpenGL driver lets you debug the guts of your OpenGL calls and find out why you're getting errors. If your code runs without throwing errors on the OGLSI, then it really might be a hardware issue... GZDoom has serious crashing issues when using the OGLSI renderer, so you most certainly can NOT claim it's the hardware's fault -- the debug build of the OGLSI plus a debugger tells you exactly which line of your code has the errors!
Try it. Every GL developer is encouraged to have this available (since it's open source), and most graphics hardware companies won't even talk to you about bugs unless you've got your code running on the OGLSI first. It's the second thing I fire up when I'm looking for my GL bugs (the first is memcheck).
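For what it's worth, you don't even need the SI to start catching this class of bug: wrapping suspect GL calls so glGetError gets polled after each one will flag bad parameters on any driver. Here's a minimal sketch (the GL_CHECK macro is my own illustration, NOT anything from GZDoom's source):

[code]
/* Illustrative only -- a tiny error-checking wrapper for GL calls.
   glGetError returns the first error recorded since it was last
   called, so polling it right after a call pins the blame down. */
#include <GL/gl.h>
#include <stdio.h>

#define GL_CHECK(call)                                              \
    do {                                                            \
        call;                                                       \
        GLenum err = glGetError();                                  \
        if (err != GL_NO_ERROR)                                     \
            fprintf(stderr, "GL error 0x%04X after %s (%s:%d)\n",   \
                    (unsigned)err, #call, __FILE__, __LINE__);      \
    } while (0)

/* Usage: GL_CHECK(glBindTexture(GL_TEXTURE_2D, texid)); */
[/code]

It's a blunt instrument compared to the SI's debug build, but it costs nothing, and it would have caught a lot of what I saw.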
Now, I'm not saying that Graf had to test his code on this, but I'll be damned if I'm going to stand by while someone spreads FUD about ATI and Intel hardware just because he's not sure why his code is crashing in the first place, and won't take the time to find out! I've been lurking for years, but today I finally snapped.
No one has to trust me. Go out and get some FREE testing tools and run them against GZDoom. If you do, you'll see some scary-ass shit that makes experienced coders run for the hills screaming, "NOoooo!"
I saw the hordes of evil memory-munching bugs lurking just out of everyone else's sight, and I didn't run for the hills... I chose to join a team where I thought I could make the most difference. Give me a chance, guys, I'm here to help, and I won't ever ragequit the community. (I took a position @ work so I could work four tens and have a whole day off just for coding on ST! -- which I knew would piss off my old boss, a dear friend: "There's just something I need to do")
Also: When the new gal or guy @ work points out a bug in my code, I smile and buy them lunch as thanks!
This is why NO ONE should blame anyone for NOT working on GZDoom with him... I was willing to at one point, but I lurked in the forums for a few years, and I realized that he wasn't the type of dude I wanted to work with. Devs should help people, not belittle them and turn a blind eye -- especially not when everyone is screaming about the mysterious effects of a Bugosaurus Rex. That's like sticking your head in the beast's gaping maw and saying: "Well, it's not eating ME yet!"
(It's not so much the memory leaks as the use of pointers to memory that's already been freed, and the reading of variables that were never initialized -- that's C 101, chapter 1, page 1, first paragraph: Using Variables!)
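To make that concrete, here are toy versions of both mistakes (illustrative only -- NOT code lifted from GZDoom). memcheck flags each one loudly, e.g. via valgrind --tool=memcheck --track-origins=yes ./yourprogram:

[code]
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* 1. Use-after-free: p still points at memory we gave back. */
    int *p = malloc(sizeof *p);
    *p = 42;
    free(p);
    printf("%d\n", *p);   /* memcheck: "Invalid read of size 4" */

    /* 2. Uninitialized read: x is used before it's ever assigned. */
    int x;
    if (x > 0)            /* memcheck: "Conditional jump or move
                             depends on uninitialised value(s)" */
        puts("positive?");

    return 0;
}
[/code]

Both will compile and may even appear to run fine, and both blow up in ways that look exactly like "mysterious driver bugs" to anyone who hasn't run the tools.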
When people complain about something enough, even Microsoft will eventually own up and look into the issue... (Ta-da! Windows 7, produced in record time!)
Graf shouldn't be re-labeling GZDoom as "Nvidia only" -- it doesn't even strictly follow the OpenGL spec, OR the best practices of the 3D software industry... IMHO it should just bear an "Alpha" or "Experimental" tag: use at your own risk. I know it hurts when someone calls your baby ugly, but that's no excuse for insisting its shit doesn't stink.
And just so this shit isn't 100% off topic: the fastest port depends on which wad you're running, since certain wads are made for certain ports. Keep in mind that a port that runs faster on your machine might run slower on someone else's, and certain ports are still in the alpha stage...