Reaper978

OpenGL rendering = occasional screwiness


Every once in a while when I am using a source port with OpenGL rendering, there will be some magic wall or floor somewhere that appears as if there were no texture on it - it is open to the "void" and looks completely messed up. This kinda ticks me off because I know the particular wad works fine in software mode (and I have seen this happen in multiple wads).

So... what is done about this? Nothing? Or am I missing something?


If this happens with different programs it's most likely a hardware or driver problem.

What's your graphics card and operating system?


Vista Service Pack 2
Mobile Intel 945 Express chipset

I have DirectX 10 installed. Windows says I have the latest version of the video card drivers.

Well, at least I now know that this is a problem on my end and not the software. Thanks for letting me know.


I get strange GL artifacts sometimes with the same hardware under Linux. I'd describe mine as a flat which stretches and appears to get "sucked into" the vanishing point that the 3D perspective uses as a center.

I don't know the state of the Intel drivers under Windows, but the Intel developers have been making a lot of fundamental implementation changes under Linux over the last 6-12 months. The two may be related.

Try a different version of the graphics driver? Perhaps the one that originally came with your PC if it's over a year old or so.

Reaper978 said:

Mobile Intel

Say no more, the culprit is found.

Burn your laptop, then bury it in consecrated ground after cramming garlic cloves in its DVD drive, and buy a new computer untainted by Intel's graphics chipsets. That'll solve all your problems.


Say no more, the culprit is found.

Yeah, it isn't the greatest gaming system on the planet, that's for sure.

I think it has to do with the fact that the source port can't correctly do software-like effects in OpenGL.

I was thinking something like this may be the case. Here is an example:

In the "Stronghold" level in Evilution, down one of the initial hallways there is apparently supposed to be a "window" behind which two shotgun guys stand. It appears completely transparent in software mode, but in OpenGL it has this weird, foggy distortion effect (which is what the problem has been). I saw the same problem in the "Arachnophobia" map from Hell Revealed: after the initial lift, going forward and to the right reveals a large hall with the same problem.


GLBoom (and GLBoom-plus) break the pullover sky effect that makes skybases possible.

I can't tell whether this is a GL issue or not, because the effect works in PrBoom (software), but it also works in GZDoom (which is OpenGL).

Does this happen to anyone else?


Mobile Intel is fine for GL ports of Doom. Hell, even Unreal Tournament 2004 on default settings runs surprisingly smoothly.

heh


The latest mobile Intels are actually surprisingly powerful for what they are, and have none of the silly problems that plagued the first series, like offloading T&L to the CPU.

The X3100/965 is roughly equivalent to a GeForce 6200. AMD's latest integrated graphics in the RS880 are about the level of a 6800GT. Keep in mind these are both cards that Doom 3 was review-benchmarked against.


Not so for the 945 he's stuck with, though. In terms of performance, that's pretty much on par with an S3. The CPU and a large amount of (shared) video RAM may hide its inadequacy up to a point (e.g. give it the ability to run OpenGL and Direct3D games as if it were really a 3D accelerator), but glitches and shortcomings (usually in the drivers alone, as T&L for OpenGL/Direct3D is entirely emulated in the drivers) will soon take their toll.

Maes said:

That's pretty much on par with an S3, in terms of performance.

Oh rubbish.

A 945 gets framerates similar to an FX5600+ or Radeon 9600. The lack of hardware T&L or vertex shaders may hold it back in more modern games, but last time I checked, running a Doom port at 60fps wasn't exactly processor-intensive by today's standards.

Driver issue yes, performance issue no.

Super Jamie said:

A 945 gets similar framerate to an FX5600+ or Radeon 9600.


On a 3DMark-like test? At processor parity? And versus any card with dedicated video RAM? Sorry, but I really need to call bullshit on that one. The fact that the CPU may mask many of its shortcomings doesn't mean it's suddenly an equal of dedicated graphics cards.

As anecdotal proof, here's a benchmark from a blog:

http://komku.blogspot.com/2007/10/acer-aspire-4710-3dmark03-score.html

If you don't wish to read the whole story: the score was 1328 3DMarks in 3DMark03 at default settings. That is, on a laptop with a 533 MHz FSB and a dual-core CPU.

For comparison, my single-core Athlon64 3200+ with a Sapphire Radeon 9600XT with 256 MB scores just short of 4000 3DMarks on the same test on default settings, and my shitty Athlon XP 1400+ laptop with an integrated Radeon IGP320 (roughly equivalent to a Radeon 7000) scores about 600 3DMarks. That is, with a notable processor and FSB disparity, and on a laptop that's 8 years old.

So yeah... it may perform like an FX5600+, with a powerful CPU to back it up! Wake up, people: this shit is entirely software/driver-powered, there are no hardware shaders or pipelines.

Update: Actually, not even that. Here's a test performed on the same hardware at two different times, once with the integrated GMA 950, and once with an XFX Nvidia 6200 (which only has 32 MB of real video RAM; the rest is shared).

http://www.extremetech.com/article2/0,2845,1821804,00.asp

Simply put, the GMA gets its ass filled with concrete, except in CPU-intensive games like Flight Simulator 2004.


By comparison, my P4 overclocked to 3.13 GHz with an FX5200 scores 1008 in 3DMark03 at default settings. And that was with Catalyst 9.3 or something, a lot later than what was available in October 2007.

Laptop wins.

Super Jamie said:

By comparison, my P4 overclocked to 3.13 GHz with an FX5200 scores 1008 in 3DMark03 default settings. And that was with Catalyst 9.3


[sarcasm]Catalyst on a vintage nVidia card? No wonder it didn't work well [/sarcasm]

Laptop wins.


With an extra core to spare and to take the role that the GPU can't, sure it can... by a little.

But in the end, you get the "performance" of a 6-year-old, quite crippled low-end graphics card only thanks to massive CPU overhead. That doesn't mean the GMA is better or has suddenly become a better graphics card overnight. It just means that in this case, polishing a turd is apparently possible. That is, until you step on it. ;-)

Hell, even the lowly FX5200 stands its ground against the GMA at less than half the CPU power. Remember, with the GMA, 3DMark03 is no longer GPU-bound (actually, there's no real GPU) but CPU-bound, and even then it barely manages to outperform a low-end graphics adapter from 6 years ago.

Anyway, to cut the bullshit factor:

http://en.wikipedia.org/wiki/Intel_GMA#GMA_950

It's clearly stated there that there's no geometry processing, no 3D hardware, no T&L, only MPEG/movie acceleration. Compared to an early S3 Savage it may have a faster fill rate thanks to higher clock speeds (similar to how there were faster VGA cards with faster memory/interfaces even when "hardware acceleration" was unheard of on PCs), but nothing more. It's incomparable even to 2002-era tech like the Mobility Radeons.


Well, around the time of Catalyst 9.3, as I upgraded at that time :P Still, the point is even more valid: major nVidia driver releases usually provide a decent performance jump, and they certainly would over 2 years.

And not really, for the rest. An ancient S3 won't support any good DX7/8/9 features; the best you can hope for is a good framerate in Quake 2 or something. The Intel will support them, it just passes the operations off to the CPU.

We obviously have differing opinions and none of this really helps the original poster. I still think you're giving the GMAs too much crap, they are more capable than you're making out. Maybe not for running Crysis, but for GL Doom ports they're more than enough.

Super Jamie said:

We obviously have differing opinions and none of this really helps the original poster. I still think you're giving the GMAs too much crap, they are more capable than you're making out. Maybe not for running Crysis, but for GL Doom ports they're more than enough.



I don't think so. They may be 'enough' to run a Doom port but they are still at the very bottom end of available hardware.

1.5 times the speed of a GF5200 - and that with a modern processor?

That's 1.5 times the speed of a really shitty product, which still makes it shitty.

Super Jamie said:

Still, the point is even more valid, major nVidia driver releases usually provide a decent performance jump.

Super Jamie said:

And not really for the rest. An ancient S3 won't support any good DX7/8/9 features, the best you can hope for is a good framerate in Quake 2 or something. The Intel will support them, just pass the operations off to CPU.


Drivers will only actually bring a benefit in the case of hardware that was there but underutilized, which is not the case for the GMA 950 and older.

The GMA is more or less the graphics-card equivalent of a softmodem, so driver updates will only provide a performance boost if they were broken to begin with, and new features can only be added by offloading even more work to the CPU.

E.g. my integrated Radeon IGP320 had no hardware T&L (which prevented it from playing some games), but a driver update "magically" added the capability... sure, it allowed me to play a bit more shit than before instead of being presented with a "YOU CANNOT PLAY THAT U FUX0R!" error message, but that was about it.
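Since software T&L keeps coming up: the "transform" part is, at its core, nothing more than a 4x4 matrix-vector multiply per vertex, which is exactly the work the driver dumps on the CPU when there's no geometry hardware. A toy sketch of that per-vertex math (purely illustrative, not actual driver code):

```python
def transform_vertex(m, v):
    """Multiply a 4x4 row-major matrix (nested lists) by a vertex (x, y, z, w).

    This per-vertex multiply is the heart of the "T" in T&L; with no
    geometry hardware, the driver runs loops like this on the CPU.
    """
    return tuple(sum(m[row][col] * v[col] for col in range(4))
                 for row in range(4))

# Identity matrix with a translation of +10 along the x axis.
translate_x = [[1, 0, 0, 10],
               [0, 1, 0, 0],
               [0, 0, 1, 0],
               [0, 0, 0, 1]]

print(transform_vertex(translate_x, (1, 2, 3, 1)))  # (11, 2, 3, 1)
```

Multiply that by thousands of vertices per frame and it's obvious why a "GPU" that does this on the CPU leans so hard on processor speed.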

Super Jamie said:

I still think you're giving the GMAs too much crap, they are more capable than you're making out. Maybe not for running Crysis, but for GL Doom ports they're more than enough.


Sure, nobody prevents you from running an OpenGL game on such a card, but it's really a bad idea, since you get no hardware acceleration at all: you're just using a software OpenGL renderer that's external to the application, instead of the built-in software one. Sure, the "soft-GL" renderer may be a bit faster than the built-in one (or noticeably slower), and you will get, sort of, effects like texture filtering, true color, dynamic lighting etc., but none of the benefits of running on real hardware. It's just like using an OpenGL emulator.
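A quick way to tell whether you've landed on one of these software paths is to look at the GL_RENDERER string the driver reports (glxinfo prints it on Linux, and many ports log it at startup). Assuming you've already grabbed that string somehow, a rough check like this flags the usual software renderers; the substrings are illustrative, not an exhaustive list:

```python
# Substrings that commonly show up in GL_RENDERER when there is no real
# hardware acceleration behind the context (illustrative, not exhaustive).
SOFTWARE_HINTS = ("gdi generic", "software", "llvmpipe", "softpipe")

def is_software_renderer(renderer):
    """Return True if a GL_RENDERER string looks like a software renderer."""
    name = renderer.lower()
    return any(hint in name for hint in SOFTWARE_HINTS)

print(is_software_renderer("GDI Generic"))  # True: Windows' GL 1.1 fallback
print(is_software_renderer("Intel 945GM"))  # False: the driver claims hardware
```

If you see "GDI Generic" on Windows, you're on Microsoft's unaccelerated fallback and a driver (re)install is the first thing to try.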

Super Jamie said:

We obviously have differing opinions and none of this really helps
the original poster.


Listing the actual technical specs of any piece of machinery/equipment is hardly a matter of opinion, although their usefulness in a given context may be debated. IMHO, the GMA and other such "graphics cards" just give you a sort of courtesy compatibility: they don't leave you entirely dry like a very old graphics card would, but they are incomparable to any real hardware solution.

As for the original poster, he should realize that integrated graphics have their limits. Of course, as he probably can't change the graphics card since he's using a laptop, the only thing to try is increasing the amount of video RAM (many of those integrated chipsets default to some amount like 8 or 16 MB of shared memory, while even modest OpenGL and DirectX games typically require much more, at least 32 or 64 MB).

In some implementations you can statically change the amount of shared video memory from a setting in the BIOS, others however use a sort of "dynamic memory management" and there's no obvious way of changing or controlling it. Some games barf on such setups, others run with glitches.
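To put rough numbers on that "at least 32 or 64 MB": even before a single texture is loaded, the framebuffers alone eat a chunk of the shared memory. A back-of-the-envelope estimate (the resolution and buffer count here are just illustrative):

```python
def framebuffer_bytes(width, height, bytes_per_pixel=4, buffers=3):
    """Rough cost of the framebuffers alone: front + back color buffers
    plus a depth buffer, each at width * height * bytes_per_pixel.
    Textures come on top of this."""
    return width * height * bytes_per_pixel * buffers

mb = framebuffer_bytes(1024, 768) / (1024 * 1024)
print("%.0f MB for buffers alone at 1024x768" % mb)  # 9 MB
```

So an 8 MB default allocation is already blown at 1024x768 in 32-bit color before textures enter the picture, which is why bumping the shared amount up can matter.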

Anyway, in the vein of helping the OP somehow: surprisingly, the amount of shared video RAM doesn't seem to be user-adjustable on most modern laptops (seriously, WTF is up with that?). In any case, in this forum someone suggested changing the "driver memory footprint" from the "3D settings", or even needing a BIOS upgrade to set it explicitly. Otherwise you may be stuck with dynamic management, which just barfs on some games. Also, some games may not even run with GMA video adapters, or will run with glitches.

