Betcha

OpenGL vs DX 8.x


Everyone knows that John Carmack likes to code in OpenGL as opposed to DirectX. The Quake series is proof of this, and I assume Doom 3 will be OpenGL as well? The question is this:
With DX8 offering many new improvements, can OpenGL still offer the superior feature set and performance? Short and sweet.


Well, obviously, JC has settled on OpenGL as his standard. In fact, if you check his .plan file, you'll find that most of his entries are recommendations (bah! orders) for the new Nvidia, 3dfx and ATI products. DX8 and OpenGL have their own sets of advantages and disadvantages, but I can honestly say they perform almost equally. DX was designed to unify the tons of drivers and functions that each video card presented (back when dinosaurs stormed the Earth and the Trident 1024 was the generic video card). DX analyzed your hardware, selected the instructions that could be performed by the card (HAL), and the rest were emulated by the CPU (HEL). Today's DX8 is more an OpenGL rival than a hardware unifier. It has its own set of instructions that probably run on the HAL in maybe 15% of the video cards on the market, if not less than 1% of the total.
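
To illustrate the HAL/HEL point: under DX8 you can ask the runtime what the HAL device actually supports before relying on it. A minimal C++ sketch against the stock d3d8.h headers; the specific caps checked here are arbitrary examples, not anything a particular engine requires.

    #include <d3d8.h>   // DX8 SDK headers; link with d3d8.lib

    bool CardDoesItInHardware()
    {
        IDirect3D8 *d3d = Direct3DCreate8(D3D_SDK_VERSION);
        if (!d3d) return false;                      // DX8 runtime not installed

        D3DCAPS8 caps;
        // Query the HAL device: whatever isn't reported here has to be
        // worked around or skipped by the game.
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        {
            d3d->Release();
            return false;                            // no HAL device at all
        }

        bool hwTnL = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT) != 0;
        bool ps11  = caps.PixelShaderVersion  >= D3DPS_VERSION(1, 1);
        bool vs11  = caps.VertexShaderVersion >= D3DVS_VERSION(1, 1);

        d3d->Release();
        return hwTnL && ps11 && vs11;   // the "15% of cards" problem in one line
    }

Anything that fails a check like this either gets a fallback path in the game or simply isn't used.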
Anyway, we're just talking about the renderer. For the user/machine interface, id has always used DirectInput, and for the people without EAX or A3D, DirectSound is the choice. I'm sure there's a little DirectPlay under the networking code, too.
Doom 3 will obviously need DX8 or later installed, but it will render through the OpenGL/Glide .dlls.


OpenGL is the KING!
In a recent interview, Croteam said that with every release DirectX is getting closer to the KING, OpenGL!!
And JC chose it, for God's sake!
The only problem with it is the drivers!
ALL the bugs can be fixed in drivers in OpenGL!
And there are some companies (like ATI) that still CAN'T write a decent OpenGL driver!
A company like Nvidia has a PERFECT OpenGL driver!
Bottom line: with the release of DirectX 8, Microsoft got one step closer, but I think the KING's chair will ALWAYS belong to OpenGL!


Some companies, such as 3dfx and ATI, do have to release and re-release drivers every now and again for improved performance.

Oh, and the nVidia problem with UT? There's a fix for the GeForce 2 that takes care of all the missing textures and performance issues.

Nobody's perfect. ;)

Still, I'm stuck on a Voodoo 3. I'm thinkin' 'bout gettin' a GF2MX cheap so I can play UT with hi-res textures and in 32-bit color now that all the drivers have been perfected.

Hell, in Q3A just going from a Voodoo 3 to a GF2MX will let me crank all the features up, switch to 32-bit color, and STILL get a higher framerate. :)

OpenGL probably will always be king, though... but you never know in this industry. Just a scant few years back we all thought we'd always be buying the latest 3dfx card... then it turned out that the 4s and 5s completely suck cock.

Dima said:

The only problem with it is the drivers!

Mmh, you're right, Dima, about the driver issues. It's the main problem with OpenGL, but that's because OGL is the opposite of DX. Instead of building generic drivers for base instructions and extra DLLs for advanced features, they code new drivers for every new card. I have friends who need to swap between different Detonator drivers in order to play various games.
Each game has its own set of problems with every video card available, and each company constantly updates its drivers to support the new games correctly.
It's madness!! We're going back to the pre-DX era.
I hope they change their line of thought quickly, or soon we'll be installing a "pseudo-DirectX" API for OpenGL.
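
To make the "new drivers for every new card" point concrete: OpenGL games of this era typically sniff the driver strings at startup and branch on them. A rough C++ sketch; the RAGE 128 workaround flag is a made-up example, not a real documented bug.

    #include <GL/gl.h>
    #include <cstring>
    #include <cstdio>

    // Call once a GL context is current. Decides per-vendor workarounds
    // the way engines of this era do it: by string-matching.
    void SniffDriver()
    {
        const char *vendor   = (const char *)glGetString(GL_VENDOR);
        const char *renderer = (const char *)glGetString(GL_RENDERER);
        const char *version  = (const char *)glGetString(GL_VERSION);
        const char *exts     = (const char *)glGetString(GL_EXTENSIONS);

        printf("GL_VENDOR:   %s\n", vendor);
        printf("GL_RENDERER: %s\n", renderer);
        printf("GL_VERSION:  %s\n", version);

        // Feature detection: the extension string is the OpenGL
        // equivalent of a DX caps check.
        bool hasMultitexture = exts && strstr(exts, "GL_ARB_multitexture") != 0;

        // Hypothetical workaround flag, purely for illustration:
        // avoid a path that a particular driver supposedly gets wrong.
        bool avoidSomePath = renderer && strstr(renderer, "RAGE 128") != 0;

        (void)hasMultitexture;
        (void)avoidSomePath;
    }

It works, but every new board or driver revision means another string to special-case, which is exactly the madness being complained about.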


"Oh, and the nVidia problem with UT? There's a fix for the GeForce 2 that takes care of all the missing textures and performance issues"

1) What are you talking about??
I don't remember this problem!
I only know about a patch that makes the second disc of UT useful!
And UT patches that improve D3D performance!
Is the problem you mean related to the GeForce 2 or the GeForce 1?
2) Yes, Zaldron, you're right!
But with my CLAP (Creative Labs Annihilator Pro, based on the GeForce DDR) and the 6.47 drivers, games work PERFECTLY!
Your friend must be having some weird problems with his Windows!


There's a GeForce 2 patch for Unreal Tournament that's supposed to fix the missing texture problems and speed up the OpenGL support. As far as I know, that is. I don't have a GeForce 2 (yet), so I can be understandably apathetic about the issue. :)

However, I remember hearing a UT developer state that he will not implement DXTC in D3D. Strange... I guess it's just for Voodoo 4/5 owners or OpenGL users, then.

From what I hear, GL and D3D support in UT completely sucks ass since the engine was designed for Glide. What a shame... then again, with 1024x768 textures I think I can deal with it when I switch to nVidia! :)

But hey, Unreal 2 is being aimed at nVidia, so there's good news!
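
For reference, DXTC and S3TC are the same block-compression formats, so on the OpenGL side supporting it mostly means checking for the extension and handing the pre-compressed blocks straight to the driver. A rough C++ sketch; the constants are written out in case the local glext.h predates them, and UploadDXT5 is just an illustrative helper name.

    #include <windows.h>
    #include <GL/gl.h>
    #include <cstring>

    // S3TC/DXTC internal formats (values from the EXT_texture_compression_s3tc spec)
    #define GL_COMPRESSED_RGB_S3TC_DXT1_EXT  0x83F0
    #define GL_COMPRESSED_RGBA_S3TC_DXT5_EXT 0x83F3

    // Entry point from ARB_texture_compression; fetched at runtime on Windows.
    typedef void (APIENTRY *PFNGLCOMPRESSEDTEXIMAGE2DARB)
        (GLenum target, GLint level, GLenum internalformat,
         GLsizei width, GLsizei height, GLint border,
         GLsizei imageSize, const void *data);

    bool UploadDXT5(const void *blocks, int w, int h)
    {
        const char *exts = (const char *)glGetString(GL_EXTENSIONS);
        if (!exts ||
            !strstr(exts, "GL_ARB_texture_compression") ||
            !strstr(exts, "GL_EXT_texture_compression_s3tc"))
            return false;   // card/driver can't take compressed uploads

        PFNGLCOMPRESSEDTEXIMAGE2DARB glCompressedTexImage2DARB =
            (PFNGLCOMPRESSEDTEXIMAGE2DARB)wglGetProcAddress("glCompressedTexImage2DARB");
        if (!glCompressedTexImage2DARB)
            return false;

        // DXT5 stores 4x4 texel blocks at 16 bytes each.
        GLsizei imageSize = ((w + 3) / 4) * ((h + 3) / 4) * 16;
        glCompressedTexImage2DARB(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                                  w, h, 0, imageSize, blocks);
        return true;
    }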

Dima said:

What are you talking about?? I don't remember this problem!

Yep, deadnail's right. I remember that when the first GeForce came to market (I think it was the ELSA Gladiac), UT suffered from a weird Z-buffer problem: you could see through walls and the decals got screwed up. Some textures vanished, too.

One of my friends has a GeForce 2 on Windows 2000 (imagine...), and you couldn't believe the amount of problems he suffered. Nothing that a format C:\ /S couldn't fix.
I have another friend with a Creative TNT2 Ultra (don't remember the exact model name) who needs to switch Detonators to play Elite Force and NFS 5, but he's still on Win98, so I can't understand why.

deadnail said:

But hey, Unreal 2 is being aimed at nVidia, so there's good news!

1) No matter how well the UT engine is optimised for Nvidia cards, it will NEVER reach the same performance as Quake 3!
Because it was originally built with Glide in mind, and because the upgrade will still run under Direct3D, which SUCKS!


The next-generation Unreal engine is aimed at Nvidia tech. I hope they rewrite the inner workings of the core/renderer. You're right: no matter how they code the OpenGL.dll or the D3D.dll, the engine logic was founded on the software and Glide APIs. Until Unreal 2 comes on the scene, Quake is the winner; JC has been building on the same code base since Quake 2.

Speaking of the next-gen Unreal... they have bones on the meshes, OK, that's good. They have greater polycounts and LOD, that's good too.

But they haven't implemented NURBS, or even shaders, and they don't plan to include any bump mapping.
If Unreal 2 takes much longer to show up, it will look horrific compared to Doom 3. The only reason I'd buy it is that the fps will be higher.
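
Worth noting that bump mapping on GeForce-class hardware doesn't even need shaders; the DOT3 texture-environment extension is enough. A minimal C++ sketch of the setup, assuming the normal map is bound on the active texture unit, the tangent-space light vector is packed into the vertex colour, and GL_EXTENSIONS has already been checked for ARB_texture_env_dot3 (or the EXT variant), as in the sniffing sketch above.

    #include <GL/gl.h>

    // Enums from ARB_texture_env_combine / ARB_texture_env_dot3,
    // spelled out in case the local headers predate them.
    #define GL_COMBINE_ARB        0x8570
    #define GL_COMBINE_RGB_ARB    0x8571
    #define GL_SOURCE0_RGB_ARB    0x8580
    #define GL_SOURCE1_RGB_ARB    0x8581
    #define GL_PRIMARY_COLOR_ARB  0x8577
    #define GL_DOT3_RGB_ARB       0x86AE

    // Configure the texture environment so the normal map (texture) is
    // dotted with the light vector (primary colour), both in the usual
    // 0..1 biased encoding. The result is per-pixel N.L lighting.
    void SetupDot3Bump()
    {
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
        glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB,  GL_DOT3_RGB_ARB);
        glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB,  GL_TEXTURE);
        glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB,  GL_PRIMARY_COLOR_ARB);
    }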


1) Did you see the new Unreal technology movie?
2) The article on Daily Radar claims that the upgraded technology can render a character made of 3,700 polygons, but I heard that Julie from Heavy Metal: F.A.K.K. 2 is made of 5,000!!
3) Well, it will be interesting to see what they can come up with!
I don't believe the upgrade will be anything significant...

Guest JudgeDooM
deadnail said:

Still, I'm stuck on a Voodoo 3. I'm thinkin' 'bout gettin' a GF2MX cheap so I can play UT with hi-res textures and in 32-bit color now that all the drivers have been perfected.

Damn, you're still stuck on a Voodoo 3? Wow, indeed, it's time for a change. I did just that. Now I have a 3D Blaster GeForce 2 GTS, and, man, does it feel good!! I was sick of that Voodoo 3, which displays 8-bit textures and where you see a distinct line separating different tones of a color, if you know what I mean. The Voodoo 3 has been quite a bad experience...

Guest Dream Destroyer

I have a Voodoo 1......

Guest JudgeDooM

Ow, that's even worse!! Also time for a change for you :)

JudgeDooM said:

I was sick of that Voodoo 3, which displays 8-bit textures and where you see a distinct line separating different tones of a color, if you know what I mean.

Ooh, I REALLY know what you mean!
I was looking at the Q3A model skins with Pakscape (a pk3 utility) and I couldn't believe the quality I was missing.
Each skin is a 24-bit TGA, while Glide reduces the color depth to 16.
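
That banding is just arithmetic: a 16-bit pipeline keeps only 5:6:5 bits per channel, so neighbouring 24-bit tones collapse into the same value. A tiny self-contained C++ illustration, not anyone's actual code:

    #include <cstdio>

    // Squeeze an 8:8:8 colour into the 5:6:5 layout a 16-bit framebuffer uses.
    unsigned short ToRGB565(unsigned char r, unsigned char g, unsigned char b)
    {
        return (unsigned short)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    int main()
    {
        // Two neighbouring 24-bit tones from a smooth gradient...
        unsigned char a[3] = { 100, 150, 200 };
        unsigned char b[3] = { 103, 151, 204 };

        unsigned short pa = ToRGB565(a[0], a[1], a[2]);
        unsigned short pb = ToRGB565(b[0], b[1], b[2]);

        // ...end up as the same 16-bit value, which is exactly the
        // "distinct line between tones" banding you see on screen.
        printf("%04X vs %04X -> %s\n", pa, pb, pa == pb ? "identical" : "different");
        return 0;
    }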

Dima said:

The article on Daily Radar claims that the upgraded technology can render a character made of 3,700 polygons, but I heard that Julie from Heavy Metal: F.A.K.K. 2 is made of 5,000!!

I knew the models in FAKK2 were good and all, but I didn't know they were 5,000-fucking-polys good! Wow! No wonder that ass looked so round... wait, I've only got a P3-450, 256 MB of SDRAM, and a Voodoo 3... how the hell did I get such a smooth framerate?

Quake 3 engine. Nuff sed, I guess. ;)

Anyway, Unreal was designed from the ground up to be fully utilized by 3dfx cards, since the Voodoo 2 was in its heyday during development.

Unreal 2 is going to have the graphics side redesigned from the ground up, with nVidia in mind this go-around.

Zaldron said:

I remember that when the first GeForce came to market (I think it was the ELSA Gladiac), UT suffered from a weird Z-buffer problem: you could see through walls and the decals got screwed up.

The first two GeForce cards released were the Creative Annihilator (Pro?) and the Hercules one, I think (I'm certain about the Creative one).

Guest fraggle`
Dima said:

And JC chose it, for God's sake!

John Carmack said it, so it must be right!! Carmack is a smarty man! And DirectX is a clown boat!

Guest ReelExterminato
Zaldron said:

But they haven't implemented NURBS, or even shaders, and they don't plan to include any bump mapping.

Yeah, I think... er, know that the Doom 3 engine will be better than Unreal 2, but how on earth can you put NURBS natively in a hardware-accelerated engine??
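
You don't put them on the card directly; the usual trick is to tessellate the patches into plain triangles on the CPU and feed those to the hardware. OpenGL has shipped a NURBS evaluator in GLU for years that does exactly this. A minimal C++ sketch; the knot vector and empty control net are placeholder data, not from any real engine:

    #include <GL/gl.h>
    #include <GL/glu.h>

    // Draw one bicubic NURBS patch. GLU tessellates it into triangles on
    // the CPU and feeds ordinary vertices to the card, so the "hardware"
    // part of the engine never sees a curved surface at all.
    void DrawPatch()
    {
        static GLfloat ctrl[4][4][3];          // 4x4 control net (fill in real data)
        static GLfloat knots[8] = { 0, 0, 0, 0, 1, 1, 1, 1 };

        GLUnurbs *nurb = gluNewNurbsRenderer();
        gluNurbsProperty(nurb, GLU_SAMPLING_TOLERANCE, 25.0f);   // tessellation fineness

        gluBeginSurface(nurb);
        gluNurbsSurface(nurb,
                        8, knots,              // u knots
                        8, knots,              // v knots
                        4 * 3, 3,              // strides between control points
                        &ctrl[0][0][0],
                        4, 4,                  // order in u and v (cubic = 4)
                        GL_MAP2_VERTEX_3);
        gluEndSurface(nurb);

        gluDeleteNurbsRenderer(nurb);
    }

Curved surfaces in Quake 3 work along the same lines: the Bezier patches are tessellated into triangles by the engine, and the card only ever sees triangles.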

Guest ReelExterminato
Charon said:

Hehehe... he said 'Trident'...

Hehe... I still have an 8900C ISA card lying around somewhere :)

I don't think there is any DirectPlay involved, as he'd want as much control over the networking as possible, and he's never supported Glide.
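
For context, "control over the networking" in a Quake-style engine usually just means talking raw UDP through the platform's socket API instead of going through DirectPlay. A bare-bones Winsock sketch; the address and port are placeholder examples:

    #include <winsock2.h>   // link with ws2_32.lib
    #include <cstring>

    // Fire a single unreliable datagram, the building block of
    // Quake-style client/server traffic.
    bool SendDatagram(const char *payload, int len)
    {
        WSADATA wsa;
        if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0)
            return false;

        SOCKET s = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
        if (s == INVALID_SOCKET) { WSACleanup(); return false; }

        sockaddr_in to;
        std::memset(&to, 0, sizeof(to));
        to.sin_family      = AF_INET;
        to.sin_port        = htons(27960);                 // example port only
        to.sin_addr.s_addr = inet_addr("127.0.0.1");       // example address only

        int sent = sendto(s, payload, len, 0, (sockaddr *)&to, sizeof(to));

        closesocket(s);
        WSACleanup();
        return sent == len;
    }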

Guest Daemoneon
JudgeDooM said:

The Voodoo 3 has been quite a bad experience...

I'm sorry, man, but the Voodoo3 is still a really nice graphics processor. I mean, compare it to a lot of standard factory-installed graphics cards (I don't like ATI), and 16 MB is still a decent amount of memory for a video card. The core clock speed may be a little slow for some people's liking, but there are many good overclocking programs that will work wonders here, with very few side effects, if any. So don't dismiss old technology as useless. Remember, you can't play your old Beatles records (or your White Zombie collector's edition LPs) on a DVD player.

ReelExterminato said:

Hehe... I still have an 8900C ISA card lying around somewhere :)

Kewl! I think I have one too...

Now where is that frickin' thing? It probably works better than this crap ATI card I have now.

This topic is now closed to further replies.