Julian

GeForce FX Stamped


According to an article on ign.com, id Software announced that GeForce won the golden ticket... I mean: sticker.

id Software announced today that they officially recommend the NVIDIA GeForce FX series of graphics cards for DOOM 3.
To emphasize the endorsement, a special sticker has been designed for GeForce FX packaging indicating that the product is "Recommended by id Software for DOOM 3."

I couldn't find anything about it on id's site. However, the fact that Carmack et al. have reached a final decision on this issue could suggest that a release is imminent.

EDIT: There seems to be more information on a page nVIDIA has set up regarding Doom 3.

Julian said:

However, the fact that Carmack et al. have reached a final decision on this issue could suggest that a release is imminent.

This final decision probably involved "money" and "ATI having leaked an alpha".

Fredrik said:

This final decision probably involved "money" and "ATI having leaked an alpha".

I love how mean you can be ;)


Carmack has always said that NVIDIA's OpenGL drivers are his "gold standard" for development, and the fact that NVIDIA actively releases cross-platform drivers for Linux, FreeBSD, and Mac OS also bolsters their case.

I'm still an NVIDIA fanboy though; I'm sure Arioch will come in and yell some ATI propaganda too.

mewse said:

i'm still an nvidia fanboy tho, i'm sure arioch will come in and yell some ATI propaganda too


In that case, we'd better come prepared, so: nVidia ownz!

And yes, I have an FX 5950 Ultra... 3DMark03 score of 6040... suck on it!

c-cooper said:

In that case, we'd better come prepared, so: nVidia ownz!

And yes, I have an FX 5950 Ultra... 3DMark03 score of 6040... suck on it!

Hm, $200 more in my pocket, or 200 more 3dmark03 points? Let me think on that ...

c-cooper said:

In that case, we'd better come prepared, so: nVidia ownz!

And yes, I have an FX 5950 Ultra... 3DMark03 score of 6040... suck on it!


I need to get a 5700 Ultra or higher. My 5700 only scores around 3000. :-/

Arioch said:

Hm, $200 more in my pocket, or 200 more 3dmark03 points? Let me think on that ...


200 pts more? What CPU etc. do you have then, since 3DMark also runs a CPU test? And do you have a 9800XT? I doubt you can get that score with a 9800 Pro, but feel free to prove me wrong ;)


The funny thing is that this will affect people's decisions, just like people ran out and bought 9800s based on HL2.

c-cooper said:

200 pts more? What CPU etc. do you have then, since 3DMark also runs a CPU test? And do you have a 9800XT? I doubt you can get that score with a 9800 Pro, but feel free to prove me wrong ;)

Athlon XP "Barton" 3200+, 1 gig of dual channel PC3200 (2x512), 9800 Pro.


I had two GeForce 4 Ti 4200 cards die on me, so I decided that my next card would be an ATI, even though I'm not very happy with ATI's drivers.

AgentSpork said:

I had two GeForce 4 Ti 4200 cards die on me so I decided that the next card I would get would be an ATI, even though I'm not very happy with ATI's drivers.


That sounds a lot like me. I had three GeForce 4 Ti 4600s die on me, so I decided to buy a Radeon 9700 Pro... fast card... the drivers are pretty bad, though.


Wooters, my dad picked this card for my new computer. Lucky me.

Planky said:

Isn't the AMD Barton the Celeron of Athlon XPs?

Sure, if you could make a Celeron out of a Pentium by doubling the L2 cache.

Although the Barton, much like the Celeron 300A, is trivially overclockable.


Heh. I wonder where I got the impression that Bartons were just stripped-down Athlons, then.


That would be the Duron, which has half the L2 cache. But they haven't made any Durons based on the Palomino, Thoroughbred, or Barton cores, only the Thunderbird.


I hate all this 3D card warz crappola. I HATE seeing a five-second nVidia logo before I start every game that nVidia has 'bought out'. Now I know id has beef with ATI for whatever reason(s), but you don't see ATI splash screens, now do you? And 'The Way It's Meant to Be Played'?? Fuck that. I can't believe anyone with a solid head on their shoulders would go out and buy an ATI for HL2 or an nVidia for Doom 3. Hell! Buy them both and swap them in and out whenever you want to play either game! Problem solved.

Me, I would get the 9800XT based solely on personal preference, and not 'omfgz0rz! D00m3 wi11 rUn 0n1y on nV1d14!!11eleven'. Substitute ATI and HL2 in there if you choose.

Get a card, one you like. After the games ship there will be handfuls of third-party tweaks for both games on both cards. So it's silly to buy a card because <this game> has a shiny little sticker and a five-second splash screen advertising its superiority. If you do, you're a corporate sheep.


Szymanski said:
The funny thing is this will affect peoples decisions, just like people ran out and bought 9800's based on HL2.

Game releases are invested in and financed for exactly that reason, so it's not so strange.


Technically, that's not true. AMD Duron parts at 1 GHz and higher are based on the "Morgan" core, which is essentially a Palomino with the standard Duron's stripped-down cache. AMD Duron parts below 1 GHz use the "Spitfire" core, which is essentially the same thing but based on the Thunderbird, as Bloodshedder said.

Regarding the news, I'm not really surprised. GeForce FXes ARE more capable DirectX 9 parts. In theory. Even so, I'd rather have the higher performance in DX6, 7, and 8 games, since that's what virtually everything out right now is anyway. Tim Sweeney recently said he doesn't expect to see "real" DX9-level games until late 2005. I wouldn't be surprised at all if he's right.

My dad's Radeon 9500 Pro runs Painkiller (a primarily DX7 game) at 1024x768x32 with 4x MSAA and "16X" aniso just beautifully. My own Radeon 9800 Pro (thanks, Arioch :o ) is naturally even faster. And I'm pretty sure both will serve us well in Doom 3, just as they do in Far Cry.

The bottom line is this: it really doesn't matter that much what you buy anymore. Just make sure you get a good product for your money, and that's most easily accomplished by shopping around. 'Nuff said on the topic.
