Creaphis

If I ever needed tech help before, I need it now!


I figure it's finally time to get a real graphics card. (If I'm lucky, my family will pay - 'tis the season.)

Here's my computer:
http://support.gateway.com/s/PC/R/1008842/1008842sp3.shtml
Yes I know it's shit. It works, though, so don't tell me to get a new one, because I won't. My plan here is to just get something to fill its PCIe 16x 1.0 slot.

What I want:

-A card that will be good enough for my uses for a long time. (I was just reading on Wikipedia that the PCIe 2.0 spec is backwards-compatible with the 1.0 so I expect to be using this card even in the next motherboard that I own.) Keep in mind I'm not hardcore - an integrated graphics chip equivalent to a GeForce 6100 or 6200 has been good enough for me for quite a while, so any upgrade is likely to be good enough for a very long time as well.

-More specifically, the card should handle GZDoom's future renderer and the Orange Box very well (I've been sitting on Half-life 2, unable to play it, for too long.)

-Affordable. Even if I trick my parents into paying for it I still want to go easy on them.

Questions:

-NVIDIA versus ATI (Strengths? Weaknesses? Compatibility issues? Driver bugs?)

-I have absolutely no idea what all of those numbers mean in a video card's spec. What am I looking for?

-Are there other features that some cards have that I've never heard of, that I could never even imagine but that I'll absolutely need once I learn about them?

-Oh yeah, I'm using XP, so that might be relevant to the "driver issues" question above.

-Power: I have a 300W power supply. I want to ask "How much is enough?" but of course this depends on the video card and on all of my current hardware, so, another question: how can I find out how much power my computer is currently using, so I know how much there is to spare? I am willing to buy a better power supply if necessary.

-Ventilation: Again, I want to ask "How much is enough?" There's nowhere on the outside of this case that another fan could go so I just hope that a couple fans on the back is enough. If I had to get another case to upgrade my graphics card then I'd probably just postpone this upgrade again.

Feel free to just tell me what to buy instead of spending much time answering my questions. I'll blindly trust you.

Thanks.


If you get something TOO new, your computer will be the bottleneck and you won't see any benefit. You'll be limited to cards that are at least two years old. A 7000-series GeForce is your best bet, considering you run XP and DirectX 9 is as good as you're going to get.

So the final issue is whether you have a 6-pin power connector for PCI-E cards. Check your PSU to see if one is available. (If not, you can try to free up two 4-pin Molex connectors, the kind used for drives, and use an adapter. The card may even come with one.) If you don't have any free, you won't be able to support anything much more powerful than a 7600GT.

A 7600GT will probably run Doom 3 on max graphics settings with minimal slowdown. You may want to get an extra gigabyte of RAM, though.


I dispute this claim that you need to be running the newest DirectX to benefit from the newest cards. Absolute crap! My bro gave me a 4850 because he owed me a pile of cash, and games run faster. Faster is good. :)

Bucket said:

If you get something TOO new, your computer will be the bottleneck and you won't see any benefit. You'll be limited to cards that are at least two years old. A 7000-series GeForce is your best bet, considering you run XP and DirectX 9 is as good as you're going to get.


Again with this "computer bottleneck" bullshit... I'm sick of reading it, because it tricks people into believing that their computer will somehow magically slow down just because they plugged in a card that's "too powerful". I've seen 3x differences in 3DMark03 from plugging in a 7600GT in place of an ATI Radeon 9600XT, so there's quite a lot of leeway, and there would probably be even more if I plugged in an ATI 4650 with 120 unified shaders.

Get it in your thick head people:

  1. Faster GFX card equals faster rendering for a given scene, all the rest being the same.
  2. Faster rendering for a given scene means higher FPS regardless of CPU, or until you saturate the GFX bus (don't expect a 4x or 8x AGP to match a PCIe 2.0 card in uncapped performance).
  3. ????
  4. PROFIT!!!
The only case where a card alone wouldn't help much is a game that's particularly CPU-intensive. Don't expect to be able to play a game designed for dual-core CPUs on a Tualatin-core Celeron (it will have very uneven performance). However, most modern games are actually physics-intensive rather than purely logic/CPU-intensive. And guess what: physics can actually be accelerated in hardware by CUDA-enabled nVidia cards, so there has been no better time to pop in an 8-series or better nVidia card. In that respect, it will be much more beneficial than an equivalent ATI card.
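The bottleneck argument above can be reduced to a toy model (purely illustrative, with made-up per-stage frame rates, not benchmark data): the effective frame rate is capped by whichever stage of the pipeline is slowest, be it CPU, GPU, or bus.

```python
# Toy bottleneck model: effective frame rate is capped by the slowest
# stage in the pipeline. All numbers below are invented for illustration.
def effective_fps(cpu_fps, gpu_fps, bus_fps=float("inf")):
    """Each argument is a hypothetical per-stage frame-rate ceiling."""
    return min(cpu_fps, gpu_fps, bus_fps)

# A faster card keeps helping...
print(effective_fps(cpu_fps=90, gpu_fps=30))   # 30 (GPU-bound)
print(effective_fps(cpu_fps=90, gpu_fps=60))   # 60 (still GPU-bound)
# ...until another stage becomes the limit.
print(effective_fps(cpu_fps=90, gpu_fps=200))  # 90 (CPU-bound)
print(effective_fps(90, 200, bus_fps=45))      # 45 (bus-saturated, e.g. old AGP)
```

In this model a stronger GPU never hurts; it simply stops helping once the CPU (or the bus) becomes the limit, which is really the point both sides of this argument are circling.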


A 7600GT will probably run Doom 3 on max graphics settings with minimal slowdown. You may want to get an extra gigabyte of RAM, though.


Yeah, a 7600 used to have a good performance/price ratio (and still does), but an 8600 or better will also offer PhysX acceleration and actually put the PCIe bus to good use, even with DirectX 9.0 games. That will help the old CPU immensely. Just make sure it's an 8600 or better: the older series and lesser models don't have PhysX acceleration, so you'd have "just" a fast GFX core, while many modern games need either a dual-core CPU or PhysX acceleration.

Khorus said:

9800gt is what I'd get if I wanted performance for little cost. It's about $100 AU here.


LULWUT?!!! That gets smoked in performance by an nVidia 7600 GT (at least twice as powerful; we're talking unified shaders vs. old-style architecture here), and besides, it's AGP 8x only (I don't think you meant an nVidia 9800 here, did you?). In turn, an ATI X1650 can barely hold its own against a 7600 GT, and neither has CUDA PhysX acceleration (never mind that hunting for older GPUs usually means paying way too much relative to their actual capabilities, unless you really need a vintage piece of hardware).

Get an nVidia 8600 or better (and in any case, x600 and above) as Jodwin said, and breathe new life into your Athlon 64 with hardware PhysX acceleration and GPGPU power, which no ATI card can offer as of now. Great performance:price ratio, great versatility, and future proofing too. What more could you ask for?

caco_killer said:

And 9800GT runs circles around the 7600 GT and 8600 GT.

And works wonders on the 300W PSU. ;)

Maes said:

LULWUT?!!! That gets smoked in performance by an nVidia 7600 GT (at least twice as powerful; we're talking unified shaders vs. old-style architecture here), and besides, it's AGP 8x only (I don't think you meant an nVidia 9800 here, did you?). In turn, an ATI X1650 can barely hold its own against a 7600 GT, and neither has CUDA PhysX acceleration (never mind that hunting for older GPUs usually means paying way too much relative to their actual capabilities, unless you really need a vintage piece of hardware).


What are you on about? There isn't a Radeon 9800GT (there was an XT), and the average end user doesn't care about unified shaders or other such terms, just the end result. And those cards you're talking about are much too old to be good value for money.

Not to mention:

You cannot future-proof. That's just not possible. And the 9800GT runs circles around the 7600 GT and 8600 GT.


I'm fairly sure there was a Radeon 9800 GTO (at least there was an x800 GTO), but yeah, if you meant the newest nVidia 9800 series, then yeah ofc it's better than 7600 and 8600 (and has CUDA ;-)

I assumed that you (Khorus) jumped on the "OH NOES!!! OLD PC!!! DON'T PUT T3H P0W3R GFX INSIDE, IT SI T3H W4STE UR NOT 1337!!11!!" bandwagon and actually suggested a Radeon 9800, of which I thought there was a GT/GTO version.

About unified shaders... well, it's practically impossible not to find them on almost anything designed after 2006. The old separate vertex/pixel pipeline design is only found on nVidia cards before the 8-series and ATIs before the HD 2000 series. Still, ATIs lose a lot of value for money by not yet offering PhysX acceleration on top of the unified shader architecture.


For what Creaphis wants, I would recommend an nVidia 9600GT, which really isn't too expensive these days and in my experience can run most games on the best settings available with a half-decent processor (his Athlon should just get away with it). It should definitely be good for Half-Life 2/Ep1/Ep2/Portal.


I know nothing about graphics cards other than to avoid ATI cards like the fucking plague. They don't get updates frequently enough, the latest firmware is buggy as hell and Graf Zahl will probably strangle you in your sleep if you get one.

ALSO: I'd recommend a GeForce 8xxx, my only justification for this is that this is the minimum requirement for GZDoom's new renderer.

Patrick said:

Graf Zahl will probably strangle you in your sleep if you get one.


Holy shit, and I got four of them (ATI Rage PRO II, 9600XT, and two x1650). Should I sleep with a metal gorget and metal underwear for that kind of blasphemy?

Patrick said:

I know nothing about graphics cards other than to avoid ATI cards like the fucking plague. They don't get updates frequently enough, the latest firmware is buggy as hell and Graf Zahl will probably strangle you in your sleep if you get one.


I've never had a problem with ATI. 9800 Pro is the best card I ever owned, and I'm currently running an HD 5850 which is also excellent.

EDIT: With that said my last video card was a Geforce 7950 GT, which was really solid. Oh but you're right about the drivers.

Jodwin said:

All I can say is, stick to NVidia. And avoid the value models (x400 or less).


I have an ATI Radeon HD 3200 and it has never let me down yet. It's not as good as a GeForce 8300, but it runs Doom 3, Team Fortress, Left 4 Dead and such flawlessly.


I would get an ATI 4670. There probably isn't much point getting anything faster; your CPU is already the bottleneck. They cost less than AU$100 here in Australia. The price-equivalent nVidia card is the 9600GT, and the ATI will outperform it:
http://www.anandtech.com/video/showdoc.aspx?i=3405&p=7

nVidia's low-end and mid-range products are simply inferior and overpriced compared to ATI's offerings. This is fact. Personally, I'm dirty on nV for their shitty insulting tech support, lack of support for software freedom, and their blatant ripping off of consumers by renaming their model lines and remarketing them as something new.

Yes, ATI have had crap drivers in the past, but this has all been resolved these days. It's kinda like saying Doom is a crap game because there were bugs in version 1.1, despite the fact these have been patched out and modern ports continue to improve the experience.


But ATIs don't have CUDA/PhysX acceleration, which would be exactly what an old single core CPU would need. It's pointless getting e.g. a 4650 with 120 unified shaders if they can't also be used for GPGPU/PhysX, like the nVidias can.


Alright, because Maes clearly has the biggest penis here, I'm trusting him that "newer" equals "better." Also, because of the features that have been mentioned, and because I want to narrow things down, I've decided to stick to nVidia cards, x600 models or better. Now, I'm hoping to avoid having to replace my PSU, so I'm avoiding the most power-hungry cards, of course. Both the 9600 GT and 9800 GT use around 100 W and sell around the same price point, so both make strong candidates. I learned that there's actually a low-power edition of the 9600 GT out there, but they're hard to find, and when I do find them they're usually made by poorly-reviewed manufacturers, so I guess I'll just have to trust that my PC isn't already using more than 200 W (and I'd be surprised if it was).

So, anyways, that final decision can wait, as one more thing needs to be considered... heh, I forgot that my current LCD monitor actually needs analog input. So, a new question:

-I'm tempted to just buy a cheap DVI to VGA adapter as a stop-gap solution, but I know that this, of course, is a lossy conversion. The question: how much of the gain of upgrading a video card would I lose again by doing this to its output?

Though I guess what I see on my monitor right now is getting converted from digital to analog and back to digital, and I've never had any complaints about the resulting image, so I'm betting that converting the video card's output wouldn't make much difference to me.

All the same, maybe I should use this as motivation to upgrade my monitor as well - it's a 17", and I wouldn't mind it being widescreen or something. I give you all permission to tell me about your favourite monitors.

EDIT: Oh hey, people are posting!

Super Jamie said:

PhysX I do not care for, and nor does Creaphis with his requirement to play GZDoom and Half-Life 2 nicely.


I mentioned those games simply as a starting point. If I have a computer that's capable of playing newer games with CUDA and PhysX and all of nVidia's other buzzword technologies, then I probably will.

Also, I appreciate the input, but it seems like the debate here is between people who say "nVidia is better than ATI" and people who say "No, ATI is just as good." Even if ATI cards are as good as nVidia's, what makes them better? I'd rather just keep my focus narrowed on the GeForce line until someone can conclusively show that ATI cards have some significant advantage.


You're making decisions based on penises? What are you, Doom Marine? :P I wouldn't go nVidia, especially on low-end cards like these; you're just cheating yourself out of higher framerates.

I've never noticed any issue using a DVI to VGA converter. It's certainly no worse than just using VGA.

I have a Samsung SyncMaster 24" LCD. It's one of the more expensive ones of the line, and is definitely the best monitor I've ever used. It's capable of being very bright and the contrast is great. The native res is 1920x1200 and it can scale all the way down to 320x200 DOS resolutions without appearing blurred at all.

Each brand tends to have good and bad models. Once you've decided on your price range and size, definitely do some research of reviews outside of here before purchasing.

Creaphis said:

Also, I appreciate the input, but it seems like the debate here is between people who say "nVidia is better than ATI" and people who say "No, ATI is just as good." Even if ATI cards are as good as nVidia's, what makes them better? I'd rather just keep my focus narrowed on the GeForce line until someone can conclusively show that ATI cards have some significant advantage.

Read the link I posted. The ATI card reviewed consistently beats the nVidia cards reviewed in several benchmarks. Almost all nVidia-vs-ATI graphs at the same price point will read the same, some with ATI significantly in front. Depending on which way you look at it, ATI are making cards with the same performance cheaper than nVidia are, or they are making cards at the same price point which are capable of superior performance. That certainly seems "better" to me! ATI are also a better choice ideologically, which matters to me.

CUDA/OpenCL is a debate you'll have to apply to your ideal functionality. Do you want to encode videos in Adobe Premiere with CUDA? You're going to need to buy a $1500 Quadro card and the paid Adobe plugins, which retail for $250-$500. If you want to encode DVDs into AVIs, different encoders support different standards, so look at the specific software you're using to see if it's CUDA/AVT/OpenCL enabled. Given that CUDA is proprietary, closed and requires licensing, while OpenCL is open and unencumbered, I would expect OpenCL to take over in future. Case in point: Apple have built OpenCL into OS X 10.6 Snow Leopard. Either way, GPGPU technology is still in its very early stages; I'm estimating your use of either standard to be minimal to none.

If you have a PhysX-enabled game you specifically want to play, then nVidia appears to be the obvious choice. However, given the slow CPU and aging hard drive you're using, you're probably going to have to turn down some settings to make brand-new games playable anyway, so the advantages of PhysX may become minimal against this. I'm not sure; I don't really have any experience with PhysX. There was also talk of hacking PhysX support back into ATI cards, though I don't know where this is at right now.

Also, use the Power Supply Calculator to determine whether you actually do need a new PSU or not. I suspect you might.

Edit: I calculated 252W with ATI 4670, 267W with 9600GT. That's assuming 2 sticks DDR, 1 SATA hard drive, 1 DVDRW. If you have more than one internal hard drive you'll need a new PSU.
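For the curious, the arithmetic behind such calculators is roughly a sum of per-component draws plus some headroom. The figures below are assumed ballpark values for this class of machine, not measurements of the actual Gateway.

```python
# Rough power-budget sketch. Every wattage figure here is an assumed
# ballpark for illustration; use a real PSU calculator for actual numbers.
TYPICAL_DRAW_W = {
    "Athlon 64 CPU": 89,          # assumed TDP-class figure
    "motherboard + 2x DDR": 45,
    "SATA hard drive": 10,
    "DVD-RW": 15,
    "fans + USB devices": 15,
}

def system_draw(gpu_watts, components=TYPICAL_DRAW_W, headroom=0.2):
    """Estimated total draw with a safety margin (headroom is a guess)."""
    base = sum(components.values()) + gpu_watts
    return base * (1 + headroom)

print(round(system_draw(gpu_watts=60)))  # ~281 W with a ~60 W low-power card
```

The headroom factor is there because components draw more at peak than at idle, and because a PSU running near its rating stresses itself (as discussed below in the thread, rails and quality matter too, so treat any single number with suspicion).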


Hm.

I guess I could respond by saying that having the highest possible framerate for my money isn't of the utmost importance to me, but I've already used this PhysX technology in my arguments, so that would appear a little bit schizophrenic, wouldn't it? So, let me say some other things instead.

-"Better" is determined by a number of things, including driver stability, and the developers in this community tend to run into more trouble with ATI cards. To feel comfortable getting one of those I would need some sort of proof that there's no justification for their bad reputation.

-CUDA/OpenCL isn't a debate I'm going to be personally invested in. At the moment I can't imagine that I'll do any video encoding.

-In general, my shopping philosophy is that (1) there is always a better item out there and (2) there is always a better price out there for any given item. In other words, (3) it's impossible to buy the "right thing." Therefore, once a fair amount of effort has been spent on looking for the right thing to buy, (4) one should buy the best item of which he is currently aware, instead of further pursuing unattainable shopping perfection. While some ATI card or other may be better than its GeForce equivalent, I've already spent enough time looking. I found the 9600 GT. I'm satisfied.


I ran through that power supply calculator and got similar results. Yes, I just have one hard drive. So yeah, I'll postpone replacing the PSU. Out of curiosity, though, what can happen if your computer attempts to use more power than the PSU can supply?


Fair enough. As long as you have made the best and most future-proof choice for your purposes based on all features (rather than buzzwords for technology you'll never use anyway) then you've made the right choice!

Driver wise, GZDoom under Linux has previously had issues with ATI, I'm not sure if this is fixed yet. AvP won't run for me under Windows but it will in Wine (go figure?). I've also had other Linux-based issues with nVidia which their tech support were complete knobs about, but that's another story.


If you try to draw way too much power by plugging in over 9000 hard drives and hitting the on switch, you'll just outright fry the unit, some component will pop and you're up for a new PSU.

Depending on the quality of the power supply, some can perform at or beyond their rating. Antec are a good example of this: their NeoPower, TruePower and EarthWatts ranges only start to get flaky when you load them up 100W past their rating. This overengineering helps components remain cooler and last longer, but they do cost a bit. Conversely, some cheap generic power supplies start to die well below their sticker rating. You get what you pay for.

Running your PSU at or close to its limit for extended amounts of time will cause the internal components to work harder and heat up more. Power will fluctuate, which may manifest itself as software errors (Windows bluescreens, data corruption, etc.), or you might not see anything. Eventually some component will get stressed enough that it can't hold on any longer; it'll pop and you're up for a new PSU.

Wattage is not the only important spec for a power supply, though it's a good overall indicator. You need to look at the amperage per power rail and what each of your components will be drawing from where it's plugged in, especially if running close to the limit.
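The per-rail point comes down to simple arithmetic: usable power on a rail is volts times amps, so two PSUs with the same sticker wattage can have very different 12V capacity. The amp ratings below are invented for illustration.

```python
# Watts on a rail = volts * amps. The amp ratings below are made up to
# show why two "300 W" units can behave very differently under a modern GPU.
def rail_watts(volts, amps):
    return volts * amps

old_psu_12v = rail_watts(12, 14)  # older unit, power skewed to 3.3V/5V: 168 W
new_psu_12v = rail_watts(12, 25)  # modern unit: 300 W available on 12 V

# A CPU + GPU pulling ~200 W from the 12 V rail overloads the old unit
# even though its total sticker rating may also read "300 W".
print(old_psu_12v < 200 <= new_psu_12v)  # True
```

This is why the later advice in the thread to check which brand of PSU is actually in the case, and how its power is distributed, matters more than the headline wattage.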


You really do need to find out who makes your PSU though.

The problem with that PSU calculator is that it doesn't specify how much power you will need on each rail. 267W of usage with 240W of that coming off the 12V rail is a whole different thing than 267W with only 160W coming off the 12V rail.

Looking at the Gateway website, your computer uses one of three PSUs: a Delta-made one (which is what you should hope for), a Bestec-made one (meh), or some no-brand generic piece of garbage (ugh).

Although the Delta one is a decent unit (and by far the best of the bunch), all of these are older PSUs with most of their power distributed among the 3.3V and 5V rails. Modern computers (and your future graphics card), on the other hand, take most of their power from the 12V rail. This is where you start to run into issues comparing the number the PSU calculator gives out against your PSU's total wattage output.

IMO, of the three possible PSUs you currently have in your system, the only one I would even try to run a 9600GT with (and even then, only the low-power version, i.e. the one without the 6-pin PCI-E connector) is the Delta. With the other two you are asking for trouble.


EDIT:If you care to hear my opinion, I'd personally buy this (it's only $10 more than their 9600GT), and pair it with this or this, depending on what your budget allows (you never did specify a budget).


Yeah, I was looking at my computer's specs and the funny thing is that even I have no way of knowing which PSU is actually in there without cracking it open... which I think I'm going to do. BRB.


What!? A tech help thread that started without me? I must be slow here bawhahahahaha.

Okay, back it up on the graphics card discussion, I'm not the last person who will say this: GO WITH ATI. Why? I'll tell you why:

1) Nvidia's cards are overpriced. The price premium you pay for an Nvidia card over ATI actually pays for the marketing that fools the masses into believing their product is superior (when it's clearly not).
2) Nvidia rebrands their technology (they sell old technology advertised as new; turd polishing, etc. etc. etc.).
3) Their business practices are simply atrocious; they have screwed over consumers six ways to Sunday. They have even faked their newest technology in front of millions of people. Shame on them!

4) This is the most important one, and Super Jamie can back me up on this: ATI currently has the better card lineup (superior performance for the money).

Driver issues? All I smell is a bunch of fear, uncertainty and doubt. I can testify myself that driver issues are non-existent as far as my 2x Radeon HD 4890 setup is concerned, and as far as the overclocker community is concerned, both the red and green teams have their share of driver issues!

PhysX, CUDA, and any other proprietary technology will go extinct sooner or later. They are gimmicks that Nvidia touts like it's the best thing since sliced bread, but honestly, games like Crysis, Dragon Rising, and Half-Life 2 all use Intel's Havok engine, which is not PhysX.

Consider this: PhysX shows no superiority over Havok whatsoever.

Games that use PhysX can be counted on your fingers, and the only significant one is Batman: AA, for which the implementation of PhysX has virtually NO effect on gameplay.

As for future games like StarCraft 2: Blizzard opted for Havok as their physics middleware of choice. Why? Because they know that choosing PhysX would alienate half of their consumers (ATI users can't run PhysX). This is why most games don't use PhysX: proprietary technology hurts consumers.

With current industry open standards like OpenCL and DirectX 11, proprietary technology like PhysX is backwards and unfair in the way it's implemented, and will likely be extinct in the future.

In light of superior ATI performance, and proprietary gimmicks like PhysX being irrelevant, I cannot see why anyone would go with a gut instinct for Nvidia. If you are using someone else's money, it's all the more important to research your product thoroughly!

If you already shopped for an Nvidia GPU, do yourself a favor as a consumer and return it.

Let me reassure you, ATI is 100% the way to go currently.


Another note: the best price point for a graphics card is generally around the $100 range. A ~$75 GPU takes a drastic hit in performance relative to the price saved over a $100 GPU.

