Holering

PC SLI/crossfire wasteful?

Do you find Crossfire and SLI wasteful?

9 members have voted

    • Crossfire and SLI are fine how they are: 4 votes
    • Crossfire and SLI are horribly wasteful, destructive, and need a serious update or an alternate rendering option: 5 votes




Most of you who run a PC have probably heard of Crossfire and SLI.

Crossfire and SLI allow multiple video cards to work simultaneously. Sounds great, because you can double performance with two identical GPUs. However, it is extremely wasteful with memory: both cards working together need to mirror each other's memory to process the exact same data. This effectively chops the total available memory in half with dual GPUs (or to a third with three, a quarter with four, and so on; more GPUs means less usable memory).
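Here's a rough back-of-the-envelope sketch of that mirroring point, using made-up 2 GB cards just to illustrate the complaint:

```python
# Toy numbers only: compare usable VRAM under today's mirroring behaviour
# versus the pooled behaviour this post is wishing for.

def usable_vram_mirrored(num_cards, vram_per_card_gb):
    """With SLI/Crossfire mirroring, every card holds the same copy of the data,
    so the usable pool never grows past one card's worth."""
    return vram_per_card_gb

def usable_vram_pooled(num_cards, vram_per_card_gb):
    """Hypothetical shared pool: all cards' memory added together."""
    return num_cards * vram_per_card_gb

for n in (1, 2, 3, 4):
    print(f"{n} x 2 GB cards: mirrored = {usable_vram_mirrored(n, 2)} GB usable, "
          f"pooled = {usable_vram_pooled(n, 2)} GB usable")
```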

For example: most motherboards now support at least dual-channel system memory. Obviously you get half the bandwidth with one stick of RAM, but add a second, identical stick and you end up with both double the amount of memory and double the bandwidth. Why can't SLI and Crossfire do the same? Multi-CPU server motherboards have no problem sharing the available RAM across all of their CPU cores.

Obviously SLI and Crossfire can't utilize dual GPUs as a single pool of memory and parallel processing units (shaders, ROPs, etc.). Why is that? Are the drivers too dumb? Is the firmware or BIOS on the cards too dumb?

If drivers can utilize a single card's parallel processing units, shaders, ROPs, etc. so well, why can't the same drivers utilize a second card the same way (simply as more shaders, ROPs, and parallel processors of the exact same type)?

Why do motherboards have so many PCIe x16 slots? Nothing uses them except GPUs, I think, so it seems like such a waste. Double the energy used, more than double the cost, double the heat, unstable performance, an abused bus, and only one card's worth of memory available. It's very destructive and wasteful, IMO.

At the very least, AMD and Nvidia should provide a beta option of some type to gather feedback on. Maybe use 32 MB of RAM on both cards to store shared rendering data, so the drivers can control how much RAM is effectively available without restricting users to only one card's worth of memory; at least if they can't write proper drivers that treat multiple GPU cores as a single one.

Kind of reminds me of the PS3 vs. the 360. They both seem powerful, but the 360 obviously seems smarter, with its shared pool of memory for both the GPU and CPU, and a GPU with a unified shader architecture before any DirectX 10 PC card had one.

Edit:
http://wccftech.com/geforce-radeon-gpus-utilizing-mantle-directx-12-level-api-combine-video-memory/


Both cards working together need to mirror each other's memory to process the exact same workload twice.

As far as I know, both SLI and Crossfire use a method called alternate frame rendering to get shit done. The GPUs work in tandem, drawing frames consecutively: GPU 1 draws frame 1 while GPU 2 buffers frame 2, GPU 2 draws frame 2 while GPU 1 buffers frame 3, and so on. I'm not sure of all the technical advantages it has over split-screen rendering, where all the GPUs work on the same frame in parallel, but it means that only one pool of VRAM needs to be accessed per frame.
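A minimal sketch of the AFR idea, heavily simplified (real drivers pipeline and buffer frames, but the round-robin hand-off is the gist):

```python
# Simplified illustration of alternate frame rendering (AFR): frames are handed
# out round-robin, so each GPU renders every Nth frame, but every GPU still needs
# its own full copy of the scene's textures and geometry (hence the mirrored VRAM).

NUM_GPUS = 2  # hypothetical dual-card setup

def assign_gpu(frame_index, num_gpus=NUM_GPUS):
    """Round-robin assignment: GPU 0 gets even frames, GPU 1 gets odd frames."""
    return frame_index % num_gpus

for frame in range(6):
    print(f"frame {frame} -> GPU {assign_gpu(frame)}")
```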

As for why anyone would want a 3-way or 4-way SLI or Crossfire system, welcome to the world of PC hardware enthusiasts with lots of disposable income. More is more.


You're right. Both cards do AFR or SFR. I meant to say the same data has to be copied to each card in SLI/Crossfire.


SLI was somewhat useful back when I had a dinky old dual-core Athlon and wanted to take advantage of PhysX features. Fact is, you'll never get twice the performance with twice the GPUs. Some amount of your money will just get thrown in the trash.


At best, it's a feature with rapidly diminishing returns. Two cards definitely don't mean double the performance, and, at least at first with SLI, you couldn't just use any pair of cards; they had to be of the same make and type.

Crossfire was somewhat more advanced in that it allowed not only alternate-frame but also partial same-frame rendering by two or more cards, and it was the first to allow using different types of cards (both Crossfire-enabled, of course).

But it's really like high-end audio, following the 80-20 rule: to get that last extra 20% of performance, you must pay 80% more, not to mention that it's a clunky solution putting extra strain on the motherboard, PSU, etc.


Dual setups using the same series of card will yield roughly a 70% performance gain. A tri setup will give you far less again, and quad will give you half of whatever that would be.
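Putting rough numbers on that (the 70% figure is the one quoted above; the third-card gain is an assumed placeholder, and the fourth-card gain is half of that, as described):

```python
# Toy scaling calculation to show the diminishing returns. These are ballpark
# forum numbers, not measured benchmarks.

gain_2nd_card = 0.70                 # "70% performance gain" from the post above
gain_3rd_card = 0.35                 # assumed: "far less" than the second card's gain
gain_4th_card = gain_3rd_card / 2    # "half of whatever that would be"

relative_perf = 1.0
for cards, gain in ((2, gain_2nd_card), (3, gain_3rd_card), (4, gain_4th_card)):
    relative_perf += gain
    efficiency = relative_perf / cards
    print(f"{cards} cards: ~{relative_perf:.2f}x one card ({efficiency:.0%} of ideal scaling)")
```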

Tri and quad setups are generally reserved for people who run surround/Eyefinity with every graphical setting maxed and have more money than they know what to do with. The other camp for tri and quad is extreme overclockers who masturbate to benchmark results and get into world rankings with their results. Generally they either get their stuff sponsored, or they push their best numbers, pawn their gear, and wait for next-gen to do it all again.

I currently run dual 670s. I bought them together used, and they included full-cover waterblocks which I could easily integrate into my existing liquid loop. I saved money on both cards and got the blocks for free. They surpassed the top-tier 7-series GTX 780 at 2/3 the price. Being liquid cooled, I got additional performance by increasing their core and memory clocks. I also run my radiator fans at 5 V, so silence to boot.

My scenario is more the exception than the rule. Now back to SLI & Crossfire sucking balls.


I think the general rule of thumb is that you should only use cf/sli if it's to obtain a level of performance that you can't get with a single card.

Just to give an example with current generation cards, if you have one of the top tier cards

GTX 780
GTX 780 Ti
R9 290
R9 290X
GTX 970
GTX 980

getting a second one is reasonable (if you actually need that kind of power) because no single card can match that level of performance.

That being said, if you have one of the second tier cards

GTX 770
R9 280
R9 285
R9 280X
GTX 960

getting a second one is a waste, because you'd be better off selling it and buying a single top-tier card.

Doom_user said:

I think the general rule of thumb is that you should only use cf/sli if it's to obtain a level of performance that you can't get with a single card.

That was an excellent summarized explanation, which I should've gone for instead of a long-winded explanation with personal anecdotes. :D


DirectX 12 will not require dual-GPU setups to mirror memory; instead, the memory is going to be combined. With that in mind, I'd say SLI is worth it; I can't say the same about Crossfire, thanks to AMD's horrible drivers. Generally, if you're using a single-monitor setup at 1080p or 1440p (assuming it's not a 120 Hz setup), you're better off choosing the best single GPU available. But if you're going for 120 Hz or a surround/Eyefinity setup, dual GPUs are the way to go. I'm not sure why the posters above chose to bash enthusiasts over tri- or quad-card setups, as gaming isn't the only thing graphics cards are meant for.


It's not really a question of "is it worth it". There are some games where you can't average over 30 fps at high resolutions (UHD, 3K, 4K) without SLI, and if you're aiming for vsync at 120 Hz or 144 Hz, then you need SLI for many midrange games too. G-Sync/FreeSync can help in a single-card setup, but you're usually going to average around 60-80 fps with dips down to 30 fps.
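For reference, the frame-time budgets behind those refresh-rate targets (simple arithmetic, nothing vendor-specific):

```python
# Frame-time budget per refresh rate: to hold vsync, every frame has to finish
# rendering inside this window, every single frame.

for hz in (30, 60, 120, 144):
    budget_ms = 1000.0 / hz
    print(f"{hz} Hz -> frame budget of {budget_ms:.2f} ms")
```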

Some people think you can game at 30 fps; I think for modern games that's stretching it. Doom is fine, and I think it actually even looks better at 35 fps, but it has its own look. If you're talking about realistic, cinematic worlds, you need a high framerate; otherwise you're really missing the point.

You're also future-proofing yourself in a big way. You'll be able to play today's games at top spec right now, and you'll be able to play tomorrow's games at today's top spec, and so on. If you stay with a single card, you'll have to upgrade much sooner. You're still losing money, but not that much, and your experience is a lot better.

The other thing that 2- to 4-way SLI/Crossfire will do is turn your computer into a crazed processing machine. OpenCL is wild and crazy stuff; if you don't care about wattage, basically nothing can compete with it.

Doom_user said:

I think the general rule of thumb is that you should only use cf/sli if it's to obtain a level of performance that you can't get with a single card.


True, but with rapidly diminishing returns.

If you really need to get about 40-50% more performance than what the current best single GFX card can give you, and you need it today and you can afford it, sure, SLI/CF is your best friend.

However, once the next gen of cards arrives, or simply when someone manufactures a card which is essentially two of your cards in one, you're just left with an expensive, bulky and energy inefficient setup.

Maes said:

True, but with rapidly diminishing returns.

If you really need to get about 40-50% more performance than what the current best single GFX card can give you, and you need it today and you can afford it, sure, SLI/CF is your best friend.

However, once the next gen of cards arrives, or simply when someone manufactures a card which is essentially two of your cards in one, you're just left with an expensive, bulky and energy inefficient setup.


You'll still have the same "bulky" setup unless you downsize your case & motherboard form factor :P. NVIDIA cards are growing increasingly energy efficient with every generation; compare their 4xx series to their current 9xx. GTX 480s were power-sucking space heaters: there were instances of cards literally exploding, quad SLI melting motherboards, and manufacturers adding an additional Molex connector to feed extra power to the PCIe lanes.

AMD, meanwhile, has gone the polar opposite direction with their current generation, which is comparable to NVIDIA's 4xx series for power consumption, noise, & heat output.

Additionally, if a single top-tier card needs extra oomph once a new generation rolls out, you can get a second card used at a decent price from people attempting to recoup some of their cost by moving to the latest gen, which in some instances can be cheaper than outright buying a single card of the latest generation.

TL;DR: You lose money regardless, it's hardware :D


Here's an excellent review video on quad SLI with NVIDIA's latest top tier cards. I think it's worth the thread bump to add to the discussion:

