pritch

Do you SLI / Crossfire? Is it dead??


I just surprised myself by picking up a second R9 270 on ebay (I already have one that was carried over into my new build earlier this year).

I was initially just interested in seeing what it would sell for (hardware prices are a bit odd post-Brexit) but as it was kinda cheap, I bid on it and picked it up.

I've never given Crossfire a try and always advised others against it. But part of me just wants to tick the box for my own experience, despite tales of woe from pretty much all of the internet. My PSU and case can handle two of these, but probably not two of anything beefier. I don't really want to buy another single card until the newer gen AMD stuff comes out and settles in, so I figured this might give me a little boost to tide me over until then.

Part of me wonders if I won't just be selling one of them on again in a month's time anyway - that's how dubious recent reports about this tech have been. Tiny market share and a lack of support in many titles (including Doom 4, still?!).

Does anyone else SLI or Crossfire, perhaps with more powerful cards? What have been your best and worst experiences with games?


Other than some games not taking advantage of it, there aren't really any issues I've had myself with Crossfire.
Something to keep in mind is that Crossfire is, as far as I know, also used on single cards with multiple GPUs on board.


The problem with Scan-Line Interleave is still the same as it has been since the Voodoo days - every so often one card goes out of sync with the other. The fix, or so they claim, is special monitors with proprietary synchronization circuitry. Woo. (I still want a G-Sync monitor though.)

I've also heard the craziest stuff with programs just acting... odd. For example, this happened on my mate's computer: things as simple as MS Paint would mess up, The Witcher 3 had polygons flickering in and out of view, and Battlefield would either crash or get slower and slower until the frame rates were unplayable. He ripped the other card out - no problems. Star Wars: Battlefront ran OK, but I don't think that even uses SLI.

Even better, Doom 3 (original version) refuses to run at all, and the BFG Edition stutters. The GPL source port runs just fine, though.

EDIT: Not that it's BAD, but just one big card is, amazingly, enough right now. Throw in a 1080 and call it a day, I guess?


Two Fury Xs here with a 4.6GHz FX-8350 and a 144Hz 1440p FreeSync monitor. If a game has support for multiple CPU cores and a good Crossfire profile, then it's silky-smooth, high-refresh-rate gameplay. I used to have to fight drivers and random crashing with my old 6950s, but the Fury Xs have given no issues beyond unfinished games being released.

Multi-GPU setups are still useful for high resolution/refresh rate monitors.

Csonicgo said:

The problem with Scan-Line Interleave is still the same as it has been since the Voodoo days - every so often one card goes out of sync with the other. The fix, or so they claim, is special monitors with proprietary synchronization circuitry. Woo. (I still want a G-Sync monitor though.)


AMD have done a lot of work since the HD 7000 series on their frame-time variance, though it still usually takes a few driver and patch releases to sort out issues in new games. I rarely buy at full price anyway. FreeSync/G-Sync is bloody awesome.


Well... it's in.

The BIOS confirmed both cards are recognised and running at x8 each. There was a bit of screen flicker when Windows loaded, but it seemed to settle, and Radeon Settings sees both cards as linked.

HWMonitor doesn't see the second one though - not sure if that's a limitation or not.

Now, off I go in the hope of playing...


I picked up a 4GB GTX 770 for £110 on ebay. It's massively too big for SLI at 300mm long with three fans. I had considered buying two smaller GPUs to use in SLI (the combined TDP of two smaller cards would not be much worse than the lone 770), but I've read about loads of problems with games not working properly or not supporting it at all, and SLI/Crossfire being effectively stillborn.

Another thing that bothers me is the slots on my motherboard (a Maximus VIII Ranger). There appear to be two GPU slots: one is PCIe x16, the other PCIe x8. I'm under the impression that the x16 slot is "quicker".

If I fitted two identical GPUs in these slots, would this have an impact on performance or compatibility?

Excuse the noobishness of these questions. EDIT: sorry if this is thread-jacking!


I think you're probably better off with the one card tbh. It was only worth me trying it because a) I already had one relatively modest two-year-old card and adding the second one was cheap, b) the combined TDP of two such cards is less than or equal to that of a more expensive single card, and c) as explained, I would rather wait to buy a single newer-gen card.

The two cards should run at x8 each, which to all intents and purposes is equal to running one at x16 - the board can't give them more than 16 lanes combined anyway. From what I can understand, there is a negligible performance hit in running two cards in x8 mode versus one in x16 mode.
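For what it's worth, the raw numbers bear this out. Here's a rough back-of-the-envelope sketch in Python (my assumption: both slots running at PCIe 3.0, i.e. 8 GT/s per lane with 128b/130b encoding - a PCIe 2.0 slot would roughly halve these figures). Even x8 leaves each card several GB/s, which a mid-range card rarely saturates:

# Back-of-the-envelope usable PCIe bandwidth per direction.
# Assumes PCIe 3.0: 8 GT/s per lane, 128b/130b line encoding.

RAW_TRANSFERS_PER_SEC = 8e9   # 8 GT/s, one bit per transfer per lane
ENCODING_EFFICIENCY = 128 / 130
BITS_PER_BYTE = 8

def lane_gb_per_s():
    # Usable bytes per second for a single PCIe 3.0 lane, in GB/s.
    return RAW_TRANSFERS_PER_SEC * ENCODING_EFFICIENCY / BITS_PER_BYTE / 1e9

for lanes in (8, 16):
    print(f"x{lanes}: ~{lanes * lane_gb_per_s():.1f} GB/s per direction")

# Prints roughly:
# x8: ~7.9 GB/s per direction
# x16: ~15.8 GB/s per direction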

Performance issues are much more likely to be caused by poor Crossfire support in individual games (or a lack of it altogether).

Another practical issue is heat dissipation. In the testing I've done so far, on a game that doesn't fully push the card(s), I've recorded a max temp of 65°C in single-card config, 67°C with the second card installed but deactivated (this can fortunately be set per game in Radeon Settings profiles, in case of issues), and 73°C and 67°C for cards 1 and 2 respectively when running in Crossfire. That doesn't sound like a huge increase, but it's getting that much closer to the thermal limit for this card. It certainly limits overclocking, though the benefits of Crossfire at stock speeds should easily outweigh the marginal overclocking headroom of a single card in real-world scenarios anyway.

Another issue is noise - the extra few degrees of heat mean the fans on card 1 run close to 100%, versus 50-60% when the other card isn't active. It's still not loud by any means and the fans on these Asus cards seem good, but in quieter periods of gameplay it can definitely be noticed. I don't see how card 1 can avoid this kind of penalty in any regular air-cooled case - instead of a pocket of relatively cool air above the PSU, there's the second card just a few centimetres below it, acting like a radiator, with the hot air rising.

One of the big issues with SLI/Crossfire is the fundamentally outdated physical layout of ATX. PCI was never intended for this sort of thing, and with AGP and PCIe it's basically just been an ongoing workaround through card design. My board in theory supports 3-way Crossfire, but that is pretty much ridiculous.

In an ideal world the cards would go in vertically, with the slot interface at right angles to the board, and then you could look at less compromised air-cooling solutions - e.g. a 120mm fan underneath, with heatsinks running all the way to vents at the top of the case past the CPU, maybe...


In SLI, is it true that the graphics cards simply share the burden rather than combine their strength?

It would seem that there's potentially more to go wrong and this might outweigh any benefit.

Wouldn't the casual gamer wanting to get into SLI be better off with two 1050 Ti cards or something cheap and effective?

MajorRawne said:

In SLI, is it true that the graphics cards simply share the burden rather than combine their strength?

It would seem that there's potentially more to go wrong and this might outweigh any benefit.


Kind of. Typically, DirectX 11 and older titles use alternate-frame rendering: the cards take turns rendering whole frames, and each has to keep its own copy of everything in its own memory (using two Fury Xs, which have 4GB of HBM each, doesn't mean I now have 8GB of memory in these titles, for instance). Certain new/upcoming titles under the new Vulkan and DirectX 12 APIs, like Ashes of the Singularity, have modern multi-GPU techniques that can let Radeons work with GeForces (or your integrated GPU) and share memory pools. Lots of work for developers, though. (There's a toy sketch of the alternate-frame idea at the end of this post.)

Wouldn't the casual gamer wanting to get into SLI be better off with two 1050 Ti cards or something cheap and effective?


SLI or Crossfire is never recommended for a casual gamer. Besides, I think SLI is only on the 1070 and up this generation for the Nvidia camp.
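If it helps picture the alternate-frame idea from above, here's a toy Python sketch (purely illustrative - the ToyGPU class and its methods are made up for this post, not any real graphics API). Frames are handed out round-robin, and each card keeps its own full copy of the assets, which is why the VRAM doesn't add up under AFR:

# Toy illustration of alternate-frame rendering (AFR) scheduling and
# memory duplication. Not a real graphics API - just the idea.

class ToyGPU:
    def __init__(self, name, vram_gb):
        self.name = name
        self.vram_gb = vram_gb
        self.assets = set()

    def upload(self, asset):
        # Under AFR every GPU needs its OWN copy of each asset,
        # so usable VRAM is NOT the sum of both cards.
        self.assets.add(asset)

    def render(self, frame):
        print(f"{self.name} renders frame {frame}")

gpus = [ToyGPU("GPU0", vram_gb=4), ToyGPU("GPU1", vram_gb=4)]

# Every texture/mesh gets duplicated onto both cards.
for asset in ("level_textures", "character_meshes"):
    for gpu in gpus:
        gpu.upload(asset)

# Whole frames alternate between the cards: even frames to GPU0, odd to GPU1.
for frame in range(4):
    gpus[frame % len(gpus)].render(frame)

# Both cards hold identical asset sets, so effective VRAM stays ~4 GB, not 8.
print("Assets duplicated on both cards:",
      all(gpu.assets == gpus[0].assets for gpu in gpus))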

