dual GPU sets? 24 replies


Bs|Archaon

I would die without GF

50 XP

15th March 2006

0 Uploads

5,910 Posts

0 Threads

#21 11 years ago
xX_CENTURION_Xx;3993036: I was thinking about getting a couple of 7950GX2s, but I want DX10, and they don't have it.

Once again, get Quad SLI out of your head. Even a 320MB 8800GTS is considerably more powerful than a 7950GX2; not to mention that the 8800GTS is about $150-200 cheaper per card.




Guest

I didn't make it!

0 XP

 
#22 11 years ago

I know; I said "was," as in past tense. I know the 8800GTS is better.




>Omen<

Modern Warfare

50 XP

1st January 2005

0 Uploads

7,395 Posts

0 Threads

#23 11 years ago
Bs|Archaon;3993336: Once again, get Quad SLI out of your head. Even a 320MB 8800GTS is considerably more powerful than a 7950GX2; not to mention that the 8800GTS is about $150-200 cheaper per card.

Then again, if Nvidia makes an 8950GX2 as some rumors suggest, it will easily spank any 8800 SLI setup and still use just two slots. The problem is that these are decisions best made with knowledge of Nvidia's lineup for the coming six months or so. Until then it's all speculation.

As I've said before, I prefer a one-card solution and think the 9800GTX would play anything fine by itself for some time, even on a large display. With all the money Nvidia has invested in SLI, though, you gotta wonder if they'll wait until ungodly games are released that can stress a 9800GTX to its limits before they release such a beast. Otherwise their SLI sales will no doubt fall off.




Jeff Über Admin

I am a mean boss ⬆️⬆️⬇️⬇️⬅️➡️⬅️➡️??

184,643 XP

6th April 2000

0 Uploads

14,592 Posts

1,534 Threads

#24 11 years ago

>Omen<;3992022: I think you're splitting hairs on the definition of SLI/Quad SLI there. Ever since the advent of using two 7950GX2s it has been called quad SLI, because 4 GPUs are used simultaneously. There is no need for a slot to create the SLI if it happens at the card. If the game recognizes 4 GPUs employed, what difference does it make how the slot handles it, as long as it does so without technical problems?

My point above was why use 4 slots and all that space if two can suffice to use 4 GPUs? With 4 slots you still have just 4 GPUs recognized and used, same difference just more space and expense.

If quad SLI via two dual GPU cards weren't a practical means of utilizing 4 GPUs at once, there wouldn't have been so many manufacturers wanting to build systems around it when it first debuted.

Old news, but just to refresh memories on the subject (from Wikipedia):

"Quad SLI

In October 2005, Gigabyte Technology released the GA-8N SLI Quad Royal. Essentially it was a motherboard with four PCI-Express x16 slots. At the time of release however, NVIDIA stated that it would not be the direction it would take SLI.[3] In early 2006, NVIDIA revealed its plans for Quad SLI. When the 7900GX2 was originally demonstrated, it was with two such cards in an SLI configuration. This is possible because each GX2 has two extra SLI connectors, separate from the bridges used to link the two GPUs in one unit - one on each PCB, one per GPU, for a total of two links per GPU. When two GX2 graphics cards are installed in an SLI motherboard, these SLI connectors are bridged using two separate SLI bridges. (In such a configuration, if the four PCBs were labeled A, B, C, D from top to bottom, A and C would be linked by an SLI bridge, as would B and D.) This way, four GPUs can contribute to performance.* The newer 7950GX2 omits the external SLI connector on one of its PCBs, meaning that only one SLI bridge is required to run two 7950GX2s in SLI. Quad SLI has yet to show any massive improvements in gaming using the common resolutions of 1280x1024 and 1600x1200, but has shown improvements by enabling 32x anti-aliasing in SLI-AA mode, and support for 2560x1600 resolutions at much higher framerates than is possible with single or dual GPU systems with maximum settings in modern games. NVIDIA has recently released official Quad SLI drivers, marking the first time one can use Quad SLI with official support.[4] For more information, visit NVIDIA's Quad SLI website."

* Any way you slice it, that's SLI on ONE slot.;)

I'm not denying your statement. My point was that when it comes to the 8x00 series, I find it difficult to believe they would do it the same way they did for the 7x00 series; they would not use the 8800s for this process. Those 7950s were just two boards screwed together with an SLI bridge, sharing one PCI-Express slot. While this design could no doubt be implemented for the G80 chipset, my concern, especially from personal experience using these cards, is the sheer amount of heat produced. A single GPU runs hotter than a Core 2 Extreme (around 70C at load is normal for an 8800, while 35-40C is normal for a Core 2 Extreme). I can burn my finger on the back of my chassis after running a game for over an hour, simply from the heat transferred through the two screws holding the card in place. Implementing the 8800s the same way as the 7900s seems like it would run into safety and stability issues, with two GPUs effectively sharing one giant heatsink. The only way I could see it happening is with a liquid cooling system.

Also, the NVIDIA drivers have, for quite a while now, had no support for a G80 chipset running in quad SLI (this is confirmed by an NVIDIA rep on the nzone SLI forums).


Product Manager | GameFront.com




>Omen<

Modern Warfare

50 XP

1st January 2005

0 Uploads

7,395 Posts

0 Threads

#25 11 years ago

Ah, I see what you mean now. Well, I think first off it depends on whether the rumors are even true that they'll make an 8950GX2, and furthermore whether the design will be the same or similar. In that rumor there was talk of a G84 GPU being used, but now that Nvidia has confirmed an 8800GT with G92, maybe the 8950 rumors were false and/or scrapped by Nvidia.

I agree totally that the big weak spot of the 7950GX2 was heat buildup on the card, with its GPU and chipset trapped between the two boards. Still, this dual card/GPU idea is intriguing enough that I wonder if it could be implemented much better.

For instance, would it be possible to reverse one of the cards so the backsides of the circuit boards face each other, then place a cooler in that spot between them, with through-board heatsinks? Then put strategically placed holes in the boards and ram ducts on the component sides to draw air and radiated heat off the components via the ducts and sinks?

Perhaps a bit too elaborate to be cost-effective, but I still wonder if it's possible. It could probably also be done with stock coolers after reversing the one card; that would be much bulkier but would cool better. Just reversing the architecture of one card would probably be cost-prohibitive, though.