NVIDIA GTX 780 Review: GK110 on a Diet

GTX 780 Design

The GTX 780 uses a GK110, just like the GTX Titan, and it takes other cues from its older brother as well. The cooler design on the GTX 780 is very similar to what you'll find on the Titan: a copper vapor chamber and aluminum heatsink, coupled with a blower-style fan. The vapor chamber seems to be the new addition here, as the aluminum heatsink and the centrifugal blower fan have appeared on previous Nvidia cards.

The GTX 780 looks like the Titan, too, with the exposed aluminum on the top, the etched "GTX 780" near the bracket, and the light-up GeForce GTX logo on the side. If all Nvidia reference boards look like this going forward, I'd love it. The exposed metal is considerably more attractive than the usual plastic you find on such a board. That said, whatever model you might buy down the road will likely be equipped with an aftermarket cooler design, effectively neutralizing this new style.

The 780 covers outputs as well as anyone could want, unless you really want six DisplayPort outputs on one card. Two dual-link DVI outputs, one full-size DisplayPort, and one full-size HDMI port should cover whatever display(s) you’re running.

And, per usual, the GTX 780 grabs power via one 6-pin and one 8-pin PCI Express power connector, the same arrangement as the Titan and the GTX 580 (the GTX 680 made do with two 6-pin connectors).

What else is Nvidia launching?

Before we get to the numbers, let’s talk about the software that’s launching with the GTX 780.

GPU Boost 2.0: There's a new version of GPU Boost shipping with the GTX 780. Where the original GPU Boost used power draw to dictate clock boosts, version 2.0 focuses on temperature. Out of the box, the GTX 780 is set to hit 80 degrees Celsius, and no higher. Through GPU Boost 2.0, however, you can raise the temperature ceiling to 85 degrees, or lower it to, say, 60 degrees. The card will then push itself to the highest frequency it can sustain without the chip exceeding the set ceiling.
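
To make that behavior concrete, here's a minimal control-loop sketch in Python. Everything in it (the thermal model, the step size, the clock cap) is invented for illustration; Nvidia hasn't published the actual boost algorithm, which lives in the driver and firmware.

```python
# Hypothetical sketch of GPU Boost 2.0's temperature-target behavior:
# step the clock up while the chip stays under the ceiling, back off when
# it crosses it. All numbers and the thermal model are made up for this demo.

TEMP_TARGET_C = 80.0    # GTX 780 default ceiling (user-adjustable, e.g. 60-85)
BASE_CLOCK_MHZ = 863.0  # GTX 780 base clock
MAX_BOOST_MHZ = 1100.0  # arbitrary cap for the sketch
STEP_MHZ = 13.0         # Kepler boost moves in roughly 13 MHz bins

def fake_temp(clock_mhz):
    """Toy thermal model: temperature rises linearly with clock speed."""
    return 35.0 + (clock_mhz - BASE_CLOCK_MHZ) * 0.22

clock = BASE_CLOCK_MHZ
for _ in range(60):  # one iteration per polling interval
    temp = fake_temp(clock)
    if temp < TEMP_TARGET_C and clock + STEP_MHZ <= MAX_BOOST_MHZ:
        clock += STEP_MHZ                              # headroom: take another bin
    elif temp > TEMP_TARGET_C:
        clock = max(BASE_CLOCK_MHZ, clock - STEP_MHZ)  # too hot: back off

print(f"settled near {clock:.0f} MHz at ~{fake_temp(clock):.1f} C")
```

This is also why lowering the ceiling to 60 degrees trades clock speed for a cooler, quieter card, while raising it to 85 buys more sustained boost.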

GeForce Experience: Nvidia's settings-optimization app is coming out of open beta. Version 1.5 will be the first release without the beta tag, but the game is the same: set your game settings as you normally would, then use GeForce Experience to see if Nvidia's software has any tweaking suggestions.

ShadowPlay: This new software is aimed directly at the FRAPS and YouTube crowd. ShadowPlay is Nvidia's new game footage capture tool, allowing users to record up to 20 minutes of gameplay. Since every Kepler-equipped Nvidia card has a built-in H.264 encoder, the card handles all the work, using minimal CPU power in the process. The footage output is 1080p at 30fps, and this won't be changeable at launch (no slick 24p, I'm afraid). Being baked into every Kepler card has its advantages, but ShadowPlay does lack the features found in the paid version of FRAPS.
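
Nvidia hasn't detailed ShadowPlay's internals, but "record up to 20 minutes" implies a rolling buffer of encoded frames. Here's a hypothetical Python sketch of that idea; the function names and frame data are stand-ins, and the real tool does its encoding on the GPU's H.264 block rather than shuttling bytes around in Python.

```python
# Hypothetical sketch of a "shadow" recorder: keep only the most recent
# 20 minutes of encoded frames so a highlight can be saved after the fact.
from collections import deque

FPS = 30         # ShadowPlay's launch output: 1080p at 30fps
WINDOW_MIN = 20  # minutes of footage retained

# A bounded deque drops the oldest frame automatically once full.
buffer = deque(maxlen=FPS * 60 * WINDOW_MIN)

def on_encoded_frame(frame_bytes):
    """Called once per frame; memory stays bounded by the window size."""
    buffer.append(frame_bytes)

def save_clip(path):
    """Dump the retained window, oldest frame first (not a real video mux)."""
    with open(path, "wb") as f:
        for frame in buffer:
            f.write(frame)

# Demo: push 25 minutes of dummy frames; only the last 20 survive.
for i in range(FPS * 60 * 25):
    on_encoded_frame(i.to_bytes(4, "little"))
print(f"frames kept: {len(buffer)} ({len(buffer) / (FPS * 60):.0f} minutes)")
```

The bounded deque is the whole trick: memory use is fixed by the window length, so a recorder built this way can run for hours and still hand you only the last 20 minutes.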

Adaptive Temperature Control: Nvidia says its software tweaks to the fan controller result in a steadier fan speed. This new adaptive control keeps fan-speed fluctuations to a minimum, which should make for an overall quieter experience. While I'm not doing any scientific noise tests on the GTX 780, noise in the test room sits at 42-45 dB when the card is running benchmarks and 38-40 dB when the card is idle.
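
Nvidia hasn't said what "adaptive" means algorithmically, so treat the following Python sketch as just one common way to get a steadier fan: smooth the temperature input with an exponential moving average, then ignore small changes with a dead band. The curve and constants are invented, not Nvidia's.

```python
# Hypothetical fan-smoothing sketch: low-pass the temperature reading and
# only move the fan when the target duty changes by a meaningful amount,
# so the fan doesn't audibly hunt up and down. Constants are invented.
import random

def fan_curve(temp_c):
    """Map temperature to fan duty cycle (percent). Made-up curve."""
    return max(30.0, min(85.0, 30.0 + (temp_c - 40.0) * 1.5))

ALPHA = 0.1      # EMA weight: lower = smoother, slower to react
DEAD_BAND = 3.0  # ignore duty changes smaller than this (percent)

smoothed = 50.0
duty = fan_curve(smoothed)

def update(raw_temp_c):
    global smoothed, duty
    smoothed += ALPHA * (raw_temp_c - smoothed)  # low-pass the noisy sensor
    target = fan_curve(smoothed)
    if abs(target - duty) >= DEAD_BAND:          # only react to real shifts
        duty = target
    return duty

# Demo: noisy readings around 70 C ramp the fan gently instead of twitching it.
for _ in range(30):
    update(70.0 + random.uniform(-2.0, 2.0))
print(f"duty settled near {duty:.0f}% for ~70 C")
```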


9 Comments on NVIDIA GTX 780 Review: GK110 on a Diet

Luther

On May 23, 2013 at 10:45 am

I have a 560 Ti SLI setup, so I should probably wait, I take it?

Jay

On May 23, 2013 at 11:52 am

“On paper, the GTX 780 sits right between the GTX Titan and the GTX 680, borrowing heavily from the former while approaching the price of the latter.”

That last part is backwards. It's $400 less (after taxes) for literally 90-105% of the performance of the Titan, a $1,000 card. The only substantial difference is the Titan's 6GB of RAM, which few people will use.

This card is a slap in the face to everyone who bought a Titan (myself included) just three months ago. What was marketed as a cutting-edge flagship card turned out to be a clever way to price-gouge the early adopters. Only this time all that money went straight into Nvidia's pockets instead of a third party's.

Devin Connors

On May 23, 2013 at 12:11 pm

@Luther

I would wait for now, yes. The GTX 780 would give you better numbers than your current setup, but if you’re happy with where your framerates are right now, $650 is a lot to spend for a small to modest bump.

@Jay
I was speaking purely to the price with that last bit ($650 is closer to $400 than $1,000), but performance-wise you’re absolutely right. The 780 is closer to the Titan than anything else in Nvidia’s lineup, at least in the single GPU category.

-Devin

Jay

On May 23, 2013 at 1:48 pm

@Devin Connors

Apparently I don't know how to read; you were right the first time. I somehow mixed them up when reading the article. It's been a long day, I've been at work since 3am PST. Sorry about that.

Canukk

On May 23, 2013 at 6:24 pm

I own two GTX 580s and I did decide to go with a single GTX 780. I bought the EVGA Superclocked: http://www.tigerdirect.ca/applications/SearchTools/item-details.asp?EdpNo=8181477&CatId=7387 . My reasons: my two GTX 580s are HOT cards. Not hot as in stolen, but hot as in hard to cool. In the warm summer it's a struggle to keep the top card at 80C maximum; I have had it go to almost 90C. That's with an Antec 1200 case, seven fans, good cable management, AND the GPU fans set to 75% in MSI Afterburner.

Plus, two GTX 580s draw a lot of power under heavy gaming. My system draws 700+ watts total when gaming heavily, and I am comfortable saying the two GTX 580s are responsible for at least 500 of them. The 780 will top out at 250W, if I even get there, and it will run much cooler than the 580s since it has a better cooling system, smarter software, and is a single card.

I know the benchmarks show GTX 580 SLI trumping the GTX 780 in frame rates in some games, but any frame rate with SLI below 60 (the refresh rate of the monitor) is begging for micro stutter; 50 fps on a single card is still glass smooth.

IMO, even against two GTX 580s in SLI, the savings in heat production, the savings in power consumption (which I feel are quite significant), losing that annoying below-60-fps micro stutter that plagues dual-card setups, and getting smooth gameplay even at lower fps, all without really sacrificing much overall performance, were convincing enough for me to make the purchase.

But no worries. The GTX 580s are going to be split up and repurposed into two other computers: my living room PC and my office PC will each get a 580, and I will be looking to sell the GTX 460s I currently have in those systems.

sticksterZs

On May 24, 2013 at 8:06 am

I bought an Nvidia GTX 465 SLI setup, 3D-ready and 3D theater surround sound enabled, full SLI support, with 1GB of memory. That's pretty much the same as what you're saying about the GTX 780, only I paid a whopping $750.00 plus tax and shipping for mine. Here are the specs Nvidia says are true about my card. And get this: mine won't power up without two 6-pin power connectors plugged in… Go figure. ;0

Specifications
Note: The specifications below represent this GPU as incorporated into Nvidia's reference graphics card design. Graphics card specifications may vary by add-in-card manufacturer. Please refer to the add-in-card manufacturers' websites for actual shipping specifications.

GPU Engine Specs (GTX 460 1GB GDDR5 / GTX 460 v2 1GB GDDR5 / GTX 460 768MB GDDR5 / GTX 460 SE):
CUDA Cores: 336 / 336 / 336 / 288
Graphics Clock (MHz): 675 / 778 / 675 / 650
Processor Clock (MHz): 1350 / 1556 / 1350 / 1300
Texture Fill Rate (billion/sec): 37.8 / 49.8 / 37.8 / 31.2

Memory Specs (same four cards):
Memory Clock (MHz): 1800 / 2004 / 1800 / 1700
Standard Memory Config: 1GB GDDR5 / 1GB GDDR5 / 768MB GDDR5 / 1GB GDDR5
Memory Interface Width: 256-bit / 192-bit / 192-bit / 256-bit
Memory Bandwidth (GB/sec): 115.2 / 96.2 / 86.4 / 108.8

Feature Support:
OpenGL: 4.1
Bus Support: PCI-E 2.0 x16
Certified for Windows 7: Yes
Supported Technologies (1): 3D Vision, 3D Vision Surround, CUDA, DirectX 11, PhysX, SLI
SLI Options (2): 2-way

Display Support:
Multi Monitor: Yes
Maximum Digital Resolution: 2560x1600
Maximum VGA Resolution: 2048x1536
HDCP: Yes
HDMI: Yes
Standard Display Connectors: Two dual-link DVI, mini HDMI
Audio Input for HDMI: Internal

Standard Graphics Card Dimensions:
Length: 8.25 inches (210 mm)
Height: 4.376 inches (111 mm)
Width: Dual-slot

Thermal and Power Specs:
Maximum GPU Temperature (C): 104
Maximum Graphics Card Power (W): 160
Minimum System Power Requirement (W): 450
Supplementary Power Connectors: 6-pin & 6-pin

1 - Nvidia 3D Vision Surround requires two or more graphics cards in an Nvidia SLI configuration, 3D Vision glasses, and three matching 3D Vision-Ready displays. See http://www.nvidia.com/surround for more information.

2 - A GeForce GTX 460 GPU must be paired with another GeForce GTX 460 GPU (the graphics card manufacturer can be different). SLI requires sufficient system cooling and a compatible power supply. Visit SLI-Certified components for more information.

So I'm guessing they really put the screws to me on that one, huh? And I'm not entirely happy with this card. It can hardly run Far Cry 3 without getting choppy, and it won't run about 12 other games that I bought this card expressly for; those games call for a 9800 series and above. So I have no clue what I really need. I still have a lot to learn, though I did build my system myself and custom ordered everything, the motherboard, CPU, and case, and put it all together my little self…

sticksterZs@yahoo.com, if anyone can speak in layman's terms for me, please feel free…

Samir

On May 30, 2013 at 1:08 am

Thanks for this article, I was wondering how 580 SLI stacks up against the 780. To me it would not be worth it to upgrade, since I bought two 580s last fall for $500 with Heatkiller blocks already installed on them. I am happy with my current performance. I guess I will have to wait for the Maxwell cards to get my money's worth.

maxrob

On June 2, 2013 at 7:52 am

@sticksterZs LOL, your card won't run Far Cry 3 and it cost $750? What a rip-off. My friend had a GeForce 250 and he could run BF3, and my 470 runs it maxed out, and that card cost $250 a long time ago. Seriously, how did you pay $750 for a GeForce 460? It wasn't worth more than $250 when it came out…

Hackson

On June 8, 2013 at 7:11 am

Why does my browser highlight in pink here?