Everything We Know about the NVIDIA GTX 680


Table of Contents

  1. Kepler GPU Architecture
  2. GPU Boost & Adaptive V-Sync
  3. Antialiasing Solutions
  4. SLI & Multiple Monitor Gaming
  5. Benchmarking in Brief
  6. Specifications

Benchmarking in Brief

Benchmarks taken from a sampling of a dozen sources, including AnandTech, TechPowerUp, TechRadar, Tom’s Hardware, Newegg TV, Eurogamer, Motherboards.org, Nepseeker, GameSpot, Overclockers Club, and HardwareCanucks, paint a concise picture: the GTX 680 is a marked step up from the GTX 580 and generally outperforms its competitor, the Radeon HD 7970.

The benchmark results presented below are summarized from AnandTech’s tests, which used an Intel Core i7-3960X running at 4.3GHz, four 4GB sticks of G.Skill Ripjaws DDR3-1867 memory, and an EVGA X79 SLI motherboard running Windows 7 Ultimate 64-bit.

Crysis: Warhead: For maximum frame rates, the Radeon HD 7970 easily outperforms the GTX 680, and the 7950 is nearly tied with the NVIDIA card, depending on the resolution. The GTX 680 outperforms the GTX 580 by 17%. Crysis favors memory bandwidth, which the Radeon HD 7900 series has in abundance and the GTX 680 does not.

For minimum frame rates, the GTX 680 clearly lags behind the 7950 and is only 10% better than the GTX 580.

Metro 2033: The GTX 680 trails the 7970 by a few percent at a resolution of 2560×1600, but is clearly superior to the 7950, performs 34% better than the GTX 580, and is almost able to compete with the GTX 590.

DiRT 3: The GTX 680 maintains a 37% lead over the GTX 580, and beats the 7970 by 6% at 2560 and by greater amounts as the resolution decreases.

Total War: Shogun 2: The GTX 680 is the first single-GPU card to surpass 30 FPS at 2560. It beats the 7970 by 16% at this resolution and 15% at 1920, while beating the GTX 580 by 51% at 2560 and 63% at 1920. The GTX 590 barely beats the GTX 680 at 2560 and loses at 1920.

Batman: Arkham City: The GTX 680 beats the 7970 by 13% at 2560, the GTX 580 by 34%, and almost attains 60 FPS.

Portal 2: The GTX 680 beats the 7970 by 17% at 2560. With SSAA enabled, it tops the charts, beating the 7970 by 44% and the GTX 580 by 67%, and even surpassing the GTX 590 and the Radeon HD 6990. It is the first single-GPU card to surpass 60 FPS.

Battlefield 3: The GTX 680 beats the 7970 by 28% at 2560 and by progressively greater amounts at lower resolutions, while beating the GTX 580 by 48% at most resolutions. Further, it is able to compete with the GTX 590 and the 6990.

Starcraft II: At 2560, the GTX 680 beats the 7970 by 13% and the GTX 580 by 39%.

Skyrim: At 2560, the GTX 680 beats the 7970 by 10%. At lower resolutions, FPS is CPU-limited and the GTX 680 tops the charts.

Civilization V: At 2560, the GTX 680 beats the 7970 by 4%, but loses to the 7970 at 1920. The GTX 680 significantly improves upon the GTX 580 only at 2560; its lead shrinks to 8% at 1920.


3 Comments on Everything We Know about the NVIDIA GTX 680

SXO

On March 27, 2012 at 6:17 am

I just want to address one part of this article:
“At GDC 2011, it took three GTX 580s to run Epic’s Samaritan demo (above), consuming 732W of power, generating 2500 BTUs of heat and 51 dBA of noise. This year’s GDC saw a single GTX 680 do the same job for less, consuming only 195W of power, generating 660 BTUs of heat and 46 dBA of noise.”
This is just marketing babble. The truth is the Samaritan demo (Unreal Engine 3.5) doesn’t actually NEED all three GTX 580s. The demo was as yet unoptimized when they first showed it off last year, and neither Epic nor nVidia wanted to take any chances with hitches during its showcase, hence they over-prepared. I believe in its current state, a single GTX 580 could handle the demo just fine using FXAA, but then how would nVidia convince you that you need a GTX 680? Don’t get me wrong, I’m a fan of the 680, but I believe it has enough real strengths to justify it without resorting to marketing half-truths.

The real question is how is Unreal Engine 4.0 going to look and perform when it’s revealed at E3 this year?

Heru

On March 27, 2012 at 10:05 am

C’mon people, if you’ve paid any attention to how this works you already know what’s gonna happen here. AMD beats Nvidia to market, prices the 79xx high because there’s no competition, Nvidia releases the 680 for slightly less, then within the month AMD drops the 7950/70 to less than the 680 and releases the 7990 for more. It’s gone that exact same way for 3 generations of cards now; don’t see why this will be any different.

CJ Miozzi

On March 27, 2012 at 10:05 am

@SXO:

Thanks for pointing that out! May I ask where you learned that?

I can’t wait to drool over Unreal Engine 4.0… Hell, I can’t wait to play Unreal Tournament 4.