Everything We Know about the NVIDIA GTX 680


Table of Contents

  1. Kepler GPU Architecture
  2. GPU Boost & Adaptive V-Sync
  3. Antialiasing Solutions
  4. SLI & Multiple Monitor Gaming
  5. Benchmarking in Brief
  6. Specifications

Multiple Monitor Gaming

For gaming at a glorious three-monitor 5760×1080 resolution, a single Kepler card can do what previously required two: not only running a game across three Surround monitors, but also powering a fourth display for web surfing, instant messaging, watching videos, consulting walkthroughs, and so on.

The Surround technology is demanding, and 3D Vision Surround further increases the load, but a number of titles remain playable on a single GeForce GTX 680 at 5760×1080, using max or near-max graphics settings.
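To put that load in perspective, here is a rough pixel-count comparison (an illustrative Python sketch, not figures from NVIDIA):

    # Why 5760x1080 Surround is demanding: three 1920x1080 panels
    # side by side triple the pixels rendered every frame.
    single_1080p = 1920 * 1080   # 2,073,600 pixels
    surround = 5760 * 1080       # 6,220,800 pixels
    print(f"Surround draws {surround // single_1080p}x the pixels of one 1080p monitor")
    # -> Surround draws 3x the pixels of one 1080p monitor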

Bezel Correction hides sections of the rendered image behind each monitor's bezel, creating the illusion that the bezel is part of the game world, much like the frame of a cockpit window. The result is a continuous image across the displays, but HUD elements occasionally end up hidden behind a bezel. The GTX 680 introduces a customizable hotkey that reveals those hidden elements.
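The arithmetic behind bezel correction is simple; the Python sketch below illustrates the idea, with a hypothetical 120-pixel bezel gap chosen purely for illustration:

    # Bezel correction sketch: the driver renders a wider virtual surface
    # and hides the strips that fall behind the two inner bezel gaps,
    # so the scene stays geometrically continuous across monitors.
    panel_width, panel_height, panels = 1920, 1080, 3
    bezel_px = 120  # hypothetical width of each bezel gap, in pixels

    visible_width = panel_width * panels                       # 5760
    corrected_width = visible_width + bezel_px * (panels - 1)  # 6000

    print(f"Visible resolution:   {visible_width}x{panel_height}")
    print(f"Corrected resolution: {corrected_width}x{panel_height}")
    # The two 120-pixel strips are rendered but never displayed, which is
    # why HUD elements positioned there can vanish behind a bezel.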

SLI

Dutch website Hardware.info got hold of four GTX 680s to test SLI configurations and, through benchmarking, discovered that GTX 680 SLI scales “really, really well,” though not quite as well as Radeon HD 7970 CrossFire. However, on a three-monitor 5760×1080 setup, the GTX 680 beats the 7970, and that is the only practical use for multiple 680s; with only one monitor, two 680s are overkill. Combining three or four GTX 680s can yield even better performance, but, as is often the case with triple- and quad-card configurations, scaling can be hit or miss, sometimes even resulting in performance drops.
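For readers curious how scaling figures like Hardware.info's are derived, here is a quick Python sketch of the standard calculation; the frame rates below are placeholders, not their results:

    # Multi-GPU scaling sketch: speedup and efficiency per card count.
    # These FPS values are hypothetical placeholders, NOT Hardware.info data.
    fps = {1: 60.0, 2: 112.0, 3: 150.0, 4: 155.0}

    for cards, rate in fps.items():
        speedup = rate / fps[1]
        efficiency = speedup / cards  # 1.0 would be perfect scaling
        print(f"{cards} card(s): {speedup:.2f}x speedup, {efficiency:.0%} efficiency")
    # Falling efficiency at three and four cards is the "hit or miss"
    # effect: driver overhead and frame pacing sometimes drop performance
    # below that of a smaller configuration.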


3 Comments on Everything We Know about the NVIDIA GTX 680

SXO

On March 27, 2012 at 6:17 am

I just want to address one part of this article:
“At GDC 2011, it took three GTX 580s to run Epic’s Samaritan demo (above), consuming 732W of power, generating 2500 BTUs of heat and 51 dBA of noise. This year’s GDC saw a single GTX 680 do the same job for less, consuming only 195W of power, generating 660 BTUs of heat and 46 dBA of noise.”
This is just marketing babble. The truth is the Samaritan demo (Unreal Engine 3.5) doesn’t actually NEED all three GTX 580s. The demo was as yet unoptimized when they first showed it off last year, and neither Epic nor nVidia wanted to take any chances with hitches during its showcase, hence they over-prepared. I believe in its current state, a single GTX 580 could handle the demo just fine using FXAA, but then how would nVidia convince you that you need a GTX 680? Don’t get me wrong, I’m a fan of the 680, but I believe it has enough real strengths to justify it without resorting to marketing half-truths.

The real question is: how is Unreal Engine 4.0 going to look and perform when it’s revealed at E3 this year?

Heru

On March 27, 2012 at 10:05 am

C’mon people, if you’ve paid any attention to how this works, you already know what’s gonna happen here. AMD beats Nvidia to market and prices the 79xx high ’cause there’s no competition; Nvidia releases the 680 for slightly less; then within the month AMD drops the 7950/70 to less than the 680 and releases the 7990 for more. It’s gone that exact same way for three generations of cards now; I don’t see why this will be any different.

CJ Miozzi

On March 27, 2012 at 10:05 am

@SXO:

Thanks for pointing that out! May I ask where you learned that?

I can’t wait to drool over Unreal Engine 4.0… Hell, I can’t wait to play Unreal Tournament 4.