Everything We Know about the NVIDIA GTX 680

Table of Contents

  1. Kepler GPU Architecture
  2. GPU Boost & Adaptive V-Sync
  3. Antialiasing Solutions
  4. SLI & Multiple Monitor Gaming
  5. Benchmarking in Brief
  6. Specifications

GPU Boost

One of the GTX 680’s features is GPU Boost, which dynamically increases clock speed to improve performance. Rather than running the GPU at a fixed clock chosen to accommodate the most demanding application, GPU Boost monitors power consumption in real time and automatically adjusts the clock speed to suit the currently running application. For instance, if a game only draws 150 watts, GPU Boost will raise the clock speed by 100 MHz or more above its 1006 MHz base, translating into improved performance.

Based on tests AnandTech ran on a review unit, GPU Boost increases the clock sequentially through nine states spaced 13 MHz apart, taking the card from 1006 MHz to 1110 MHz, with each step requiring higher voltage. The card’s target power draw is 170 W, and it will boost the clock speed to meet that target. Temperature, however, proved a limiting factor: the card could only maintain 1110 MHz while the GPU stayed below 70°C, which the stock GTX 680’s cooler cannot sustain. Benchmarks show that GPU Boost improves performance by an average of 3%, and by no more than 5%.
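The stepping behavior AnandTech observed can be sketched as a simple model. This is purely illustrative — the function name, the watts-per-bin figure, and the headroom logic are assumptions; the real decisions are made in NVIDIA’s driver and firmware:

```python
# Hypothetical model of GPU Boost as described above: eight 13 MHz steps
# above the 1006 MHz base give nine clock states up to 1110 MHz, limited
# by a 170 W power target and a ~70°C ceiling for the top state.

BASE_MHZ = 1006
STEP_MHZ = 13
MAX_STEPS = 8  # 1006 + 8 * 13 = 1110

def boost_clock(power_draw_w, gpu_temp_c, power_target_w=170):
    """Return a modeled boost clock in MHz for a given power draw and temperature."""
    if power_draw_w >= power_target_w:
        return BASE_MHZ
    # Power headroom determines how many 13 MHz steps we can climb
    # (~2.5 W per step is an assumed figure, not a measured one).
    headroom = power_target_w - power_draw_w
    steps = min(MAX_STEPS, int(headroom // 2.5))
    clock = BASE_MHZ + steps * STEP_MHZ
    # Per AnandTech, the top 1110 MHz state is only held below ~70°C.
    if clock == BASE_MHZ + MAX_STEPS * STEP_MHZ and gpu_temp_c >= 70:
        clock -= STEP_MHZ
    return clock
```

For example, `boost_clock(150, 60)` lands at the full 1110 MHz, while the same power draw at 75°C drops one step to 1097 MHz.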

Rather than detract from overclocking, GPU Boost actually expands your overclocking options. On the 680, overclocking is achieved by adjusting GPU Boost’s power target by -30% to +32% and by applying a GPU clock offset. Each factor can be adjusted separately, and the GPU clock offset has a greater impact than altering the power target, but manipulating both together unlocks this card’s true overclocking potential.
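The two knobs above can be sketched as follows. The function and field names are hypothetical; only the -30% to +32% range and the 170 W base target come from the article:

```python
# Hypothetical sketch of the GTX 680's two overclocking knobs:
# a power-target adjustment (-30% to +32%) and a GPU clock offset.

BASE_TARGET_W = 170  # stock power target per the article

def overclock_settings(power_target_pct, clock_offset_mhz):
    """Validate the two knobs and return the resulting effective settings."""
    if not -30 <= power_target_pct <= 32:
        raise ValueError("power target adjustment must be between -30% and +32%")
    target_w = BASE_TARGET_W * (1 + power_target_pct / 100)
    return {
        "power_target_w": round(target_w, 1),
        "clock_offset_mhz": clock_offset_mhz,
    }
```

Maxing the power target (`+32%`) raises the effective ceiling to about 224 W, which is why combining it with a clock offset exposes more headroom than either knob alone.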

Adaptive V-Sync

V-Sync is the currently popular solution to screen tearing; however, it causes massive FPS drops when the frame rate dips below the monitor’s refresh rate. With the 680’s new Adaptive V-Sync, V-Sync is automatically disabled while the frame rate is below the monitor’s refresh rate, preventing the dramatic FPS drops typically observed.
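The decision Adaptive V-Sync makes can be sketched like this. The function name and the integer-divisor fallback model for classic V-Sync are assumptions for illustration, not NVIDIA’s implementation:

```python
# Hypothetical sketch of Adaptive V-Sync: classic V-Sync caps the frame
# rate at the refresh rate but, when the renderer can't keep up, falls
# to a fraction of it (e.g. 60 Hz -> 30 FPS). Adaptive V-Sync instead
# disables V-Sync below the refresh rate, avoiding that sudden drop.

def effective_fps(render_fps, refresh_hz, adaptive=True):
    """Return the displayed FPS under the modeled V-Sync mode."""
    if render_fps >= refresh_hz:
        return refresh_hz            # synced: capped at the refresh rate
    if adaptive:
        return render_fps            # V-Sync off: tearing possible, no big drop
    # Classic V-Sync: fall back to the largest integer divisor of the
    # refresh rate that the renderer can still sustain.
    divisor = 2
    while refresh_hz / divisor > render_fps:
        divisor += 1
    return refresh_hz / divisor
```

In this model, a renderer managing 55 FPS on a 60 Hz monitor displays 55 FPS with Adaptive V-Sync but collapses to 30 FPS under classic V-Sync — exactly the drop the feature is meant to prevent.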


3 Comments on Everything We Know about the NVIDIA GTX 680


On March 27, 2012 at 6:17 am

I just want to address one part of this article:
“At GDC 2011, it took three GTX 580s to run Epic’s Samaritan demo (above), consuming 732W of power, generating 2500 BTUs of heat and 51 dBA of noise. This year’s GDC saw a single GTX 680 do the same job for less, consuming only 195W of power, generating 660 BTUs of heat and 46 dBA of noise.”
This is just marketing babble. The truth is the Samaritan demo (Unreal Engine 3.5) doesn’t actually NEED all three GTX 580s. The demo was as yet unoptimized when they first showed it off last year, and neither Epic nor Nvidia wanted to take any chances with hitches during its showcase, hence they over-prepared. I believe in its current state, a single GTX 580 could handle the demo just fine using FXAA, but then how would Nvidia convince you that you need a GTX 680? Don’t get me wrong, I’m a fan of the 680, but I believe it has enough real strengths to justify it without resorting to marketing half-truths.

The real question is how Unreal Engine 4.0 is going to look and perform when it’s revealed at E3 this year.


On March 27, 2012 at 10:05 am

C’mon people, if you’ve paid any attention to how this works, you already know what’s gonna happen here. AMD beats Nvidia to market and prices the 79xx high because there’s no competition; Nvidia releases the 680 for slightly less; then within the month AMD drops the 7950/70 to less than the 680 and releases the 7990 for more. It’s gone that exact same way for three generations of cards now — don’t see why this will be any different.

CJ Miozzi

On March 27, 2012 at 10:05 am


Thanks for pointing that out! May I ask where you learned that?

I can’t wait to drool over Unreal Engine 4.0… Hell, I can’t wait to play Unreal Tournament 4.