Everything We Know about the NVIDIA GTX 680

NVIDIA’s GTX 680 may have arrived almost three months later than its direct competitor, the Radeon HD 7970, but it’s faster, quieter, cooler, and, at $499, fifty dollars cheaper. It can also jump, color, and spell better than its Radeon adversary.

We’ve been blown away by the GTX 680, which has claimed the title of highest-performing single-GPU card, and have compiled this compendium of information on its ins and outs.

Let’s begin with a teaser trailer showcasing NVIDIA’s famous tech demo character, Dawn:

Notice the fine hairs on her cheek? She’s come a long way since 2002.


Table of Contents

  1. Kepler GPU Architecture
  2. GPU Boost & Adaptive V-Sync
  3. Antialiasing Solutions
  4. SLI & Multiple Monitor Gaming
  5. Benchmarking in Brief
  6. Specifications

Kepler GPU Architecture

The GTX 680 comes with NVIDIA’s new Kepler architecture, which has been designed for optimal performance per watt and maximum performance in the latest DirectX 11 games. The GPU consists of four GPCs, eight next-generation Streaming Multiprocessors (SMX), and four memory controllers. Each GPC has a dedicated raster engine and two SMX units, for a total of 1536 CUDA cores on this card, triple the 512 of the GTX 580. With an effective data rate of 6008 MHz, the GTX 680 offers the highest memory speed of any GPU and is the industry’s first 6 Gbps GDDR5 product.
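
For readers who want to check the math, here is a minimal sketch of that arithmetic in Python. The inputs come from the paragraph above, except the 64-bit width per memory controller, which is the standard GDDR5 controller width rather than something stated here:

```python
# Back-of-the-envelope check of the GTX 680's headline specs.
GPCS = 4              # Graphics Processing Clusters
SMX_PER_GPC = 2       # next-generation Streaming Multiprocessors per GPC
CORES_PER_SMX = 192   # CUDA cores in each SMX

cuda_cores = GPCS * SMX_PER_GPC * CORES_PER_SMX
print(cuda_cores)     # 1536, triple the GTX 580's 512

# Memory bandwidth from the 6008 MHz (6 Gbps) effective data rate.
# Assumes the standard 64-bit GDDR5 controller width, so four
# controllers make a 256-bit bus.
data_rate_hz = 6008e6
bus_width_bytes = 4 * 64 // 8
bandwidth_gb_s = data_rate_hz * bus_width_bytes / 1e9
print(round(bandwidth_gb_s, 1))  # ~192.3 GB/s
```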

The Kepler SMX streaming multiprocessor delivers twice the performance per watt of the previous generation’s Fermi SM. Each SMX uses an ultra-wide design with 192 CUDA cores, helping the GTX 680 outperform the Fermi-based GTX 580. And while most high-end graphics cards require both an 8-pin and a 6-pin PCIe power connector, the GTX 680 requires only two 6-pin connectors, because the card draws no more than 195 watts of power, 20% less than the 244 watts required by the GTX 580.
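
A rough power-budget check makes the connector claim concrete. The ratings below are my own addition, taken from the standard PCI Express figures (75 W from the slot, 75 W per 6-pin connector, 150 W per 8-pin connector), not from NVIDIA’s materials:

```python
# PCIe power budget: why two 6-pin connectors suffice for the GTX 680.
SLOT_W = 75        # PCIe x16 slot supplies up to 75 W
SIX_PIN_W = 75     # each 6-pin connector is rated for 75 W
EIGHT_PIN_W = 150  # each 8-pin connector is rated for 150 W

gtx680_budget = SLOT_W + 2 * SIX_PIN_W            # 225 W, above the 195 W draw
gtx580_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W, above the 244 W draw
print(gtx680_budget, gtx580_budget)               # 225 300

# The advertised savings: (244 - 195) / 244 is about 20%.
print(round((244 - 195) / 244 * 100))             # 20
```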

The result? The 680 runs cooler and quieter, and consumes less power. It is the highest-performing GPU to date, and also the most power-efficient.

At GDC 2011, it took three GTX 580s to run Epic’s Samaritan demo (above), consuming 732W of power, generating 2500 BTUs of heat and 51 dBA of noise. This year’s GDC saw a single GTX 680 do the same job for less, consuming only 195W of power, generating 660 BTUs of heat and 46 dBA of noise.
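
The heat figures follow directly from the power draw: one watt of draw dissipates roughly 3.412 BTU per hour. A quick conversion (my own arithmetic, not NVIDIA’s) shows the quoted numbers are at least internally consistent:

```python
# Watts to BTU/h: 1 W of draw dissipates about 3.412 BTU per hour.
BTU_PER_HOUR_PER_WATT = 3.412

print(round(732 * BTU_PER_HOUR_PER_WATT))  # 2498 BTU/h, quoted as 2500
print(round(195 * BTU_PER_HOUR_PER_WATT))  # 665 BTU/h, quoted as 660
```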

3 Comments on Everything We Know about the NVIDIA GTX 680

SXO

On March 27, 2012 at 6:17 am

I just want to address one part of this article:
“At GDC 2011, it took three GTX 580s to run Epic’s Samaritan demo (above), consuming 732W of power, generating 2500 BTUs of heat and 51 dBA of noise. This year’s GDC saw a single GTX 680 do the same job for less, consuming only 195W of power, generating 660 BTUs of heat and 46 dBA of noise.”
This is just marketing babble. The truth is the Samaritan demo (Unreal Engine 3.5) doesn’t actually NEED all three GTX 580s. The demo was as yet unoptimized when they first showed it off last year, and neither Epic nor nVidia wanted to take any chances with hitches during its showcase, hence they over-prepared. I believe in its current state, a single GTX 580 could handle the demo just fine using FXAA, but then how would nVidia convince you that you need a GTX 680? Don’t get me wrong, I’m a fan of the 680, but I believe it has enough real strengths to justify it without resorting to marketing half-truths.

The real question is how is Unreal Engine 4.0 going to look and perform when it’s revealed at E3 this year?

Heru

On March 27, 2012 at 10:05 am

C’mon people, if you’ve paid any attention to how this works, you already know what’s gonna happen here. AMD beats Nvidia to market and prices the 79xx high ’cause there’s no competition; Nvidia releases the 680 for slightly less; then within the month AMD drops the 7950/70 to less than the 680 and releases the 7990 for more. It’s gone that exact same way for 3 generations of cards now; don’t see why this will be any different.

CJ Miozzi

On March 27, 2012 at 10:05 am

@SXO:

Thanks for pointing that out! May I ask where you learned that?

I can’t wait to drool over Unreal Engine 4.0… Hell, I can’t wait to play Unreal Tournament 4.