Everything We Know about the NVIDIA GTX 680


Table of Contents

  1. Kepler GPU Architecture
  2. GPU Boost & Adaptive V-Sync
  3. Antialiasing Solutions
  4. SLI & Multiple Monitor Gaming
  5. Benchmarking in Brief
  6. Specifications

Specifications

GTX 680 GPU Engine Specs:
CUDA Cores 1536
Base Clock (MHz) 1006
Boost Clock (MHz) 1058
Texture Fill Rate (billion/sec) 128.8
 
GTX 680 Memory Specs:
Memory Speed 6008 MHz (6.0 Gbps effective)
Standard Memory Config 2048 MB GDDR5
Memory Interface Width 256-bit
Memory Bandwidth (GB/sec) 192.2
 
GTX 680 Support:
OpenGL 4.2
Bus Support PCI Express 3.0
Certified for Windows 7 Yes
Supported Technologies 3D Vision, 3D Vision Surround, CUDA, DirectX 11, PhysX, SLI
SLI Options 3-way
 
Display Support:
Multi Monitor 4 displays
Maximum Digital Resolution 2560×1600
Maximum VGA Resolution 2048×1536
HDCP Yes
HDMI Yes
Standard Display Connectors 1 Dual Link DVI-I, 1 Dual Link DVI-D, 1 HDMI, 1 DisplayPort
Audio Input for HDMI Internal
 
GTX 680 Graphics Card Dimensions:
Length 10.0 inches
Height 4.376 inches
Width Dual-slot
 
Thermal and Power Specs:
Maximum GPU Temperature (C) 98
Maximum Graphics Card Power (W) 195 W
Min. System Power Requirement (W) 550 W
Supplementary Power Connectors Two 6-pin
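 
As a quick sanity check, the bandwidth and texture fill-rate figures above follow directly from the listed clocks and bus width. The short Python sketch below reproduces both numbers; note that the 128 texture units are an assumption based on the GK104 layout (8 SMX units with 16 TMUs each), since the TMU count does not appear in the table.

    # Sanity check of two derived figures from the GTX 680 spec sheet.
    base_clock_mhz = 1006        # Base Clock (MHz)
    memory_rate_gbps = 6.008     # effective memory data rate per pin (6008 MHz)
    bus_width_bits = 256         # Memory Interface Width
    texture_units = 128          # assumed GK104 TMU count (not in the table above)

    # Memory bandwidth: bus width in bytes x effective data rate per pin
    bandwidth_gb_s = (bus_width_bits / 8) * memory_rate_gbps
    print(f"Memory bandwidth: {bandwidth_gb_s:.1f} GB/s")        # ~192.3 (spec sheet rounds to 192.2)

    # Texture fill rate: texture units x base clock
    fill_rate_gtexel_s = texture_units * base_clock_mhz / 1000
    print(f"Texture fill rate: {fill_rate_gtexel_s:.1f} GTexel/s")  # ~128.8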

3 Comments on Everything We Know about the NVIDIA GTX 680

SXO

On March 27, 2012 at 6:17 am

I just want to address one part of this article:
“At GDC 2011, it took three GTX 580s to run Epic’s Samaritan demo (above), consuming 732W of power, generating 2500 BTUs of heat and 51 dBA of noise. This year’s GDC saw a single GTX 680 do the same job for less, consuming only 195W of power, generating 660 BTUs of heat and 46 dBA of noise.”
This is just marketing babble. The truth is the Samaritan demo (Unreal Engine 3.5) doesn’t actually NEED all three GTX 580s. The demo was as yet unoptimized when they first showed it off last year, and neither Epic nor nVidia wanted to take any chances with hitches during its showcase, hence they over-prepared. I believe in its current state, a single GTX 580 could handle the demo just fine using FXAA, but then how would nVidia convince you that you need a GTX 680? Don’t get me wrong, I’m a fan of the 680, but I believe it has enough real strengths to justify it without resorting to marketing half-truths.

The real question is how is Unreal Engine 4.0 going to look and perform when it’s revealed at E3 this year?

Heru

On March 27, 2012 at 10:05 am

C’mon people, if you’ve paid any attention to how this works, you already know what’s gonna happen here. AMD beats Nvidia to market, prices the 79xx high ’cause there’s no competition, Nvidia releases the 680 for slightly less, then within the month AMD drops the 7950/70 to less than the 680 and releases the 7990 for more. It’s gone that exact same way for 3 generations of cards now; don’t see why this will be any different.

CJ Miozzi

On March 27, 2012 at 10:05 am

@SXO:

Thanks for pointing that out! May I ask where you learned that?

I can’t wait to drool over Unreal Engine 4.0… Hell, I can’t wait to play Unreal Tournament 4.