Everything We Know about the NVIDIA GTX 680


Table of Contents

  1. Kepler GPU Architecture
  2. GPU Boost & Adaptive V-Sync
  3. Antialiasing Solutions
  4. SLI & Multiple Monitor Gaming
  5. Benchmarking in Brief
  6. Specifications

Antialiasing Solutions

The currently popular antialiasing solution is MSAA, which stores multiple color and depth samples per pixel and then resolves (down-samples) them into the final image. Those extra samples chew up video memory in proportion to the sample count: 2x MSAA roughly doubles framebuffer usage, 4x MSAA quadruples it, and 8x MSAA octuples it, making the higher modes an impractical solution for most gamers seeking solid performance.
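
To put that scaling in concrete terms, here is a back-of-the-envelope sketch (assuming a 1920x1080 render target with 32-bit color and a 32-bit depth/stencil buffer per sample; real drivers also allocate compressed and auxiliary surfaces, so actual figures will differ):

```python
# Rough framebuffer-memory arithmetic for MSAA sample counts.
# Assumes RGBA8 color (4 bytes) plus D24S8 depth/stencil (4 bytes) per sample.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_SAMPLE = 4 + 4

for samples in (1, 2, 4, 8):
    megabytes = WIDTH * HEIGHT * samples * BYTES_PER_SAMPLE / (1024 ** 2)
    print(f"{samples}x MSAA: ~{megabytes:.0f} MB of color + depth storage")
```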

With the GTX 680, NVIDIA is promoting FXAA, a shader-based technique that locates edges in a frame through contrast detection, then smooths the jagged edges in a post-processing pass. FXAA's ability to smooth edges is comparable to or better than 4x MSAA without the additional memory usage, and it also works on alpha-tested transparency, such as foliage, which MSAA misses. FXAA can be enabled through the NVIDIA Control Panel to work with hundreds of games, regardless of age.
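
For a rough idea of how contrast-based edge detection and smoothing work, the snippet below is a minimal, illustrative approximation of an FXAA-style filter: it converts each pixel to luma, flags pixels whose local contrast exceeds a threshold, and blends those pixels with their neighbors. The function name fxaa_like and the threshold value are invented for this sketch; the real FXAA shader runs on the GPU and uses a more elaborate directional, sub-pixel blend.

```python
import numpy as np

def fxaa_like(rgb, contrast_threshold=0.1):
    """rgb: float array of shape (H, W, 3) with values in [0, 1]."""
    # Perceptual luma, used to detect high-contrast (jagged) pixels.
    luma = rgb @ np.array([0.299, 0.587, 0.114])

    # Contrast = difference between the brightest and darkest of the
    # pixel and its four axis-aligned neighbors.
    padded = np.pad(luma, 1, mode="edge")
    neighbors = np.stack([
        padded[1:-1, 1:-1],   # center
        padded[:-2, 1:-1],    # up
        padded[2:, 1:-1],     # down
        padded[1:-1, :-2],    # left
        padded[1:-1, 2:],     # right
    ])
    contrast = neighbors.max(axis=0) - neighbors.min(axis=0)
    edge_mask = contrast > contrast_threshold

    # Smooth the flagged pixels by averaging them with their neighbors
    # (a crude stand-in for FXAA's directional blending).
    padded_rgb = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    blurred = (padded_rgb[:-2, 1:-1] + padded_rgb[2:, 1:-1] +
               padded_rgb[1:-1, :-2] + padded_rgb[1:-1, 2:] +
               rgb) / 5.0
    out = rgb.copy()
    out[edge_mask] = blurred[edge_mask]
    return out
```

Because a filter like this operates on the finished frame rather than on extra samples, it adds almost no memory cost, which is why driver-level FXAA can be switched on for older titles that never shipped with it.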

If you're looking for sheer quality, the GTX 680's new TXAA technique, which is designed for direct integration into game engines, comes in two flavors. TXAA 1 offers visual quality on par with 8x MSAA at a performance cost comparable to 2x MSAA, while TXAA 2 offers image quality superior to 8x MSAA at a cost comparable to 4x MSAA.

The following games, engines, and developers have committed to offering TXAA support: MechWarrior Online, Secret World, Eve Online, Borderlands 2, Unreal Engine 4, BitSquid, Slant Six Games, and Crytek.


3 Comments on Everything We Know about the NVIDIA GTX 680

SXO

On March 27, 2012 at 6:17 am

I just want to address one part of this article:
“At GDC 2011, it took three GTX 580s to run Epic’s Samaritan demo (above), consuming 732W of power, generating 2500 BTUs of heat and 51 dBA of noise. This year’s GDC saw a single GTX 680 do the same job for less, consuming only 195W of power, generating 660 BTUs of heat and 46 dBA of noise.”
This is just marketing babble. The truth is the Samaritan demo (Unreal Engine 3.5) doesn't actually NEED all three GTX 580s. The demo was as yet unoptimized when they first showed it off last year, and neither Epic nor nVidia wanted to take any chances with hitches during its showcase, hence they over-prepared. I believe that in its current state, a single GTX 580 could handle the demo just fine using FXAA, but then how would nVidia convince you that you need a GTX 680? Don't get me wrong, I'm a fan of the 680, but I believe it has enough real strengths to justify it without resorting to marketing half-truths.

The real question is how is Unreal Engine 4.0 going to look and perform when it’s revealed at E3 this year?

Heru

On March 27, 2012 at 10:05 am

C'mon people, if you've paid any attention to how this works, you already know what's gonna happen here. AMD beats Nvidia to market and prices the 79xx high because there's no competition; Nvidia releases the 680 for slightly less; then, within the month, AMD drops the 7950/70 to less than the 680 and releases the 7990 for more. It's gone that exact same way for three generations of cards now; I don't see why this will be any different.

CJ Miozzi

On March 27, 2012 at 10:05 am

@SXO:

Thanks for pointing that out! May I ask where you learned that?

I can’t wait to drool over Unreal Engine 4.0… Hell, I can’t wait to play Unreal Tournament 4.