Nvidia Interview: The Present and Future of GeForce GPUs
GF: Let’s talk about how graphics can get better, at this point. I think even back in the Super Nintendo era, you’d hear people say “graphics can never get better!” I said that, as a kid: “Street Fighter II is never going to look better.” I think people are saying that now, with the 680. Realistically, how can they get better? Are we down to, say, molecules on cheek hair? How are graphics going to actually improve?
JP: One of the demos that you may have seen here is Epic’s Samaritan demo. This is the state-of-the-art of what we can do today in terms of real-time rendering. It was a demo that was released a year ago at GDC, and at the time, they showed it on three GTX 580s in SLI. This year, we worked with them to show the demo on a single GTX 680. That came from performance improvements in the hardware, but also from some work that we’ve done with them on the software side, delivering new forms of anti-aliasing that achieve the same level of image quality, but much faster.
Looking at the demo, the level of realism and the level of fidelity is really stunning. It shows the peak of where games can go in the next few years, and now we actually have a single card that can deliver that. Last year it felt more like a science project, running on this monster of a machine – now you’ll actually see something like that showing up in games in the next year or two.
Beyond that, one of the other things that we’ve been working on and that we’ve shown some technology demos for is ray-tracing. It’s an alternative form of rendering – traditional rendering is called rasterization – but with ray-tracing, you’re actually tracing the path of light through a scene, as it bounces off all the different objects, and resolves into a final color value for a pixel in the scene. You can create images that are effectively photo-realistic. With our GTX 480 launch, we got that to what I would call “interactive speeds,” which are maybe a frame or two per second. We need to get that to about 30 frames per second, so there’s an order of magnitude of performance needed to make it real-time. But as we continue progressing from a performance perspective, you could see, 5-10 years from now, doing real-time ray-tracing for a full scene, and having a photorealistic image.
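The core idea JP describes – casting a ray through the scene, finding what it hits, and resolving a color for the pixel – can be sketched in a few lines. This is a deliberately tiny illustration (one sphere, one diffuse light, no secondary bounces), not Nvidia’s renderer; all names here are invented for the example:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the nearest t > 0.
    # direction is assumed normalized, so the quadratic's leading term is 1.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    # Trace one ray: on a hit, compute simple diffuse (Lambertian) shading
    # from the surface normal; on a miss, return a flat background color.
    t = ray_sphere_hit(origin, direction, center, radius)
    if t is None:
        return (0.1, 0.1, 0.2)
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = tuple((h - c) / radius for h, c in zip(hit, center))
    intensity = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return (intensity, intensity * 0.8, intensity * 0.6)
```

A real-time ray tracer runs this kind of intersection-and-shade loop for millions of pixels per frame, with many bounces per ray – which is where the order-of-magnitude performance gap JP mentions comes from.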
GF: What’s a practical example of a scene in which that would be happening?
JP: The technology demo, which is actually posted on our website, is called “Design Garage.” We did it with automobiles. We took some really cool automobiles – one of my favorites is a Bugatti – and put them in a showroom. We have the equivalent of a little sun, lighting up the scene, and the reflections off of the shiny materials and the windows are completely realistic.
GF: Which developers are doing the best job unlocking all the PC tools at their disposal?
JP: I mentioned Epic, with their Unreal engine. The Unreal engine is an engine that goes into tons of games; it’s one of the biggest engines in the world. They’re bringing DirectX 11, they’re bringing PhysX, they’re bringing 3D into their engine, and game developers can leverage that technology in all their games. Borderlands 2 from Gearbox, Max Payne 3 from Rockstar, the DICE guys with Battlefield 3 – all these games are incredibly visually stunning. Skyrim looks spectacular – Bethesda always does a good job. We have a large team of engineers dedicated to helping them with their games, and making them look state-of-the-art. That’s a big focus for us.
GF: What about Witcher 2? That game looks phenomenal. I don’t know how they take that to the 360.
JP: Developers can make games that scale, and I think that’s great. You can get a low-end experience with it, and play it on more affordable hardware, and you can have graphics options, fidelity, and PhysX effects that allow you a more immersive experience if you’ve got the hardware.
You have games that can run on a console, but with technology like 3D and PhysX, can deliver a better experience on high-end hardware.
GF: They can, but they don’t, a lot of the time.
JP: It’s true. That’s something for gamers to say that they want, and for us to continue working and investing with developers to help them push forward.
GF: When you built the new 680, did you have environmental- or budget-conscious gamers in mind, in terms of its power consumption?
JP: It was about not having to compromise. I always make the car analogy: you can buy exotic cars – your Ferraris or Bugattis or whatever – and you spend all this money to get the very best, but you have to take it into the shop every other week for maintenance. The gas mileage is horrible. When you buy these enthusiast products, there are often sacrifices or compromises you have to make. But if you’re paying the most, why should you have to compromise? With the 680, you can get fast performance, but still have a card that runs cool, runs quiet, is power-efficient – has good gas mileage, in other words.
GF: When designing new hardware solutions, is it a matter of identifying a technological problem, then designing a fix? Or, rather, is it a matter of finding applications for technological advances that have already occurred?
JP: When we’re designing our next-generation GPUs, we’re talking to the thought-leaders in the industry – game developer thought-leaders – about what they want to do in their next-generation games, and what kind of problems they have, and what they want solved.
We also have a lot of engineers and scientists who are thinking past what the current problems are, into where they see the future going, and we’re trying to solve those problems as well.
With GTX 480, which was our first DirectX 11 GPU, we really wanted to solve the problem of geometric realism in games. You have a lot of low-polygon models, you have characters with pointy heads. You see the geometry in the scene, and it drives you nuts. So we built an architecture designed around being able to process a ton of geometry, and break that geometry up into pieces using a process called tessellation, which was one of the big features of DirectX 11.
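The geometry amplification JP describes can be illustrated with a toy uniform subdivision: each pass splits every triangle into four by connecting edge midpoints, so the triangle count grows as 4^levels. This is only a sketch of the concept; DirectX 11’s actual hardware tessellator operates on patches inside the GPU pipeline and is far more flexible (the function names here are invented for the example):

```python
def midpoint(a, b):
    # Midpoint of two vertices, component-wise
    return tuple((x + y) / 2.0 for x, y in zip(a, b))

def tessellate(triangle, levels):
    # Uniform subdivision: each level replaces every triangle with four
    # smaller ones by connecting its edge midpoints, yielding
    # 4**levels triangles in total.
    tris = [triangle]
    for _ in range(levels):
        next_tris = []
        for (a, b, c) in tris:
            ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
            next_tris += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
        tris = next_tris
    return tris
```

In the DirectX 11 pipeline, a displacement map would then push the new vertices outward to add real surface detail – which is how a low-polygon base mesh avoids the “pointy heads” problem JP mentions.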
A lot of times, we looked at industries like film. What are people doing offline? How does the film industry get these incredibly realistic scenes? What are they doing from a process perspective, and how do we make that real-time?
Whether it’s film, whether it’s game developers, whether it’s our own internal engineers, we’re looking at problems, and how we can solve those in real-time with our next-generation architecture.
GF: Were any of the design decisions for this most recent card made in response to fan suggestions, or problems? Did you take feedback from the community?
JP: Absolutely. With our Fermi architecture, which was our architecture prior to Kepler, we heard two things: “support more displays on a single GPU,” and also “improve your power efficiency.” We heard those things pretty loud and clear from our customers, and the soul of our Kepler architecture is not just performance, but performance per watt.