Rumor: Nvidia Pressured Ubisoft Into Removing DX10.1 from Assassin's Creed
The Nvidia-branded Assassin’s Creed has had trouble running its DX10.1 features on Nvidia cards, while ATI cards seemed to handle the effects just fine. Ubisoft has released a statement saying a forthcoming patch will remove DX10.1 support until the company can “rework its implementation.”
ATI’s Developer Relations team, however, made a statement when the game was introduced: “Ubisoft [is] at the forefront of technology adoption, as showcased with the fantastic Assassin’s Creed title. In this instance our developer relations team worked directly with the developer and found an area of code that could be executed more optimally under DX10.1 operation, thus benefiting the ATI Radeon HD 3000 Series.”
Naturally, rumors have begun springing up that Nvidia pressured Ubisoft into removing DX10.1 support. TG Daily has been investigating, with an anonymous DX10 game developer close to Ubisoft telling them, “Felt you might want to hear this out. Read the explanation and laughed hard … the way how DX10.1 works is to remove excessive passes and kill overhead that happened there. That overhead wasn’t supposed to happen – we all know that DX10.0 screwed AA in the process, and that 10.1 would solve that [issue]. Yet, even with DX10.0, our stuff runs faster on GeForce than on Radeon, but SP1 resolves scaling issues on [Radeon HD 3800] X2.”
Despite the circumstantial evidence it has gathered, TG Daily leaves it to you, the reader, to decide what is really behind all of this. What’s clear is that gamers are the ones losing out: you’re either getting a game that doesn’t function properly (Nvidia owners) or losing functionality because the developer couldn’t iron things out during the development cycle (ATI/AMD owners).
via Evil Avatar