The AMD Argument
AMD launched a public salvo against NVIDIA and its GameWorks program this week. We sat down and talked with BOTH sides of this debate to find the real answers.
Earlier this week, a story was posted on a Forbes.com blog that dove into NVIDIA GameWorks and how it was doing a disservice not just to the latest Ubisoft title Watch_Dogs but to PC gamers in general. Using quotes from AMD directly, the author claims that NVIDIA is actively engaging in methods to prevent game developers from optimizing games for AMD graphics hardware. This is an incredibly bold statement and one that I hope AMD is not making lightly. Here is a quote from the story:
Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products. . . . Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.
The example cited in the Forbes story is the recently released Watch_Dogs title, which appears to show favoritism towards NVIDIA GPUs, with the performance of the GTX 770 ($369) coming close to that of a Radeon R9 290X ($549).
It's evident that Watch Dogs is optimized for Nvidia hardware but it's staggering just how un-optimized it is on AMD hardware.
Watch_Dogs is the latest GameWorks title released this week.
I decided to get in touch with AMD directly to see exactly what stance the company was attempting to take with these kinds of claims. No surprise, AMD was just as forward with me as they appeared to be in the Forbes story originally.
The AMD Stance
Central to AMD’s latest annoyance with the competition is the NVIDIA GameWorks program. First unveiled last October during a press event in Montreal, GameWorks combines several NVIDIA-built engine functions into libraries that can be utilized and accessed by game developers to build advanced features into games. NVIDIA’s website claims that GameWorks is “easy to integrate into games” while also including tutorials and tools to help quickly generate content with the software set. Included in the GameWorks suite are tools like VisualFX, which offers rendering solutions like HBAO+, TXAA, Depth of Field, FaceWorks, HairWorks and more. Physics tools include the obvious, like PhysX, while also adding clothing, destruction, particles and more.
However, there are some major differences with GameWorks compared to previous vendor-supplied code and software examples. First, AMD and several game developers claim that GameWorks is a “black box” with only API calls used to access the GW functionality. The term “black box” is used to indicate that little is known about what is going on inside the GameWorks libraries themselves. This is because GameWorks is provided as a set of libraries, not as a collection of example code (though this is debated by NVIDIA later in our story). Because of its black box status, game developers are unable to diagnose buggy or slow game code when using GameWorks and that can lead to issues with different hardware.
A section of AMD's developer website with example code for download.
You might already be wondering why this is different from something like PhysX. Looking at GPU-accelerated PhysX only, that particular plugin runs ONLY on NVIDIA hardware. Adding it or changing its implementation does not negatively affect the performance of the AMD or non-PhysX code path. Many of the GameWorks toolsets (basically everything except PhysX and TXAA), though, do in fact run on both AMD and NVIDIA hardware if they are implemented by the developer. That means that visual effects and code built directly by NVIDIA are being used on AMD GPUs in GameWorks-enabled titles like Watch_Dogs. You can see immediately why this could raise some eyebrows inside AMD and amongst the most suspicious gamers.
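The distinction drawn above can be sketched in code. This is a minimal hypothetical illustration, not real PhysX or HairWorks API; the vendor check, function names and path strings are all invented.

```cpp
#include <cassert>
#include <string>

// Hypothetical sketch of the two dispatch models described above.
enum class Vendor { Nvidia, Amd };

// GPU-accelerated PhysX runs only on NVIDIA hardware; other vendors
// take a CPU fallback, so the AMD code path is simply left untouched.
std::string physx_path(Vendor v) {
    return v == Vendor::Nvidia ? "gpu-physx" : "cpu-fallback";
}

// A HairWorks-style VisualFX effect has no such gate: the same
// NVIDIA-authored code path executes on both vendors' GPUs, which is
// why its tuning choices matter for AMD performance too.
std::string hairworks_path(Vendor) {
    return "nvidia-authored-shader-path";
}
```

The key point of the sketch is the second function: because it returns the same path regardless of vendor, any performance characteristics baked into that path are inherited by AMD hardware as well.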
This is different than what has been the norm for many years. In the past, both AMD and NVIDIA have posted code examples on their websites to demonstrate new ways of coding shadows, ambient occlusion and other rendering techniques. These could be viewed, edited and lifted by any and all game developers and implemented into their game engine or into middleware applications. GameWorks is taking this quite a bit further by essentially building out a middleware application of its own and licensing it to developers.
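The sample-code-versus-library distinction can also be made concrete. The sketch below is entirely hypothetical; the function names and numbers are invented stand-ins, not actual vendor code.

```cpp
#include <cassert>

// "Sample code" model (the historical norm): the vendor publishes the
// implementation, so a developer (or the other GPU vendor) can read it,
// profile it and re-tune it -- e.g. lower the tap count on hardware
// where the technique runs slowly.
int ssao_sample_code(int taps) {
    return taps * 4;  // stand-in for the real shading work
}

// "Black box" model (GameWorks, as AMD describes it): only this
// prototype ships in a header; the body lives in a precompiled library
// the licensee cannot inspect or edit.
int hbao_blackbox(int taps);

// Shown here only so the sketch is self-contained; in the black-box
// model this definition would be hidden inside the binary library.
int hbao_blackbox(int taps) {
    return taps * 4;  // vendor-chosen work, not retunable per GPU
}
```

Both functions do identical work here; the difference the article describes is purely about who can see and modify that work.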
The obvious concern is that by integrating GameWorks with this “black box” development style, NVIDIA could take the opportunity to artificially deflate performance of AMD graphics cards in favor of GeForce options. That would be bad for AMD, bad for AMD users and bad for the community as a whole; I think we can all agree on that. AMD points to Watch_Dogs, and previous GameWorks titles Batman: Arkham Origins, Call of Duty: Ghosts and Assassin’s Creed IV: Black Flag, as evidence. More interestingly though, AMD was able to cite a specific comparison between its own TressFX hair library and HairWorks, part of the library set of GameWorks. When using TressFX, both AMD and NVIDIA hardware perform nearly identically, even though the code was built and developed by AMD. Using HairWorks though, according to numbers that AMD provided, AMD hardware performs 6-7x slower than comparable NVIDIA hardware. AMD says that because TressFX was publicly posted and could be optimized by developers and by NVIDIA, its solution provides a better result for the gaming community as a whole. Keep in mind this comparison came from AMD's internal testing, not from a real-world game.
NVIDIA's updated Developer site.
AMD has made other claims to support its theory on the negative impact of GameWorks. The fact that NVIDIA’s previously available DirectX 11 code samples were removed from NVIDIA's developer site would be damning, as it would indicate NVIDIA’s desire to move away from them to GameWorks exclusively. But, as we show on the following page, these code samples were not removed, simply relocated into a different section of NVIDIA's developer site.
With all of this information it would be easy to see why stories like the one at Forbes found their way across the Internet. However, as I found out after talking with NVIDIA as well, there is quite a bit more to the story.
AMD blows. They always have driver issues and quality control problems. Plus their new stuff runs hot as hell. Nvidia runs cool, quiet, WITHOUT WC, and their driver teams are superior.
Crossfire was a disaster for AMD and took years to fix.
AMD should stop whining. They were crowing about Mantle for months and weren’t lifting a finger to make it usable on Nvidia hardware.
oh wow, this too, I should have added the whole crossfire garbage.
I feel like I don’t want to analyse anything; just based purely on the past performance of these two companies, AMD (really ATI) and VESA (the Video Electronics Standards Association) …I have zero trust in their capabilities.
Nvidia is far from perfect, but…
Who cares, Watch Dogs is an unoptimised turd. Even when Nvidia had its hand in the game, it needs SLI Titans just to pump out 60fps at ultra @ 1080p. It is pathetic, the game looks like crap compared to other AAA titles, get your shit together Nvidia
Ignoring everything about this, and purely judging on past performance, why should I trust something solely on the fact that it was adopted by the VESA as a standard?
In my opinion that group is incompetent and slow, and has been a failure. So just purely from that standpoint, I would want to try something different… now, about the reality of which of these standards is better or worse, that’s another story…