I feel like every few months I get to write more stories focusing on the exact same subject. It's almost as if nothing else is happening in the enthusiast market, and so the cycle continues, taking all of us with it on a wild ride of arguments and valuable debates. Late last week I started hearing from some of my Twitter followers that there were concerns surrounding the upcoming release of The Witcher 3: Wild Hunt. Then I found a link to this news post over at Overclock3D.net that put some of the information in perspective.
Essentially, The Witcher 3 uses parts of NVIDIA's GameWorks development tools and APIs, software written by NVIDIA to help game developers take advantage of new technologies and implement them into games quickly and easily. The problem, of course, is that GameWorks is written and developed by NVIDIA, which means that optimizations for AMD Radeon hardware are difficult or impossible, depending on who you want to believe. Clearly it doesn't benefit NVIDIA financially to optimize its software for AMD GPUs, though many in the community would like NVIDIA to make a better effort – for the good of said community.
Specifically in regards to The Witcher 3, the game implements NVIDIA HairWorks technology to add realism to many of the creatures of the game world. (Actually, the game includes HairWorks, HBAO+, PhysX, Destruction and Clothing, but our current discussion focuses on HairWorks.) All of the marketing and video surrounding The Witcher 3 has been awesome, and the realistic animal fur simulation has definitely been a part of it. However, it appears that AMD Radeon GPU users are concerned that performance with HairWorks enabled will suffer.
An example of The Witcher 3: Wild Hunt with HairWorks
One of the game's developers has been quoted as saying:
Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.
There are at least three interpretations of this statement floating around the web. The first, and most inflammatory, is that NVIDIA is preventing CD Projekt Red from optimizing HairWorks by withholding source code. Another is that CD Projekt is choosing not to optimize for AMD hardware due to time constraints. The last is that optimizing it simply isn't possible because of hardware limitations when running HairWorks.
I went to NVIDIA with these complaints about HairWorks and Brian Burke gave me this response:
We are not asking game developers do anything unethical.
GameWorks improves the visual quality of games running on GeForce for our customers. It does not impair performance on competing hardware.
Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code.
GameWorks licenses follow standard industry practice. GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license.
The bottom line is AMD’s tessellation performance is not very good and there is not a lot NVIDIA can/should do about it. Using DX11 tessellation has sound technical reasoning behind it, it helps to keep the GPU memory footprint small so multiple characters can use hair and fur at the same time.
I believe it is a resource issue. NVIDIA spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don’t prevent them from working with other IHVs. (See also, Project Cars)
I think gamers want better hair, better fur, better lighting, better shadows and better effects in their games. GameWorks gives them that.
Interesting comments for sure. The essential takeaway is that HairWorks depends heavily on tessellation performance, and we have known since the GTX 680 was released that NVIDIA's architecture performs better than AMD's GCN for tessellation – often by a significant amount. NVIDIA developed its middleware to utilize the strengths of its own GPU technology and, while it's clear that some disagree, not to negatively impact AMD. Did NVIDIA know that would be the case when it was developing the software? Of course it did. Should it have done something to help AMD GPUs fall back more gracefully? Maybe.
Next, I asked Burke directly whether claims that NVIDIA was preventing AMD or the game developer from optimizing HairWorks for other GPUs and platforms were true. I was told that both AMD and CD Projekt had the ability to tune the game, but in different ways. The developer could change the tessellation density based on the specific GPU detected (lower for a Radeon GPU with less tessellation capability, for example), but that would require dedicated engineering from either CD Projekt or AMD. AMD, without access to the source code, should be able to make changes in the driver at the binary level, similar to how most other driver optimizations are built. Burke states that in these instances NVIDIA often sends engineers to work with game developers and that AMD "could have done the same had it chosen to." And again, NVIDIA reiterated that in no way do its agreements with game developers prohibit optimization for AMD GPUs.
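To make the first of those options concrete, here is a minimal sketch of what developer-side, per-vendor tuning could look like. It is an illustration only: the DXGI adapter query and the PCI vendor IDs (0x10DE for NVIDIA, 0x1002 for AMD) are real, but the hairTessellationFactor knob is a hypothetical stand-in for whatever density parameter an engine might expose, not NVIDIA's actual HairWorks API.

// Sketch only (Windows/MSVC). Detect the primary GPU vendor via DXGI and pick a
// hair tessellation density accordingly; the knob itself is hypothetical.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

enum class GpuVendor { Nvidia, Amd, Other };

static GpuVendor DetectPrimaryGpuVendor()
{
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return GpuVendor::Other;

    GpuVendor vendor = GpuVendor::Other;
    IDXGIAdapter1* adapter = nullptr;
    if (factory->EnumAdapters1(0, &adapter) == S_OK)   // adapter 0 = primary GPU
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.VendorId == 0x10DE)      vendor = GpuVendor::Nvidia; // NVIDIA PCI vendor ID
        else if (desc.VendorId == 0x1002) vendor = GpuVendor::Amd;    // AMD PCI vendor ID
        adapter->Release();
    }
    factory->Release();
    return vendor;
}

int main()
{
    // Hypothetical engine setting: maximum tessellation factor for hair/fur splines.
    // DX11 allows factors up to 64; a lower cap trades strand density for speed.
    float hairTessellationFactor = 64.0f;
    if (DetectPrimaryGpuVendor() == GpuVendor::Amd)
        hairTessellationFactor = 16.0f;   // fall back to a density GCN handles more comfortably

    std::printf("Hair tessellation factor: %.0f\n", hairTessellationFactor);
    // A real engine would feed this value into its hair rendering pass rather than print it.
    return 0;
}

The driver-side equivalent is worth noting too: AMD's Catalyst control panel already exposes a global tessellation level override that clamps whatever factor an application requests, which is roughly the kind of binary-level intervention Burke is describing.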
It would also have been possible for AMD to push for the implementation of TressFX alongside HairWorks; a similar scenario played out in Grand Theft Auto V, where several vendor-specific technologies from both NVIDIA and AMD were included and exposed through in-game settings.
NVIDIA has never been accused of being altruistic; it doesn't often create things and then share them with open arms with the rest of the hardware community. But it has to be understood that game developers know this as well – they are not oblivious. CD Projekt knew that HairWorks performance on AMD would be poor but decided to implement the technology into The Witcher 3 anyway. The studio was willing to accept performance penalties for some users to improve the experience of others. You can argue that is not the best choice, but at the very least The Witcher 3 will let you disable the HairWorks feature completely, removing it from the performance debate altogether.
In a perfect world for consumers, NVIDIA and AMD would walk hand-in-hand through the fields, developing hardware and software in tandem and making sure all users get the best possible experience with all games. But that style of cooperation is only helpful (from a business perspective) for the organization attempting to gain market share, not the one with the lead. NVIDIA doesn't have to do it and chooses not to. If you don't want to support that approach, vote with your wallet.
A similar controversy also surrounded the recent release of Project CARS. AMD GPU performance was significantly lower than on comparable NVIDIA GPUs, even though that game does not implement any GameWorks technologies. In that case, the game's developer directly blamed AMD's drivers, saying that a lack of outreach from AMD caused the issues. AMD has since recanted its stance that the performance delta was "deliberate" and says a pending driver update will address gamers' performance issues.
All arguing aside, this game looks amazing. Can we all agree on that?
The only conclusion I can come to from all of this is that if you don't like what NVIDIA is doing, that's your right – and you aren't necessarily wrong. There will be plenty of readers who see the comments made by NVIDIA above and continue to believe that the company is being at best disingenuous and at worst straight-up lying. As I mentioned above in my own comments, NVIDIA is still a for-profit company that is responsible to shareholders for profit and growth. And in today's world that sometimes means working against other companies rather than with them, resulting in impressive new technologies for its customers and pushback from competitors' customers. It's not fun, but that's how it works today.
Fans of AMD will point to G-Sync, GameWorks, CUDA, PhysX, FCAT and even SLI as indications of NVIDIA's negative impact on open PC gaming. I would argue that more users would look at that list and see improvements to PC gaming, progress that helps make gaming on a computer so much better than gaming on a console. The truth likely rests somewhere in the middle; there will always be individuals who immediately side with one company or the other. But it's the much larger group in the middle, the one that shows no corporate allegiance and instead just wants to have as much fun as possible with gaming, that will impact NVIDIA and AMD the most.
So, since I know it will happen anyway, use the comments page below to vent your opinion. But, for the benefit of us all, try to keep it civil!
Gameworks is ebola and HIV infused cancer slowly killing PC gaming. Blinded by your loyalties, some of you can’t see that.
Technically, GayWorks in all actuality is basically “PhysX 2.0” of sorts, so if you wanna blame it then simply say “PhysX sucks”, lol.
Gameworks is not just PhysX, retard. And you wonder why AMD gets such a bad rep? Fools like you.
https://developer.nvidia.com/content/introducing-nvidia-gameworks
Obviously you are the retarded one here. Nice multi-posting, kiddo. Clearly shows your mental age.
FailWorks looks like completely horrendous garbage compared to TressFX 2.0 and up, so I say good riddance. That shit’s crap anyway. TressFX 3.0 is a much better hair/fur/whatever technology, and in combination with the HBM-equipped R9 39x cards it will look great AND perform like a true beast at the same time, not even mentioning future benefits of utilizing Radeons in combination with Mantle/Mantle 2.0, OGL Vulkan and/or DirectX 12. Before anyone bitches that I’m an AMD/Radeon shill, please remember that Geralt 3 runs on DirectX 11 by default, which is a piece of awful trash to begin with, regardless of FailWorks or not, regardless of noVideo’s asinine acts or not. My point is – until Geralt 3 gets an update of sorts for its main API (if I remember correctly, CD Projekt did say at one point that they can add full support for DX12 and OGL Vulkan to Geralt 3 with future patches) we’ll have a badly performing game on our hands either way, regardless of which GPU side you’re using. Let’s wait for the “Enhanced Edition”, I say.
Enjoy your “Hoping & Dreaming” like all the ignorant, stupid AMD gamers out there. You’re the minority, and no company would invest in it or in you, because you are less than 25% of the GPU market share.
As of now, enjoy your sub-par PC gaming experience 😉
Excuse me, did you say something? Because all I’ve heard is just butthurt shiller’s yappity-yap-yapping.
LOL you must be 10 for writing that rubbish shit
Obvious AMD fanboy, lol. FYI, TressFX, no matter what version, looks unrealistic and ugly compared to HairWorks. You are too blind to see it; you only see the Kool-Aid.
Said the NVidia Kool-Aid drinking fanboy. And the fun thing is, NVidia is in fact poisoning the market, and everyone is buying their crap.
Just ignore him. He’s a typical shilling sheep of a noVideot.
^ #amdgamersarestupid
Enjoy your subpar AMD crap while the rest of the world enjoys their gaming experience. 75%+ market share says it all, ignorant fanboy. #amdgamersarestupid
I prefer #geforcemasterrace and I hate this whole hashtagging business 🙂
A good businessman makes an effort to produce a product that is perfect in the intended environment, and usable in others if possible.
What does Nvidia do?
Remember Crysis 2 with its sea of waves below ground level?
Just saying. 😛
Yes, this is related to it. AMD still has poor tessellation performance, apparently, and in this case it is for much more noticeable effects. But it’s not like they didn’t know about it or try to address it, because Tonga (as bad as it is overall) has greatly improved tessellation over Tahiti.
Ryan running to defend his boss :love:
The only thing more damaging to the industry than nVidia is your site and the others like it.
As an AMD fan (using a Sempron right now to write this), I am happy that PCPer is covering the story at all, as I would not have known about it otherwise. I found the article pretty unbiased because it invites the reader to draw his own conclusions.
Ryan sums it up accurately:
I know there’s been a lot of QQ on GameWorks lately, so thanks for covering this PCPer. Let’s look at the facts:
1) Are GameWorks games better than console versions for anyone that chooses to enable these features? Yes.
2) Can GameWorks effects be disabled, so that these games perform as well as or better than the console versions? Yes.
3) Is AMD currently struggling financially, making it less likely for them to be able to dedicate resources to game-specific optimizations? Yes.
4) Has AMD tried to do the same with their own initiatives like Mantle and TressFX? Yes.
5) Should Nvidia and AMD continue to invest in resources to make their products more exciting and competitive on the marketplace? Yes.
To me, GameWorks is just another of the many reasons I prefer Nvidia over AMD. Sorry, I get a better game, with better features and better driver support on Day 1. AMD users can sit on their value brand option and decision-making all they like but crying about the premium offering performing better with features they developed just isn’t going to garner a lot of sympathy in my eyes.
I don’t know where everyone is getting this “value brand” BS from. If you are in the middle of a release cycle, everyone is the value brand, that’s how it works. Once you release a new card that trumps the other guy’s stuff then you are no longer the value supplier.
If your decision wasn’t based on value, then I’m not sure why anyone would go with AMD at all, sorry. They are just so far behind when it comes to features and support, and this whole GameWorks dust-up is just another example.
None of this matters! Hairworks is an obsolete DX11 library. When the game is ported to DX12 this goes away.
Rubbish comment, of course. Like any middleware library, it can and will be updated for DX12; just because the API is upgraded doesn’t mean it has the tools that the GameWorks library provides devs to more easily implement these features.
All of this goes away with DX12. IP Libraries solved basic inefficiencies of DX11. Game studios will now have to code objects and Textures to DX12 instructions and there will be no place for incompatible libraries.
If Witcher is DX11 ONLY then it sucks already.
More rubbish from typical AMD hope merchants. Deflecting from the current state and situation with big future promises is exactly the kind of problem AMD and their fanboys perpetuate, because nothing comes of these fantasies.
You’re not doing anyone any favors, and most certainly not yourself. In the meantime, GameWorks is relevant today and makes PC gaming better for everyone (especially Nvidia users).
“AMD GPU performance was significantly lower than comparable NVIDIA GPUs, even though this game does not implement any GameWorks technologies. In that case, the game’s developer directly blamed AMD’s drivers, saying that it was a lack of reaching out from AMD that caused the issues.”
I’m sorry but AMD is supposed to reach out and hold your hand? No, that’s not how it works. If you want your game to work on as many platforms as possible you reach out to AMD and if you are ignored then you can complain, but not before then.
If it’s from a developer who is releasing a big title I’d say yes. AMD doesn’t have to respond to everyone, or anyone. However, cherry picking the biggest releases is just a smart move.
Even as an AMD fan I side with Ryan on this. It is AMD’s fault for not getting their foot in the door. The hair would be nice but I don’t really care that much. I do wish AMD would have worked to get TressFX in there.
Hairworks is a DX11 workaround. DX12 renders most of nVidia’s IP libraries obsolete.
If Witcher is not already DX12 capable then it sucks already. Hair is nothing more than a texture.
DX12 will make everything work. Why? DX12 is Mantle in a Microsoft Windows 10 wrapper. It is the great equalizer.
DX11 was written to keep Intel and nVidia products competitive with AMD. By dumbing down all GPUs, no single dGPU or IGP maker would shine.
DX12 and Mantle are Intel’s greatest nightmare. CPU single-core performance does not matter anymore; DX12 and Mantle unlock multithreaded and multicore scaling regardless of how the game is coded. If the game runs with DX12 the performance will fly, as all CPU cores will process the game code for the GPU.
nVidia IP libraries are a thing of the past. Now those libraries must work with an AMD-developed API.
the only problem here is CDPR didn’t want to spend time implementing two hair simulation solutions. maybe they chose NVidia because:
1) They have a huge market share advantage (more gamers will see the end result);
2) NVidia sent more developers to work on the game than AMD did (which sounds like none).
if CDPR or AMD had the funding to implement tressfx, they probably would have.
Checking out the early benchmarks what bothers me most as a trend in recent Gameworks titles is how Kepler performance has tanked so badly.
The Witcher 3 benches at pcgameshardware.de show the GTX 780 being beaten by the GTX 960. That’s a $650 card on release with 7 billion transistors (cut down) getting beaten by a $199 card with under 3 billion. You may be thinking, wow, isn’t Maxwell great, and it is, but there’s no way in hell it’s that much faster than Kepler, or shouldn’t be. Shady stuff to sell more cards? The trend is far from limited to that game either; check out Project CARS as just one more example.
This. The Nvidiots don’t want to see this. Nvidia’s practices are hurting their cards as well with subtle (or not so subtle) planned obsolescence. As long as their libraries are closed source, they can always play innocent, but the benchmarks speak for themselves. I can’t wait for DX12/Vulkan to level the playing field.
Why is this so hard to comprehend? Do we look at the 7800GTX and wonder in amazement how the 8800GT beat it handily just a few years later?
Kepler was great for its time, but Nvidia designs hardware to perform best in the games it thinks will be most relevant at that time. Maxwell had a number of significant improvements, so if Nvidia addressed some of the main areas where Kepler was deficient, like DirectCompute, should it really be a huge surprise that newer hardware closes the gap on older hardware, even if they are in different weight classes?
Let’s look at some other recent games like GTA5, where Kepler does just fine.
Are you going to wonder in similar amazement to see the 285 (Tonga), with its much improved tessellation hardware, beating out the 290 in heavily tessellated scenes?
7800GTX (110nm) vs 8800GT (65nm): just looking at the numbering scheme should be enough to tell you why that example is in no way comparable to a GTX 960 beating a GTX 780, especially considering those two are both on the same 28nm node. Not hard to comprehend?
When you say Kepler was great “for its time,” the thing is its time was pretty recently. Titan “classic,” for example, a $1000 card, was only released this month in 2013. It barely beats the GTX 960 in this title. In the benchmark I read, it had identical minimums and only ~1fps higher averages.
Ignoring that, my point was that Kepler is only showing such poor relative performance in GameWorks titles. Look at more vendor-agnostic titles, like GTA V as you said, and Kepler is much closer to where it “should” be in the product stack.
Why would I wonder in amazement? That again is not a comparable situation to what I was discussing here.
Honestly, you could go down to a 9600GT and the comparison would fit just fine. 9600GT was a beast of a performer for its time and beat the 7800GTX handily as well.
Point is if you redesign hardware to excel at certain new features, you shouldn’t expect older hardware to perform as well when a game makes heavy use of those features.
Look back at the Maxwell review and you will see Nvidia made a lot of significant changes to their SMs that could theoretically improve throughput significantly, especially fewer shared resources for each SM which would drastically help DirectCompute throughput.
http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/2
Titan classic isn’t a great example because it was launched a year after Kepler debuted in 2012, but that was just Nvidia dragging its feet, compared to launching the Maxwell stack front and center with the 970/980 and the full GM200 Titan X just 4-5 months later.
We can see that, relatively speaking, Nvidia does achieve nearly 2x the performance over the predecessor part, so while a 3B transistor part coming close to a 7B transistor part is quite a bit more than you would expect, it’s not a complete shock if newer games are using specific features Kepler was not well suited for.
We already saw Kepler wasn’t that great for DirectCompute anyways, as it lagged behind AMD parts in games that made heavy use of DC for global illumination. Similarly, we see AMD cards lagging behind badly in this game and others due to deficient tessellation performance.
People arguing that nVidia and/or CDPR should force development for AMD chips alongside nVidia chips are essentially mad at having backed the wrong horse.
This is no different from the way Playstation and Xbox fanboys fight back and forth about the merits of their respective platforms.
If those features matter to you, either wait for more powerful AMD cards or switch to nVidia platforms. If not – not. But either way, don’t cry.
Look up No BS podcast from Maximum PC.
Tom Petersen and Rev Lebaredian flat out say they send out people to help integrate GameWorks code into the engine and that AMD is not privy to that.
I don’t know how clear cut it is.
It also doesn’t help when those two say “show us proof of a contract” and minutes later say that providing a GameWorks contract isn’t possible because of legal issues.
We won’t know the real answer until someone breaks rank.
Gameworks is cancer
The game is being released today. I would recommend waiting for the real differences before making any outcries.
Why did you feel the need to include a 22 MB gif in this article?
This, PCPER wtf?
So sick of the bs and lack of standardization. Nobody can tell me that Nvidia is putting the customer first, nor is AMD, but they seem to be less piggy about it. If these corps were putting us first, they would be moving toward fundamental standardization, not this VHS/Betamax crap all over again.
No BS 226: Interview with AMD Graphics Guru Richard Huddy: https://youtu.be/fZGV5z8YFM8
Nvidia Responds to AMD’s Cheating Allegations (No BS Podcast 229): https://youtu.be/aG2kIUerD4c
Claims that Nvidia intentionally cripples performance on its older generation cards:
https://forums.geforce.com/default/topic/806331/nvidia-intentionally-cripples-the-older-generation-kepler-video-cards-through-nvidia-experience-/
ludiqpich198 wrote:
“Before driver/geforce experience update:
http://i.imgur.com/w0PhEeQ.png
After:
http://i.imgur.com/o4vCtEj.png
gpu lost on
directx 9 simple -18 percent performance
directx 10 -23 percent performance
directx 11 -8 percent performance
a whole average 16 percent performance have been lost just so you can show your customers how much better your new generation is
STOP IT NVIDIA STOP CRIPPLING YOUR OLDER GENERATIONS STOP”