I feel like every few months I get to write more stories focusing on the exact same subject. It's almost as if nothing in the enthusiast market is happening and thus the cycle continues, taking all of us with it on a wild ride of arguments and valuable debates. Late last week I started hearing from some of my Twitter followers that there were concerns surrounding the upcoming release of The Witcher 3: Wild Hunt. Then I found a link to this news post over at Overclock3d.net that put some of the information in perspective.
Essentially, The Witcher 3 uses parts of NVIDIA's GameWorks development tools and APIs, software written by NVIDIA to help game developers take advantage of new technologies and quickly and easily implement them into games. The problem, of course, is that GameWorks is written and developed by NVIDIA. That means that optimizations for AMD Radeon hardware are difficult or impossible, depending on who you want to believe. Clearly it doesn't benefit NVIDIA financially to optimize its software for AMD GPUs, though many in the community would like NVIDIA to make a better effort – for the good of said community.
Specifically with regard to The Witcher 3, the game implements NVIDIA HairWorks technology to add realism to many of the creatures of the game world. (Actually, the game includes HairWorks, HBAO+, PhysX, Destruction and Clothing, but our current discussion focuses on HairWorks.) All of the marketing and videos surrounding The Witcher 3 have been awesome, and the realistic animal fur simulation has definitely been a part of it. However, it appears that AMD Radeon GPU users are concerned that performance with HairWorks enabled will suffer.
An example of The Witcher 3: Wild Hunt with HairWorks
One of the game's developers has been quoted as saying:
Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.
There are several interpretations of this statement floating around the web. First, and most inflammatory, is that NVIDIA is preventing CD Projekt Red from optimizing HairWorks by not offering source code. Another is that CD Projekt is choosing not to optimize for AMD hardware due to time considerations. The last is that optimization simply isn't possible because of the hardware limitations HairWorks runs into on AMD GPUs.
I went to NVIDIA with these complaints about HairWorks and Brian Burke gave me this response:
We are not asking game developers to do anything unethical.
GameWorks improves the visual quality of games running on GeForce for our customers. It does not impair performance on competing hardware.
Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code.
GameWorks licenses follow standard industry practice. GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license.
The bottom line is AMD’s tessellation performance is not very good and there is not a lot NVIDIA can/should do about it. Using DX11 tessellation has sound technical reasoning behind it: it helps to keep the GPU memory footprint small so multiple characters can use hair and fur at the same time.
I believe it is a resource issue. NVIDIA spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don’t prevent them from working with other IHVs. (See also, Project Cars)
I think gamers want better hair, better fur, better lighting, better shadows and better effects in their games. GameWorks gives them that.
Interesting comments for sure. The essential takeaway from this is that HairWorks depends heavily on tessellation performance, and we have known since the GTX 680 was released that NVIDIA's architecture performs better than AMD's GCN for tessellation – often by a significant amount. NVIDIA developed its middleware to utilize the strengths of its own GPU technology and not, though it's clear that some disagree, to negatively impact AMD. Did NVIDIA know that would be the case when it was developing the software? Of course it did. Should it have done something to help AMD GPUs fall back more gracefully? Maybe.
Next, I asked Burke directly if claims that NVIDIA was preventing AMD or the game developer from optimizing HairWorks for other GPUs and platforms were true. I was told that both AMD and CD Projekt had the ability to tune the game, but in different ways. The developer could change the tessellation density based on the specific GPU detected (lower for a Radeon GPU with less tessellation capability, for example), but that would require dedicated engineering from either CD Projekt or AMD. AMD, without access to the source code, should be able to make changes in the driver at the binary level, similar to how most other driver optimizations are built. Burke said that in these instances NVIDIA often sends engineers to work with game developers and that AMD "could have done the same had it chosen to." And again, NVIDIA reiterated that in no way do its agreements with game developers prohibit optimization for AMD GPUs.
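To make the per-GPU fallback Burke describes a bit more concrete, here is a minimal sketch of vendor-based tessellation capping. This is not CD Projekt's or NVIDIA's actual code: the PCI vendor IDs are the standard ones, but the function name and the specific tessellation caps are invented for illustration, and a real engine would read the vendor ID from DXGI_ADAPTER_DESC::VendorId rather than hard-coding it.

```cpp
// Hypothetical sketch of detecting the GPU vendor and clamping the
// tessellation factor requested for HairWorks-style hair/fur geometry.
#include <algorithm>
#include <cstdint>
#include <iostream>

constexpr uint32_t kVendorNvidia = 0x10DE;  // standard PCI vendor ID for NVIDIA
constexpr uint32_t kVendorAmd    = 0x1002;  // standard PCI vendor ID for AMD

// Maximum tessellation factor the renderer should request.
// The caps (64 / 16 / 8) are illustrative only, not shipping values.
float MaxHairTessFactor(uint32_t pciVendorId) {
    switch (pciVendorId) {
        case kVendorNvidia: return 64.0f;  // full density on hardware strong at tessellation
        case kVendorAmd:    return 16.0f;  // reduced density as a graceful fallback
        default:            return 8.0f;   // conservative default for unknown GPUs
    }
}

int main() {
    // In a real engine the vendor ID would come from the DXGI adapter description;
    // it is hard-coded here so the sketch stays self-contained.
    const uint32_t detectedVendor = kVendorAmd;
    const float requested = 64.0f;  // density the artists authored
    const float used = std::min(requested, MaxHairTessFactor(detectedVendor));
    std::cout << "Tessellation factor used: " << used << "\n";
}
```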
It would also have been possible for AMD to push for the implementation of TressFX in addition to HairWorks; a similar scenario played out in Grand Theft Auto V, where several vendor-specific technologies from both NVIDIA and AMD were included and could be customized through in-game settings.
NVIDIA has never been accused of being altruistic; it doesn't often create things and then share them with open arms with the rest of the hardware community. But it has to be understood that game developers know this as well – they are not oblivious. CD Projekt knew that HairWorks performance on AMD would be poor but decided to implement the technology into The Witcher 3 anyway. They were willing to accept performance penalties for some users to improve the experience of others. You can argue that is not the best choice, but at the very least The Witcher 3 will let you disable the HairWorks feature completely, removing it from the performance debate altogether.
In a perfect world for consumers, NVIDIA and AMD would walk hand-in-hand through the fields and develop hardware and software in tandem, making sure all users get the best possible experience with all games. But that style of work is only helpful (from a business perspective) for the organization attempting to gain market share, not the one with the lead. NVIDIA doesn't have to do it and chooses not to. If you don't want to support that style, vote with your wallet.
Another similar controversy surrounded the recent release of Project Cars. AMD GPU performance was significantly lower than comparable NVIDIA GPUs, even though this game does not implement any GameWorks technologies. In that case, the game's developer directly blamed AMD's drivers, saying that it was a lack of outreach from AMD that caused the issues. AMD has since recanted its stance that the performance delta was "deliberate" and says a pending driver update will address gamers' performance issues.
All arguing aside, this game looks amazing. Can we all agree on that?
The only conclusion I can come to from all of this is that if you don't like what NVIDIA is doing, that's your right – and you aren't necessarily wrong. There will be plenty of readers who see the comments made by NVIDIA above and continue to believe that the company is at best being disingenuous and at worst straight-up lying. As I mentioned above in my own comments, NVIDIA is still a for-profit company that is responsible to shareholders for profit and growth. And in today's world that sometimes means working against other companies rather than with them, resulting in impressive new technologies for its customers and pushback from competitors' customers. It's not fun, but that's how it works today.
Fans of AMD will point to G-Sync, GameWorks, CUDA, PhysX, FCAT and even SLI as indications of NVIDIA's negative impact on open PC gaming. I would argue that more users would look at that list and see improvements to PC gaming, progress that helps make gaming on a computer so much better than gaming on a console. The truth likely rests somewhere in the middle; there will always be those individuals that immediately side with one company or the other. But it's the much larger group in the middle, that shows no corporate allegiance and instead just wants to have as much fun as possible with gaming, that will impact NVIDIA and AMD the most.
So, since I know it will happen anyway, use the comments page below to vent your opinion. But, for the benefit of us all, try to keep it civil!
AMD has become irrelevant in the x86 market, it seems. Competing against Intel is a monumental task. AMD has had every opportunity to stay on level ground in the GPU market. Nvidia is not Intel; there is no reason why they cannot compete with Nvidia and thrive. Poor management, beginning with Hector Ruiz, not Nvidia, is to blame. Unlike AMD, Nvidia, notwithstanding all its hyperbole about its other ventures, focused on the GPU and its importance in computing going forward. Any success Nvidia has had in the non-gaming segments is because of GPU tech they use in those endeavors.
No longer matters.
When you own 75% of the discrete GPU industry you can do whatever you want. Developers are going to go where the market goes. AMD needs to get their shit together.
I’m so tired of all this crying. NO ONE IS FORCING ANYONE TO USE GAMEWORKS!!! IF IT RUNS LIKE CRAP, DON’T FREAKING ENABLE IT!
Or ask AMD to create their own version, and get off nVIDIA’s coat tails. I’m so freaking tired of these AMD gimps that have this gimme gimme gimme complex. They demand to pay the least amount of money for hardware, and demand the competition to spend their time and money and give them features to make their gaming experience better – for FREE!
Last year Huddy said Mantle could be open to all, but says not now because it’s a closed beta that only runs on AMD’s GCN hardware he continually says is better. Then what happened? AMD handed Mantle to Khronos.
Huddy says FreeSync can go as low as 9Hz, but current monitors have a minimum of 40Hz.
Huddy says there is no license fees or added cost to the BOM to FreeSync displays, yet their first 1440p monitor (BenQ) has an MSRP of $799us.
But, but, but people say if AMD goes away nVIDIA will raise the shit out of their prices. Really? Is that what Intel did with their CPU’s? Is that what Windows did with their OS? NO! Because if it were to happen, consumers would speak with their wallets and guess what would happen? Prices would DROP!
#amdgamersarestupid
Well said +1
So in your world will people turning off gameworks be able to pay less for the game?
Looking beyond the nvidia-amd spat, the people who are really to blame are the developers; they’re choosing to deliver a sub-par experience to part of their customer base.
Or they’re delivering a premium experience to those that have hardware with support. Using this logic you could say the game should cost less because the card you bought in 2004 can’t run the game very well or at all.
This hair feature is not required to enjoy the game on any level. It’s a visual plus. Stop laying around being entitled and butt hurt.
+1, AMD fans need to stop making excuses for AMD and accepting their explanations and simply demand better support.
What needs to happen with AMD is they need to stop trying to poorly feature match Nvidia, rein it in and at least come through with what they’ve already promised to-date. Then go from there.
I don't like what Nvidia is doing; it can only end with titles that are crippled on one brand, if AMD chooses to do the same thing, or a monopoly for Nvidia if AMD ends up not being able to compete. That's why I will be voting with my wallet and my next GPU will be an AMD one.
If the tessellation performance on AMD cards didn't suck as much as it does, we probably wouldn't even be talking about this. AMD fans would nitpick another issue to be pissy about.
True, tessellation is not great on most AMD cards, but on the latest one it actually improved a fair bit. Your comment(s) make it sound like it's bad on all cards and never going to improve, which is not true at all. Being first to release a technology doesn't automatically make the competitors bad or worse.
That being said, your second sentence describes you exactly, here you are again being nitpicky and pissy about AMD cards as usual.
So GCN sucks at tessellation. There are also a few games where GCN performs better than Maxwell, so what?
TressFX DirectCompute is faster than HairWorks CUDA tessellation:
http://wccftech.com/tressfx-hair-20-detailed-improved-visuals-performance-multiplatform-support/
http://cdn3.wccftech.com/wp-content/uploads/2014/09/tfx_tr_perf.png
It also doesn’t look as good, but if AMD is willing to dedicate the man hours to implementing it into games, I will be happy to use it over the competing alternative of vanilla console mode.
I’m certainly not going to cry about it, like AMD fans seem to love to do.
Another similar controversy surrounded the recent release of Project Cars. AMD GPU performance was significantly lower than comparable NVIDIA GPUs, even though this game does not implement any GameWorks technologies.
Yes it does.
Turbulence, PhysX Particles
In addition, I love how no blame is falling on the publishers and developers….
Is the feature available on AMD cards?
Anyway, not siding with NVDA or the game dev, but the game dev specifically mentioned a lack of cooperation from AMD, and so far AMD has not said anything to deny that. If that's not really the case, AMD will have to say something, like what happened with Arkham Asylum. People want the game to work well on both vendors' hardware, but people always disregard AMD's financial situation. When there are problems with a certain hardware setup, people are quick to blame the game dev first.
“In addition, I love how no blame is falling on the publishers and developers….”
Pretty much this. It's up to the dev to decide if they want to implement a proprietary technology, and one that possibly favours one competitor over the other. If performance was taking such a major hit on AMD and non-9xx Nvidia cards, then it should never have been implemented. CDPR must be thanking the gods for the current smokescreen that is up. The only saving grace is that you CAN turn it off if you don't need it.
As an Nvidia user for the limited time (~3 years) I’ve been a PC gamer, the value adds you get with Nvidia just make it the smart choice to new GPU buyers. GameWorks, driver quality, G-Sync, power efficiency, DSR, etc. are all nice features that improve the PC gaming experience. If Nvidia is able to outcompete AMD on these fronts, it only makes sense that gamers would start choosing Nvidia over AMD.
I really hope that AMD is able to match Nvidia in their implementation of similar technologies in addition to matching their GPU performance in the future. Competition is good for everyone. AMD just needs to step up their game to match Nvidia.
Easier said than done, especially when they are becoming more and more of a minority and have a tight budget.
GameWorks, driver quality, G-Sync, power efficiency, DSR, etc.
Radeon SDK (which is not black-boxed), driver quality is excellent with Omega, FreeSync (a standard, cheaper and adoptable by anyone), VSR (better quality according to reviews), etc.
But all worse than the competing Nvidia option, which doesn’t cost that much more to begin with. In any case, you’ve made your decision and now you get to stand on the outside looking in.
So another "fanfair" between red and green boys :p
The source code argument is crap since it is not software that they are selling. We look for the hardware to purchase. If there are really no hindrances, then they can release the code just to satisfy the gaming community.
If we are to believe AMD's Roy (and I do), nVidia pays game developers millions to sign their GameWorks contracts. So which game dev will turn that down? Not many!
On Project Cars:
http://www.reddit.com/r/pcmasterrace/comments/367qav/mark_my_word_if_we_dont_stop_the_nvidia_gameworks/
TLDR: Project Cars has PhysX baked in to the game engine. Even nVidia users who turn their PhysX OFF see performance tank. This points to both poor engine design and conspicuous relationship with nVidia.
On Witcher 3: AMD I feel isn’t in game dev’s faces enough, despite all their work around Mantle, they’re still lacking. There’s no reason why they can’t make sure CDPR put in TressFX for example or Mantle support, it’s worth throwing money and some AMD engineers at it for such a title.
There's going to be an AMD Witcher driver released next week, but it takes someone asking the question via Twitter to know that, and it'll probably be released some days after launch. That says it all. They need to be ALL OVER new high-end PC titles. Instead they're busy hyping up the early-access game DiRT Rally. Really.
AMD did the same thing while they still had funds with BF4 and their continued EA/Mantle support deal. They rolled the DICE (literally, pun intended) and they failed.
Crying about it now is just sour grapes/sore losing at its best.
This is not uncommon, and not at all unethical; it is part of the competitive tools in their toolbox. They develop a technology that benefits their product and they want game developers to incorporate it, thereby increasing demand for their hardware. Game developers have costs associated with performing this task, so they ask Nvidia to help with development funds.
AMD paid millions for game developers to re-code their engines for Mantle when it was the hot topic. In your world you need to be just as critical of AMD, otherwise you are being hypocritical.
Fanboys,….. sheesh.
Ryan Shrout and Nvidia Perspective users to the rescue.
Pfft, AMD has crappy DX11 tessellation performance, so it's all the game devs' fault for that.
It's the game developers' fault for choosing to use technology that doesn't work for part of their customer base.
nvidia users would reasonably have the same reaction if a game needlessly relied upon integer math (see coin mining) where AMD offers significantly better performance.
Except it does work, it just doesn’t run optimally on AMD hardware. You think it would be better to remove the feature and take away that choice to make the game better overall? That’s an interesting take. As unremarkable as AMD’s similar attempts have been with TressFX, I still appreciate them over the vanilla console version.
Also, the fact Nvidia commands an overwhelming majority of their user base probably makes it easier for them to take risks and implement features they know will run well on that hardware.
In actuality it runs pretty sub-optimally on Nvidia hardware as well. Even a fairly solid Nvidia card like the 970 goes from playable to not at 1080p. You basically need a Titan X to hold decent frame-rates with Hairworks enabled.
So a true next-gen title to push the limits of PC gaming, what’s the problem again?
PeasantStation 4 and HoboBox One version are that way ——————->
I would be totally behind that in principle but it really doesn’t push the boundaries graphically in my opinion. And certainly doesn’t justify spending $1000 just for some hair effects.
Well, that is your opinion followed by what you would subjectively pay. Oh and don’t forget, if you don’t like it you can just turn it off. 🙂
Personally I think it looks great and any features that enable more graphical fidelity and effects is welcome over the bland console version.
I am sure you will just complain in a week about the next crappy console port, but don't worry, that won't be completely contradictory, or anything!
Pushes the limits? It's clearly been consolized, judging from the videos and screenshots out now compared to 2013. Nowhere near the earlier vids because they downgraded for the consoles. The devs admitted that. Fair enough. The only thing pushing the limits is "hairworks". Whoopee.
They downgraded and you still have PC havenots crying about some features being too performance expensive, imagine if they didn’t tone it down, you’d have PC gamers crying even louder than they are already. In any case, the game still looks great, and will be one of the best looking games on the PC without a doubt, so yes, pushing the limits.
Funny how everyone says nvidia should stick with using standards. Guess what, DX11 tessellation IS A STANDARD. So they did what AMD fans have been whining for, and yet they still get attacked for doing it. Pretty sad.
Someone posted some results, I would guess from a review copy.
Notice how the R9 285 suffered half the performance drop the 290X did using HairWorks. The 285 has 50% FASTER tessellation performance... the theory that bad tessellation performance on AMD's part is the cause seems to be confirmed.
Witcher 3, review version 1.02, 1080p, Ultra details, in-game AA, Catalyst 15.4.1 Beta, GeForce 352.86 WHQL.
R9 290X HairWorks enabled, percentage drop in minimums = 37.5%
R9 285 HairWorks enabled, percentage drop in minimums = 30.2%
GTX 970 HairWorks enabled, percentage drop in minimums = 21.9%
GTX 770 HairWorks enabled, percentage drop in minimums = 17.7%
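(For reference, the "percentage drop in minimums" figures above are just the relative fall in minimum frame rate with HairWorks on versus off. A trivial sketch of the arithmetic, using made-up FPS numbers rather than the review's actual measurements:)

```cpp
// Minimal sketch of the "percentage drop in minimums" arithmetic.
// The FPS values below are hypothetical, chosen only to illustrate the formula.
#include <iostream>

int main() {
    const double minFpsHairWorksOff = 48.0;  // hypothetical minimum FPS, HairWorks disabled
    const double minFpsHairWorksOn  = 30.0;  // hypothetical minimum FPS, HairWorks enabled
    const double dropPercent =
        (minFpsHairWorksOff - minFpsHairWorksOn) / minFpsHairWorksOff * 100.0;
    std::cout << "Drop in minimums: " << dropPercent << "%\n";  // prints 37.5
}
```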
Or it could be the fact (and CD Projekt Red have confirmed) that they were not able to do any optimization for AMD cards because the contract with nvidia says they can't. And the tessellation the hair uses is called line tessellation, which makes no graphical difference from normal tessellation but takes up 40% of the AMD cards while only 10% on the nvidia ones; they are making the devs put it in to make nvidia cards look better. The same line tessellation was in CoD Ghosts and it had the same outcome.
Any AMD or Nvidia GPU I've owned performs poorly with just about any GameWorks features. I have a Titan X and I can't enable any features in Far Cry 4 because it tanks the framerate, and up until recently every feature other than enhanced godrays was broken anyway. I play at 1440 though, so the experience at 1080p may be slightly different. In my opinion these features aren't really worth getting upset over because Nvidia's own products perform poorly with most of these features. Many of them give an almost unnoticeable improvement to image quality. Nvidia push these features to extend the load on graphics hardware, giving some people the idea that they need to upgrade to the newest Nvidia GPU to get the true ultra experience. In reality the features are probably intentionally inefficient in the rendering pipeline. Of course I am a somewhat cynical person, but Nvidia are in the business of making money so my theory can't be completely dismissed.
I’m pretty sure physx is causing performance tanking w/ radeon GPUs in project cars.
It’s fairly well known that physx on the CPU uses a highly unoptimized x87 code path and slows everything down (which allows nV to make the claim that their GPUs are great for physics; in reality this isn’t so).
Basically crippling other vendors, no real excuse there.
Sadly for AMD, there is no fix for project cars. They will struggle and waste their time.
It's kind of a joke because they spent months optimizing for PS4 and XBone, which both have an AMD GPU, but somehow when it comes to PC the developers are confused once they implement GameWorks?
XB1 and PS4 don’t have to deal with AMD’s abysmal drivers lol. Mantle copied those low level APIs, but it failed, so you are stuck with AMD’s poor performing DX11 drivers. Things should improve with DX12, but until then, anything CPU intensive is going to hurt on an AMD GPU.
lol, another person that doesn't really understand PhysX. x87 or not, the CPU portion of PhysX will run on the CPU regardless of whether you have an nvidia GPU or not. I don't know why some people believe that PhysX will run entirely on the GPU if you have an nvidia GPU. If that were the case, any Unreal Engine 3 based game would perform very badly on the 360 and PS3.
Yes, but on a non-nVidia GPU, the GPU portion of PhysX will have to run on the CPU as well.
Simply not the case at all in games that have GPU PhysX. The only exception is probably Borderlands 2.
You mean PhysX works on AMD GPUs?!
Simply not the case at all in games that have GPU PhysX. The only exception is probably Borderlands 2.
Does the game work well on Intel GPUs?
Works great at 320×240! just look at that sweet VGA image! 🙂
Why don’t AMD make their own GameWorks, HairWorks, HBAO+, PhysX, Destruction and Clothing. And make it more open.
Just like Freesync vs G-sync.
AMD created Mantle when APIs needed to get better.
What I mean is that they have the manpower and skills to make it happen.
And it would make AMD the good guys and shame Nvidia into doing the right thing. Share their code and help the community
And then AMD’s TrueAudio could be open so everyone could use it.
Funny how you mention Mantle, which was closed source and NEVER was opened up like they claimed it would be.
It never was able to *harm* nV users; it just benefited GCN users and hurried the other software houses toward DX12/OGLN faster. So they didn't open source it, but that's hardly relevant now. What is your problem with AMD? They are not the ones with shady AF business practices.
It is perfectly relevant when AMD cries about one thing and then goes about doing it themselves. Except they failed with Mantle.
Bottom line is both companies are going to try to improve their products in an effort to compete. Nvidia just does a far better job of envisioning, executing, and supporting their initiatives. This is why I, like the overwhelming majority of the market, prefer their products over AMD's when price and performance are similar.
Just line up all the different initiatives. You will see Nvidia tends to stand by their product while AMD features often fall by the wayside and become abandonware/vaporware.
If you read over many of these comments you will see their fans aren't doing themselves any favors; they just make excuses for AMD when they SHOULD be demanding better. Just too much of a "we are the hapless victim" mentality that has gotten them here to begin with.
The Mantle plans changed, because Khronos adopted it as Vulkan, which is essentially Mantle 2.0 and an Industry Standard.
Also DX12 is said to be based on Mantle, only Microsoft did some more refactoring.
There is no need anymore for Mantle itself, if the developers go with Vulkan or DX12. Mantle is a good testbed for any next gen implementation, though.
DX12 isn't based on Mantle. AMD claimed MS had no plans for DX12 when they announced their proprietary and locked API. MS then gave them the kick in the balls they deserved when they said DX12 was in the works a month after the Mantle announcement.
AMD has the Radeon SDK with many effects. AMD TrueAudio is based on Tensilica IP and is also x86 compatible; for its acceleration you don't need any special HW, your CPU can handle that (if fast enough).
AMD has some good effect libraries, like TressFX for example, and they are all open source and should work pretty well on NVidia hardware:
http://developer.amd.com/tools-and-sdks/graphics-development/amd-radeon-sdk/
They just don’t usually pay the game devs to use them.
Problem is with those "libraries": how well will AMD support them with expert help? We kinda know the answer to that already.
How can TrueAudio be open when AMD designs the hardware into their GPUs themselves? If they want to open that, then they need to give their hardware design to others.
Thanks for highlighting the issue, Ryan. I expected a far more sophisticated analysis on this issue than a simple “this is your choice consumer” simplification. These issues aren’t just that simple and you should use your resources to provide a better analysis.
So after reading up on Project Cars:
“Another similar controversy surrounded the recent release of Project Cars. AMD GPU performance was significantly lower than comparable NVIDIA GPUs, even though this game does not implement any GameWorks technologies. In that case, the game’s developer directly blamed AMD’s drivers, saying that it was a lack of reaching out from AMD that caused the issues. AMD has since recanted its stance that the performance delta was “deliberate” and says a pending driver update will address gamers performance issues.”
It seems the author of the paragraph above seriously misunderstands the situation. The paragraph says that AMD failed to reach out to the devs, but the latest official statements from the developer confirm that there was communication with AMD throughout development. The Project Cars game engine, however, is based on PhysX; it can't be disabled. That should be an immediate red flag, because PhysX can only run on nvidia GPUs or the CPU. And on the CPU, PhysX intentionally runs like garbage. AMD apparently iterated on their drivers for this game, but they're ultimately stuck where they are because nvidia wanted the developer to incorporate more PhysX features near the end of development and this blew out the gains they had made.
So the game seems to be designed from the bottom up to heavily favor nvidia. That seems unfortunate imo but I guess it is OK with everyone else?
Oooh! That's quite surprising if true. Source please?
If that's the case, PCs with AMD-based GPUs should run BioShock Infinite like crap. Hitman: Absolution too.
Yes, Project Cars is CPU-limited and, as we have seen numerous times, AMD's multi-threaded DX11 driver is horrendous. Instead of improving it in the years since DX11 launched, they've been wasting money tinkering with failed Mantle. So it is no surprise that AMD's perf with PCars is awful, since their driver is probably competing with the main PhysX thread.
There's quite a bit of evidence of all this, btw. The DX12/Mantle API test shows this quite obviously: AMD's MT and ST driver results are identical, but both are much lower (about 60% of Nvidia's ST driver performance). Secondly, there's a DX12 AMD driver that gives some 20-30% uplift, so clearly it is an AMD driver issue, and a CPU utilization issue at that, most likely.
Another stupid statement from ChiZoW
That doesn't explain why Kepler cards are doing poorly in benchmarks. Unless you believe a 960 is better than a 780 or Titan all of a sudden.
Another stupid comment by Anonymous AMD Fanboy #2425.
If those games make heavy use of features Kepler is weak in, ie. Direct Compute, then of course it is not surprising to see the 960 close the gap.
Are we amazed when we see 8800GT outperform 7800GTX?