A blog post on NVIDIA's site indicates that Konami's upcoming game Metal Gear Solid V: The Phantom Pain will make use of NVIDIA technologies, a move that will undoubtedly rankle AMD graphics users, who can't always see the full benefit of GameWorks enhancements.
"The world of Metal Gear Solid V: The Phantom Pain is going to be 200 times larger than the one explored in Ground Zeroes. Because so much of this game’s action depends on stealth, graphics are a key part of the gameplay. Shadows, light, and terrain have to be rendered perfectly. That’s a huge challenge in a game where the hero is free to find his own way from one point to another. Our engineers are signed up to work closely with Konami to get the graphics just right and to add special effects."
Technically this quote doesn't confirm the use of any proprietary NVIDIA technology, though it sounds like that's exactly what will be taking place. In the wake of the Witcher 3 HairWorks controversy, any such enhancements will certainly be looked upon with interest (especially as the next piece of big industry news will undoubtedly come with AMD's announcement later today at E3).
It's hard to argue with better graphical quality in high-profile games such as the latest Metal Gear Solid installment, but there is certainly something to be said for adherence to open standards to ensure a more unified experience across GPUs. The dialog about inclusion through adherence to standards vs. proprietary solutions has been heated throughout the FreeSync/G-Sync monitor refresh debate, and GameWorks is a set of tools that serves to further divide gamers, even as it provides an enhanced experience on GeForce GPUs.
Such advantages will likely matter less once DirectX 12 mitigates some differences with more efficiency in the vein of AMD's Mantle API, and if the rumored Fiji cards from AMD offer superior performance and arrive competitively priced, they will matter even less. For now, even though details are nonexistent, expect an NVIDIA GeForce GPU to have the advantage in at least some graphical aspects of the latest Metal Gear title when it arrives on PC.
So we Kepler users can expect Nvidia to be closely developing GameWorks from early on, but come game launch they'll somehow forget us and call it a bug.
GameWorks is effectively MaxWorks because Nvidia is too busy optimizing for Maxwell to catch all of the “BUGS” on Kepler. Y’all Kepler owners are sooo getting screwed.
Gameworks has come to.
AMD has successfully framed this debate in terms of so-called “Open Standards,” but that isn’t really the case. Let’s look at AMD’s argument.
First, AMD’s TressFX uses only DX/OGL features that are common across both vendors. This is true, as far as it goes. But so does Nvidia’s HairWorks. It uses a different solution, but still only uses what is available in DX/OGL.
Second, AMD claims that their solution is open, while Nvidia’s is closed. Neither solution is FOSS; both give developers access to the source code for engine integration, under a license that allows its use. They can debate the terms of the license, but effectively they are equivalent for development purposes.
Third, AMD claims their solution is completely open, whereas what they actually provide is a source dump of TressFX 2.2, even as they advertise TressFX 3.0 features in Deus Ex. Nvidia is providing the latest version of its code to developers.
Fourth, AMD has couched their response as if their solution is the industry standard, which is really good for branding but not really true. And it isn’t good for AMD when Nvidia just lets that hang out there and calls its own system the next generation, which puts AMD’s solution into the “old and stale” frame of mind.
Fifth, the way both companies support their Developer base is the real issue here.
As an example, Nvidia has taken the time to implement most of their GameWorks stuff into the Unreal Engine. Simply sign up for both the Unreal Engine and Nvidia Gameworks and you have access to the latest source code in a working state from both projects. And Nvidia fixes bugs that are found in the implementation.
AMD will let independent developers do the work rather than compete with Nvidia on this front. By refusing to compete, you can’t then say the other party is being unfair. Otherwise, you are just the music industry all over again.
Sixth, AMD complains that the implementation Nvidia chose is detrimental to the performance of AMD’s cards. While I applaud AMD’s efforts in making their solution cross-vendor and performant, nothing says Nvidia has to follow the same playbook. Frankly, I hope HairWorks bites Nvidia hard enough that they take that into consideration, given that their solution gives a large percentage of gamers a bad experience. But that is up to the market to decide.
Nothing prevents a developer from implementing both solutions, other than it being double the work. Nothing prevents a developer from combining the methods to get the best of both, so long as they don’t share trade secrets with the other guy.
With all of this said, I want AMD to succeed, because I like them, and I want competition. I would argue, actually, we need more than the two and a half players we have right now.
P.S. In the Unreal forums, the Nvidia guys have said that all but two of their currently implemented features were compatible with both vendors’ cards. At the time, they had not implemented HairWorks, but it is compatible with both.
As long as Nvidia keeps Gayworks closed and purposely keeps AMD from being able to optimize, they’re harming this industry by needlessly splitting it. Yeah, AMD cards are mostly compatible with Gayworks, but what does it even matter if it performs like shit, and purposely performs like shit for no reason other than the fact that it cannot be optimized for on AMD cards? It effectively might as well not be compatible at all.
Do you people who defend Gameworks really want to see graphics-card-exclusive games that exist for no reason other than to impose an artificial vendor lockout? Because if this shit continues and forces AMD to do the same, like I see Nvidiots saying they should, that’s where we’re headed. I’m just glad AMD has not gone down that route already.
It’s AMD who performs like shit, and some Gameworks effects just take advantage of that, e.g. by utilizing tessellation.
Other effects, such as HBAO+, have similar performance impacts on both.
Keep on crying, AMDouchebags.
AMD cards don’t perform like shit in gayworks games because the cards are shit; even with AMD’s weaker tessellation, that is not the main problem. If they can’t optimize, how can they compete in the first place when it comes to software? They can’t, and that’s the problem your Nvidiot brain can’t seem to comprehend.
My last card wasn’t even an AMD card. AMD fans aren’t the only ones who can’t stand this bullshit. You’re a moron for not realizing how bad this is for the overall industry; worse, you’re too shortsighted and selfish to realize the consequences of Nvidia’s anti-competitive business practices.
Um, AMD optimizes its drivers for games they don’t have the source to all the time. It is part of their job, since they have chosen to take on that responsibility, rather than let lazy developers ship games that don’t play well on their card, or any card for that matter.
Actually, now that I think about it, isn’t this the same thing as cheating on benchmarks, since the driver is optimized to run each game differently depending on its requirements? Instead of forcing the Developers into actually fixing their Engine? Now we know the real reason Mantle was so important to AMD, and why Nvidia is going along with Vulkan, as it will reduce their load in the long run.
While GameWorks effects look nice, they are bad for the industry. Also, AMD cards aren’t the only ones that take a big hit; the GTX 700 series cards take a big hit too.
And then with a simple driver update (which probably just reduces the max tessellation factor) they improve performance for this single game only xD
GameWorks is a great tool to make sure your old series takes a huge performance hit, so you have to buy a new card every bloody year.
Certainly great for nvidia!
So it isn’t just a shot at AMD then, right? It hurts their own cards too. You whiners are looking at it the wrong way (and it’s something you can fix in a config file etc. just by dropping tessellation a bit). All they’ve done is MAXIMIZE the usage of the resources their Maxwell cards have. Should they run slower so AMD can catch up? That isn’t their fault. Perhaps CD Projekt could have done a better job at launch of dropping it down for all non-Maxwell models, but that isn’t Nvidia making something that DOES NOT WORK on AMD. It does; just change settings and the problem goes away.
This whole argument is stupid, as I expect NV (or AMD) to show me the maximum I can get out of the hardware, ALWAYS. Other people can tone it down if needed. If I buy a top card I want it MAXED, not toned down because the other guy, or even older NV cards, are worse at operation X, Y, or Z. The game designer can adjust that, or AMD/NV can in their app profiles for the games.
IF AMD had such a problem with HairWorks, they should have gone to CD Projekt two years ago with TressFX help, when they first saw the wolves on the screen like the rest of us. Instead they came to them in the last two months, when the design was completely done and all that was left was polishing up the bugs. That is AMD’s fault, NOT Nvidia’s.
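For anyone wanting to try the settings tweak mentioned above, here is a sketch of the two workarounds commonly reported by the community at the time. The file path, key name, and default value below are community-reported assumptions and may vary by game version; treat this as illustrative, not definitive.

```ini
; AMD: no file edit needed - in Catalyst Control Center, under the
; 3D Application Settings, set "Tessellation Mode" to
; "Override application settings" and cap the maximum tessellation
; level at 8x or 16x.

; All vendors (The Witcher 3): community reports point at
; bin\config\base\rendering.ini; lowering the HairWorks
; anti-aliasing level reduces the tessellation-related cost:
HairWorksAALevel=4   ; assumed key name; default reported as 8
```

Either route trades a small amount of visual fidelity in the hair rendering for a large frame-rate recovery on cards with weaker tessellation throughput.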
I am pretty sure I stated what I did and didn’t want to happen, and identified myself as an AMD fan, not an Nvidia fan. I am foremost a fan of Truth, and not the BS that is being sold.
And I am not defending Nvidia. I am in fact calling AMD out for BS. There is a difference. If you find me online, you will find I quite frequently do the same for Nvidia’s BS as well, not to mention anyone else’s that I see.
I am writing this post, after all.
gayworks? seriously? how old are you, son?
AMD also took time with their SDK, and made most of the effects they offer easily implementable into a variety of engines. GameWorks effects are easily implementable too, but they are not part of UE and never were! PhysX is the only part in that engine, and Nvidia had to make it open for that!
The only issue AMD, many developers, and the community complain about is source code access, without which you cannot optimize, offer improvements, or debug. Nvidia has to do it all, and I am saying only one simple thing: if there is a problem with the code, you have to complain to someone who has access to it!
It’s as simple as that.
AMD’s Radeon SDK doesn’t have this issue, because their code is accessible with full documentation and samples, and therefore they are not responsible if something goes wrong, as anyone can review the code and fix it themselves. And in the end, that is what developers should do!
Thanks for the warning. Another game better not to buy.
As long as I can remember, every MGS has looked, eh… not too bad.
All Metal Gear Solid V: The Phantom Pain needs is maximum TESSELLATION on the characters; those low-polygon objects look a little out of place in 2015 with their angular edges!!!
I hope they will use only effects that don’t involve tessellation at all, like HBAO, PCSS, etc.
*pets his GTX 970 @ 1.5GHz / 7.6GHz*
Hopefully the more compute-centric nature of DX12 results in these hardware-specific technologies becoming less important, but we’ll see.
Generally yes, but it will not solve the fundamental problem we have with HW-specific black-box middleware.
Oh great, another game I’ll probably be better off buying for console than for PC.. :-/
How would you be better off paying more for even less than PC graphics with the Nvidia stuff turned off?
Oh I see, you want to punish the devs by paying them!
No wait, you want to punish Nvidia by not buying the PC port!
No, I think I got it: you want to punish the PC “master race” by going peasant… yeah, that must be it.
The answer is D. None of the Above.
Gayworks
I prefer Goyworks
This means it will probably perform even worse on Kepler than on AMD.
Good to hear. I am a massive fan of GameWorks and what it adds to my gaming experience. Keep it up please, guys, and more of it 🙂
Like any discussion of Gameworks, all the intelligent, well informed people have come out to comment.
GameWorks isn’t meant to add anything to what already exists; rather, its purpose is to hammer performance, more so on the competition’s side, so that Nvidia can sell their high-end cards.
Look at Witcher 3! That game has no better hair simulation than Tomb Raider, and no better physics simulation than Final Fantasy XV running on the PS4, or BF4.
Nvidia doesn’t care about progress in the game development industry, nor do they care about gamers as consumers. Having all that processing done on a discrete GPU is quite inefficient when both Intel and AMD are putting potent IGPs on their CPUs and when DX12/Vulkan reduce CPU time spent on draw call generation and validation by 20x.
My words 🙂
Yet another title ruined by Crapworks 🙁
Who cares how Snake’s hair looks… how about just enjoying the game for its own sake? Sure, graphical details add to the overall effect, but they don’t make the game.
If you cannot play UHD Blu-ray, HDR, or Netflix 4K with an NVIDIA 960/970/980,
what kind of HDCP 2.2 is that?
What kind of HDMI 2.0 is that?
You can’t have DCI-P3 color.
NVIDIA’s data bandwidth is 10.2 Gbps;
all information above 10.2 Gbps is cut.
So what kind of HDMI 2.0 is that? Nvidia style?
Why doesn’t Nvidia publish its data bandwidth? 8.75? 10.2?