The AMD Argument
AMD launched a public salvo against NVIDIA and its GameWorks program this week. We sat down and talked with BOTH sides of this debate to find the real answers.
Earlier this week, a story was posted on a Forbes.com blog that dove into NVIDIA GameWorks and how it was doing a disservice not just to the latest Ubisoft title Watch_Dogs but to PC gamers in general. Using quotes from AMD directly, the author claims that NVIDIA is actively engaging in methods to prevent game developers from optimizing games for AMD graphics hardware. This is an incredibly bold statement and one that I hope AMD is not making lightly. Here is a quote from the story:
Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products. . . . Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.
The example cited in the Forbes story is the recently released Watch_Dogs title, which appears to show favoritism towards NVIDIA GPUs, with the performance of the GTX 770 ($369) coming close to the performance of a Radeon R9 290X ($549).
It's evident that Watch Dogs is optimized for Nvidia hardware but it's staggering just how un-optimized it is on AMD hardware.
Watch_Dogs is the latest GameWorks title released this week.
I decided to get in touch with AMD directly to see exactly what stance the company was attempting to take with these kinds of claims. No surprise, AMD was just as forward with me as they appeared to be in the Forbes story originally.
The AMD Stance
Central to AMD’s latest annoyance with the competition is the NVIDIA GameWorks program. First unveiled last October during a press event in Montreal, GameWorks combines several NVIDIA-built engine functions into libraries that can be utilized and accessed by game developers to build advanced features into games. NVIDIA’s website claims that GameWorks is “easy to integrate into games” while also including tutorials and tools to help quickly generate content with the software set. Included in the GameWorks suite are tools like VisualFX, which offers rendering solutions like HBAO+, TXAA, Depth of Field, FaceWorks, HairWorks and more. Physics tools include the obvious, like PhysX, while also adding clothing, destruction, particles and more.
However, there are some major differences with GameWorks compared to previous vendor-supplied code and software examples. First, AMD and several game developers claim that GameWorks is a “black box” with only API calls used to access the GW functionality. The term “black box” is used to indicate that little is known about what is going on inside the GameWorks libraries themselves. This is because GameWorks is provided as a set of libraries, not as a collection of example code (though this is debated by NVIDIA later in our story). Because of its black box status, game developers are unable to diagnose buggy or slow game code when using GameWorks and that can lead to issues with different hardware.
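To make the distinction concrete, here is a minimal sketch of the two models; the names below are hypothetical, not the actual GameWorks interface. With a library-only integration the developer (and the competing driver team) sees nothing but an exported entry point and its parameters, while a published code sample puts the implementation itself in the engine tree where anyone can profile and rewrite it.

    # Illustrative sketch only -- these names are hypothetical, not the real GameWorks API.

    # "Black box" model: the developer links against a prebuilt vendor binary and can
    # only call its documented entry points; nothing behind the call can be inspected.
    #   import ctypes
    #   gw = ctypes.CDLL("vendor_effects.dll")        # opaque, vendor-built library
    #   gw.render_ambient_occlusion(scene, quality)   # parameters are the only lever you have

    # "Code sample" model: the technique ships as source, so any developer (or the other
    # GPU vendor) can read it, profile it and rewrite the hot path for their hardware.
    def render_ambient_occlusion(depth_buffer, radius, sample_count):
        """Placeholder for an implementation that lives in the game's own source tree."""
        ...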
A section of AMD's developer website with example code for download.
You might already be wondering how this is different from something like PhysX. Looking at GPU-accelerated PhysX only, that particular plugin ONLY runs on NVIDIA hardware. Adding it or changing the implementation does not negatively affect the performance of the AMD or non-PhysX code path. Many of the GameWorks toolsets (basically everything except PhysX and TXAA), though, do in fact run on both AMD and NVIDIA hardware if they are implemented by the developer. That means visual effects and code built directly by NVIDIA are being used on AMD GPUs in GameWorks-enabled titles like Watch_Dogs. You can see immediately why this could raise some eyebrows inside AMD and amongst the most suspicious gamers.
This is different from what has been the norm for many years. In the past, both AMD and NVIDIA have posted code examples on their websites to demonstrate new ways of coding shadows, ambient occlusion and other rendering techniques. These could be viewed, edited and lifted by any and all game developers and implemented into their game engines or into middleware applications. GameWorks takes this quite a bit further by essentially building out a middleware application of its own and licensing it to developers.
The obvious concern is that by integrating GameWorks with this “black box” development style, NVIDIA could take the opportunity to artificially deflate performance of AMD graphics cards in favor of GeForce options. That would be bad for AMD, bad for AMD users and bad for the community as a whole; I think we can all agree on that. AMD points to Watch_Dogs, and previous GameWorks titles Batman: Arkham Origins, Call of Duty: Ghosts and Assassin’s Creed IV: Black Flag as evidence. More interestingly though, AMD was able to cite a specific comparison between its own TressFX hair library and HairWorks, part of the library set of GameWorks. When using TressFX both AMD and NVIDIA hardware perform nearly identically, even though the code was built and developed by AMD. Using HairWorks though, according to numbers that AMD provided, AMD hardware performs 6-7x slower than comparable NVIDIA hardware. AMD says that because TressFX was publicly posted and could be optimized by developers and by NVIDIA, its solution provides a better result for the gaming community as a whole. Keep in mind this testing was done not in a real-world game but in internal testing.
NVIDIA's updated Developer site.
AMD has made other claims to support its theory on the negative impact of GameWorks. The fact that NVIDIA’s previously available DirectX 11 code samples were removed from NVIDIA’s developer site would be damning, as it would indicate NVIDIA’s desire to move away from them to GameWorks exclusively. But, as we show on the following page, these code samples were not removed but simply relocated to a different section of NVIDIA's developer site.
With all of this information it would be easy to see why stories like the one at Forbes found their way across the Internet. However, as I found out after talking with NVIDIA as well, there is quite a bit more to the story.
AMD,
Grabbing at straws and sounding like the ill-fated stepchild.
Dear AMD,
Sack up and do something? Perhaps release “new” and “powerful” tech at the same time as Nvidia instead of being 2 steps behind them.
“freesync”
Again, another AMD ploy to sound like heroes and like they actually did something….wrong again.
Ok now all you AMD poor kids start up your hater aid engines and let it fly.
go buy a titanz meanwhile i’ll just get 2 295x2s…
toss on another 500+$ to get a PSU that can power those cards
why would u need 500$ for psu ?
you know a single one runs fine on a 850watt psu ?
“i’ll just get 2 295x2s…” <-- read what he said
Ummm… you mean… $159? Yep, that’ll buy you a brand new Gold-rated 1300 watt EVGA SuperNova power supply on Newegg RIGHT NOW after a small $35 rebate! Even a 1500 watt brand-name Silver-rated PSU can be had for just $299… I suppose you COULD spend $449 if you wanted to, but why?
SOURCE: http://www.newegg.com/Product/Product.aspx?gclid=CjgKEAjw2KCcBRCQ_6mztcunhEgSJABPxOF1IhnBS56b4whcrX2TmtCPuwCGCadcrY5el3cbJa7tVfD_BwE&Item=N82E16817438011&nm_mc=KNC-GoogleAdwords&cm_mmc=KNC-GoogleAdwords-_-pla-_-Power+Supplies-_-N82E16817438011&ef_id=h2hM99SiAQAAB2M:20140530155330:s
You are NOT the target market for the Titan Z. People using pro apps who can forgo ECC and full driver support from NV ARE the target.
Check prices of Tesla K20 and you’ll find it’s MORE for 1/2 the cores. Check prices of a Quadro 6000 and you’ll find a pair of those will set you back $10000 for the same core count.
You are complaining from an ignorant gamer’s view of the product. It also runs 375W instead of 500W for the competition. These will fly off the shelves to people looking at the two BILLS I listed above.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814132010&cm_re=tesla_k20k-_-14-132-010-_-Product
$3200 for less than half the cores and 3.52 TFLOPS SP / 1.17 TFLOPS DP of compute with 5GB of memory, while the Titan Z has 2.66 TFLOPS DP and 12GB.
What do they say this is USED for?:
CFD, CAE, financial computing, computational chemistry and physics, data analytics, satellite imaging, weather modeling
http://www.newegg.com/Product/Product.aspx?Item=N82E16814133494&cm_re=quadro_6000-_-14-133-494-_-Product
$5000 for ONE. You’ll need two to match the TitanZ. Get it? $10000.
http://www.fool.com/investing/general/2014/06/02/why-is-nvidia-charging-3000-for-the-titan-z.aspx
Also note it’s nearly two times the FirePro S10000, which sits at 1.48 TFLOPS DP with 6GB and is also $3000. Again, it takes two to catch the Titan Z, which draws far fewer watts as well. The Titan Z is for DP people who happen to game on the side, not for gamers who have no idea what DP is for.
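Back-of-the-envelope dollars per DP TFLOP using the numbers quoted above (the commenter’s and the linked article’s figures, not independently verified) makes the point clearer:

    # Rough $/TFLOP (double precision) comparison using the prices and throughput
    # figures quoted in this thread; treat them as illustrative, not verified specs.
    cards = {
        "Tesla K20 ($3200, 1.17 TFLOPS DP)":       (3200, 1.17),
        "GeForce Titan Z ($3000, 2.66 TFLOPS DP)": (3000, 2.66),
        "FirePro S10000 ($3000, 1.48 TFLOPS DP)":  (3000, 1.48),
    }
    for name, (price_usd, dp_tflops) in cards.items():
        print(f"{name}: ~${price_usd / dp_tflops:,.0f} per DP TFLOP")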
The fanboyism is off the charts.
If you understood how adaptive sync (what FreeSync has become) and G-Sync work, you’d realise how wrong you are. The very fact that VESA has adopted adaptive sync as part of the DisplayPort spec shows how important that little FreeSync demo was. AMD came along with the tongue-in-cheek name FreeSync because it is technology that already existed and can be implemented independent of the GPU, while NVidia was touting it as completely revolutionary (when it already existed in notebooks as a power saving feature) and requiring proprietary hardware on particular graphics cards.
Not a fanboy, just someone who knows how to objectively analyse what’s happening, unlike someone.
We have seen how well G-Sync works; we have yet to see how well FreeSync works or will work. As with all open standards, it likely won’t work as well as the closed one, since it has to allow for a lot.
AMD had a demo of free-sync at Computex last year. The fact that VESA are adopting it is big too, it means they’re convinced enough to spend time implementing it.
All the technology does is allow the GPU to control the VBLANK interval, it’s nothing that will be particularly better on any particular card or that will benefit from a proprietary solution.
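A crude way to picture it (just a sketch with made-up numbers, not how any real driver or scaler works): with a fixed refresh a finished frame waits for the next scanout tick, while with adaptive sync the GPU holds VBLANK open until the frame is ready, within the panel’s minimum and maximum refresh window.

    # Simplified comparison of fixed refresh vs. GPU-controlled VBLANK (adaptive sync).
    # Numbers and logic are illustrative only.
    def present_fixed(frame_time_ms, refresh_ms=16.7):
        # Display scans out on a fixed clock; a late frame waits for the next tick
        # (stutter) or is swapped mid-scan (tearing).
        ticks = -(-frame_time_ms // refresh_ms)   # ceiling division
        return ticks * refresh_ms                 # effective time the frame stays on screen

    def present_adaptive(frame_time_ms, min_ms=6.9, max_ms=33.3):
        # GPU extends VBLANK until the frame is ready, clamped to the panel's window.
        return min(max(frame_time_ms, min_ms), max_ms)

    for ft in (10.0, 18.0, 25.0):
        print(ft, present_fixed(ft), present_adaptive(ft))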
i believe what people want to see the most is how freesync runs real games. i’m not taking sides here. when nvidia came up with g-sync they showed a real demo running real games. that’s how they convinced people the tech is real. for AMD’s part they only showed that windmill video. yes, there is no external monitor yet that can run freesync, but why can’t amd show real games under freesync on the same demo unit that they showed us? we want to know how AMD’s implementation will deal with screen tearing and input latency at the same time
That demo did not show “freesync”. It showed a static refresh rate matched to framerate.
AMD has not shown any demo of working freesync.
so VESA are incompetent and you are the pro? if they added it, it’s because it is working well. and why jump the gun anyway, can’t you wait a couple more months and judge? or are you too hasty to hate?
> As with all open standards, it likely won’t work as well as the closed one
Except you should realize VESA is not just another open standards group. It’s the body that writes “ALL” display standards. Exclude all VESA standards and none of your displays, from PC to phone, will work. That’s why FreeSync getting in is a huge thing. It’s optional in DP 1.2a, which probably means it will be mandatory in DP 1.3.
Good luck getting G-Sync into any displays afterwards.
Ok, freesync is a good thing, but you must remember that the true equivalent of G-Sync isn’t freesync alone.
A freesync monitor must be combined with triple-buffer middleware/third-party software to do the equivalent work of the G-Sync module/monitor.
It’s quite probable that the freesync + triple buffer combination doesn’t work as well as a G-Sync solution.
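Roughly, triple buffering just means the renderer always has a spare back buffer to draw into so it never stalls waiting on the display; something like this (only a sketch of the idea, not any vendor’s actual implementation):

    # Sketch of the triple-buffering idea: three buffers rotate so rendering never
    # has to wait for scanout to finish. Purely illustrative.
    class TripleBuffer:
        def __init__(self):
            self.free = ["buf_a", "buf_b"]   # buffers available to render into
            self.front = "buf_c"             # buffer currently being scanned out

        def render_frame(self):
            back = self.free.pop()           # there is always a spare buffer to draw into
            # ... GPU renders the new frame into `back` here ...
            self.free.append(self.front)     # the retired front buffer becomes free again
            self.front = back                # newest completed frame is shown next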
That’s pure speculation.
Also, I’ve heard nothing about any kind of software or buffer interface. Can you please cite a reliable source for this information?
Let me know when AMD gets around the SCALER problem that NV ran into, which caused them to make this nasty, terribly expensive FIX called G-Sync. They stated they made it because scalers couldn’t do the job. That’s so terrible of them isn’t it? I mean what kind of company goes around the jerks holding up the train? Oh, the good kind. Did you complain when AMD came up with AMD64 when Intel wouldn’t until forced? We’d probably be on Itanic right now if
Freesync will cost Scaler makers R&D and then it becomes ummm, chargesync? NOT-so-FREEsync? Whatever, and that’s if they can get them to do it at all. Not to mention we have to see it work, as gsync is pretty proven tech and everyone liked it. They won’t do the R&D for free and will pass whatever costs they incur directly to end users whether AMD likes that or not. Only fools work for free while trying to run a business. I’ll be shocked if they don’t put a premium on it while they can also. No different than blowing up the radeon pricing over mining. Nobody can be forced to use the new Vesa sticker on their monitors. They can just ignore it and use gsync or nothing.
The fact that AOC, Philips, Viewsonic, BenQ, Asus & Acer are all coming with models means something too. They realize freesync is likely a year away or more and only if Scalers up their game, and it isn’t a known commodity vs what they can sell RIGHT NOW with Gsync for the next year. Only samsung/LG are really left of the big names. I call that WIDE support from everyone. Did you read the PCper article? I’m pretty sure he and everyone else pointed out the scaler issue NV ran into and that AMD will have to convince them, while NV apparently couldn’t. Maybe AMD will have more luck now that they see Gsync can steal their Scaler sales 😉 But again, that should mean we thank NV not chastise them.
AMD’s driver team is an order of magnitude smaller and less talented. It’s unfortunate, but true.
that was certainly true in the past, but since the end of 2013 their drivers have been really great and frequent: multi-GPU scaling, optimisation, frame pacing fixes, and now mixed-resolution support for Eyefinity. that was the one reason i didn’t get eyefinity: i had the monitors and couldn’t use them, and getting 2 others was money spent for no reason. now that i gave away the old monitors, the driver solves the issue xD.
so i know that many ppl are in the same position and would benefit a lot from this eyefinity update.
What does any of your comment mean in regard to the previous poster’s point that AMD’s driver team, specifically for its graphics, is in fact much smaller and much less funded than Nvidia’s?
Just imagine how much better the drivers would be if they had equal funding.
the fact is AMD drivers were crap up to Q3 of 2014; the other fact is AMD drivers have been much better for the last 8 months or so.
as i said above: more frequent, better features, better scaling, faster fixes and optimisations.
so i don’t think they are still small and underfunded. something probably happened, and it’s probably linked to the success of Mantle.
Q3 2013 sorry for the typo, cant edit >.<
I don’t think it has much to do with Mantle; in fact Mantle can’t yet be called a “success.” It’s not used by enough developers and it hasn’t had enough time to really mature, but with that said, Mantle certainly has changed the API paradigm for the better, and we’re getting rumblings out of Redmond, WA. AMD managed to wake the lumbering giant for a (more than likely) expedited release of DX12. I think it just has more to do with the fact that AMD is really at the top of their game right now, realizing that if they don’t push the limits and really challenge their competitors, they won’t survive. I’m really happy with how far AMD has come in the past year; it’s been very exciting to watch AMD deliver drivers that are arguably better than Nvidia’s in many areas. Personally I find “GeForce Experience” to be a terrible application for overclocking. I much prefer AMD’s overclocking implementation.
You know nothing about Nvidia’s drivers obviously. GeForce Experience does no overclocking whatsoever. It is a really great application that automatically updates the graphics driver when a new one comes out, which is at least once a month. And it also recommends the game’s graphics settings based off of your computer hardware so that you get the best experience possible. It also controls shadowplay and the lighting effects of the card itself. You had no idea what GeForce Experience was did you?
I don’t really think that’s true actually… AMD simply needed a fire lit under their asses to get into gear (re: frame rating) and they’ve shown just how talented they truly are by going from inoperable frame pacing to excellent frame pacing in an incredibly small amount of time, just a few months – quite an accomplishment really.
You are kidding, right? FreeSync was good enough that it was accepted into the VESA DisplayPort standard. Yeah, that means an AMD technology will be in every DisplayPort starting with the new revision. Basically, G-Sync is pretty much dead unless you like paying $150 more for a proprietary chip built into a monitor, if you think that is a good thing. I love Nvidia, in fact I own 2 780’s, but their proprietary codes and technologies are crippling the PC gaming industry. Hell, their deal with Microsoft to keep pushing DirectX is one of the reasons our GPUs can only do 25% of what they are capable of.
You are just a blind follower.
First let it be known that I have no preference for either Nvidia or AMD; I simply buy what suits my needs best at the time.
New and powerful tech? Like? You clearly have no idea how much AMD has contributed to the x86, GPU, and API industries, forcing their competitors to advance the industry rather than let it stagnate. For example, the 295X2 made such an impact that Nvidia was FORCED to POSTPONE the release of TitanZ BECAUSE of AMD right? lol What are you, like 13 years old?
Without AMD, Nvidia would have no competition and therefore almost no incentive or initiative to produce all of the innovations you probably think are so wonderful. You obviously have no clue or understanding of economics or business let alone the dynamics involved in the GPU and graphics industry. When making comments like the one above you just sound like an ignorant fanboy, which is really unflattering and embarrassing for yourself and everyone that has to read it in this forum. Maybe you should take your misplaced enthusiasm for Nvidia and disdain for AMD somewhere else where informed and educated PCper readers don’t have to read your ridiculous rhetoric that in no way contributes to a constructive dialogue.
You are apparently unaware DX12 was being worked on for a few years already and that Mantle didn’t cause this to happen; it was the natural course of action. Also, OpenGL already has the ability to drop draw calls, etc. So Mantle wasn’t needed, as the other two APIs were already well on the way before Mantle.
Since the Titan Z isn’t aimed at gamers I fail to see your point; we have no idea what the delay was. You work for NV? It seems clear to me that if you’re not aiming at gamers, the 295X2 wouldn’t even be in your thoughts. They didn’t even change the clocks after the delay, everything stayed the same. If they had raised them 100MHz or 200 maybe we could assume it was due to AMD, but it seems they did nothing but push it off.
What did they change that shows they were FORCED to delay? ROFL @ you blasting the other guy while doing the same thing you accuse him of (glorious AMD and their contributions to mankind….LOL). hardocp just tested BF4 again (770 article posted a day or two ago), and it was basically break even. Is that all Mantle gets you? NV made the game’s use of Mantle moot with a simple DX11 update. I can see why they say they don’t fear it. If they don’t have time to read his/her rhetoric, surely they have none to read your personal attacks on the guy either. He/she is 13 years old, you’re basically calling him/her stupid (he/she is not one of the “informed and educated” people), he/she is clueless, the person is an ignorant fanboy…ROFL.
You appear to be describing yourself I’m afraid.
oh, i will love to see if you are still singing nvidia’s praises when AMD goes under and suddenly Nvidia is charging a thousand dollars for the then-equivalent of a 750 Ti
(followed by stagnation as nvidia, without anyone to compete against, suddenly realises they can now get away with small performance increases each generation if they tie them into some alleged new technology that all games suddenly start using)
Really good and interesting read, Ryan. Good job.
Thanks Dan. Even though it wasn't my longest editorial, it took the most back and forth calls and emails in a LONG LONG time. 🙂
most in-depth read about the issue on the internet
and oh look, AMD’s new 14.6 beta driver kinda squashes the whole Watch Dogs issue.
http://www.hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/5#.U4aCiCgVeaM
it doesn’t, the issue is still here, and Ryan doesn’t seem to grasp the problem and understand it.
the problem with gameworks: even if the developer pays the licence for the code and has access to it, the developer is under NDA and cannot show AMD the code for them to fix it, so in the end nvidia will fix the code for the developer for AMD hardware, without AMD being able to optimise it the way they want, only the way Nvidia or the devs want.
and if you guys don’t think this is a real issue then lol. or should AMD pay the licence for every effect in every game to be able to optimise it?
the implementation of gameworks is retarded.
Yea, this is a really good article. Thank you for trying to be as objective as possible.
Time well spent. Well articulated, concise, and full of pertinent information on both sides of the issue.
An EXCELLENT read. This article further supports my belief as to why PCper is the premier computer/technology site online today, making strides in everything from editorials like this one to truly unbiased articles, and we can never forget you and your team’s contribution to the ground-breaking introduction of a new GPU benchmarking methodology that actually led to a paradigm shift in the way GPUs are now tested. It’s hard to believe that in the recent past the FRAPS metric had the final say in most GPU benchmarking.
So, thank you Ryan and everyone else involved with PC Perspective!
Agreed. Only place I’ve seen that has the full story and in a way that I could actually follow.
This does sound like AMD doing what they can to create controversy to make it look like they are being treated unfairly, but they are no victim in any of this as they have done the same thing. The TressFX thing, last I used it, wasn’t even equal across nvidia/amd cards; amd cards had a pretty massive advantage when I last used it, a 50-55% vs 90-100% load difference with it off/on. Kinda wonder if AMD is so focused on Mantle they don’t want to bother with DX optimizations anymore?
the difference is in the approach…not the technology nor any supposed advantages. AMD is just stating that they were getting locked out of early optimizations.
1. it’s not true, TressFX taxes nvidia and AMD equally.
2. you can disable it
that’s not the case with the gameworks library, depending on how it’s implemented and how many features are used. if you think that 1 TressFX could be annoying, imagine 10 techs running simultaneously, each running 2% better: you end up with a 20% boost.
That’s assuming all 10 techs (even if they were running simultaneously) are running at 100% load. I think you’d have to take a mean/average approach to reach a more accurate number, as a 20% boost would be an absolute best case for the hypothetical scenario you’ve come up with – although despite this, your point has merit I believe.
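A quick back-of-the-envelope version of what I mean, with made-up numbers: a 2% win inside an effect only moves the whole frame by 2% if that effect is the entire frame, so you have to weight each effect by how much of the frame time it actually occupies.

    # Illustrative numbers only: 10 effects, each taking ~0.5 ms of a 20 ms frame,
    # each running 2% faster on one vendor's hardware.
    frame_ms = 20.0
    effect_times_ms = [0.5] * 10
    speedup_per_effect = 0.02

    saved_ms = sum(t * speedup_per_effect for t in effect_times_ms)
    print(f"time saved: {saved_ms:.2f} ms -> {100 * saved_ms / frame_ms:.2f}% of the frame")
    # ~0.1 ms saved, i.e. about 0.5% overall; a 20% swing would require the ten effects
    # to account for essentially the whole frame.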
It took less than a week for Nvidia to update its drivers to run TressFX as well as it was running on AMD hardware. Many Nvidia users/fanboys were laughing after that, saying that TressFX was running better on their cards than on AMD cards.
So, if you want to tell a big fat lie, at least say something believable, or try another site, maybe the Cosmopolitan site, where people have no clue about hardware and will believe whatever you say to them.
Quote: “Kinda wonder if AMD is so focused on Mantle they don’t want to bother with DX optimizations anymore?”
Reply: They can’t do that due to the limited number of titles that will actually use Mantle. Maybe on those titles they’ll be less bothered about how DX runs the game, but before then they need to iron out the kinks, so that shouldn’t be any time soon.
No wonder COD:Ghost ran so poorly. Black Ops 2 ran like butter…
didn’t run that great on my gtx780 so wouldn’t put the blame on nvidia for that.
Arbiter, i agree i keep getting a c0000005 crash error code from running the game on my GTX 780 every 15mins of game play.
Well mine was a bit worse, random full system freezes.
well ryan, overall a good article, but it still has weird issues here and there.
like when you said Mantle takes time from DX development, while you said in another portion that it’s the job of the devs to make sure the game works correctly; it’s as if AMD is asking devs to release an unfinished DX game. it’s not their job, it’s the studio’s job to manage resources and time and see whether they can afford adding Mantle on top of DX or not.
and AMD is in competition with Nvidia; how could anyone remotely ask AMD to trust their competition to run an optimised library for them, and not use it to keep perf in check and boost it whenever they need a wonder-driver PR?
personally i think that games using gameworks should be coded for regular vendors first, then coded exclusively for nvidia; this way users can choose to enable or disable gameworks rather than being forced to run it, because in all honesty it’s fishy.
or even better, a developer that wants gameworks should get Mantle too, and have 2 versions, would be epic xD
and ALSO i don’t know if gameworks is what made watch dogs so UGLY, but you need to see a video comparison of E3 2012 and the one released, it’s really uglier.
Quote: “and ALSO i don’t know if gameworks is what made watch dogs so UGLY, but you need to see a video comparison of E3 2012 and the one released, it’s really uglier.”
It’s nothing to do with GameWorks. What happened with the promotional videos is a fairly common practice in the industry: the early peek stuff often looks a lot better than the end product because they’re using all their available resources to develop a small example of the end product for the promotional video. Those resources won’t then be enough to produce the same effects for the full open-world game.
yea well, the thing about watch dogs was its graphics; you had the feeling it might be the next-gen game, then with the release it turned out to be another washed-out-colors, very ordinary-looking game. i was disappointed.
and i was expecting an immersive game with a good scenario, but since they delayed it, they added some weird crap, like challenges and a weird online mode that stripped the game of its immersion. what the hell is a robot spider doing there…come on
overall this game was a disappointment for me, and now i fear the same for Tom Clancy’s The Division. it’s another Ubisoft game, it looked stunning and promising at E3, it was also delayed, and it will probably end up a crappy game at release.
i really think it’s Ubisoft that forces the studios to do stupid stuff in their games, and for that i don’t like them, and with gameworks i like them even less.
here is a video comparison between E3 2012 and the release version, PC Ultra settings 1080p.
look at the weather, look at the lighting, look at the colors, fog, reflections, look at the details. damn, even look at the physics: in the E3 version the jacket moves well with the movement, the wind, etc., while in the release it’s almost still. and, the icing on the cake, look at the stutter at 0:41.
http://www.youtube.com/watch?v=L_A6Z3gkXlk#t=50
i mean nothing in the release is like that footage, the game has been degraded dramatically. seriously, can a player sue ubisoft for false marketing over this? because i would love to see a class action suit against ubisoft for misleading and scamming players over and over again.
Ryan, you forgot that the Wii U also uses an AMD GPU.
Yeah, definitely AMD behaving immaturely, but it is somewhat fishy (not saying anything was done purposely) that Watch Dogs runs so poorly on AMD hardware. They obviously optimized it the most for nvidia and probably completely ignored AMD. Can’t blame nvidia so much for that though. Still sucks to see PC gaming going this way.
As for the GameWorks thing, it’s obviously nvidia going for a cash grab. Don’t know how much it costs or whether nvidia would just outright deny AMD a license to the source code. In the end, it’s likely bad for small-budget developers. Sad to see stuff like this pop up just as game engines are suddenly becoming affordable.
Also, is the big money still in console game sales? Also sad to hear…
If it’s like the whole PhysX thing, AMD refused to license it, and then came out and did this. Not long after that, AMD whined about not being able to use PhysX on their cards but never mentioned that they could have licensed it and chose not to.
Yes, that’s true that AMD never licensed physx… but how much did nVidia want to charge ATI/AMD to license it?
If it was a reasonable price, then it’s AMD’s bad, if it was an unreasonable price, then it’s nVidia’s bad.
So arbiter, what were the terms that nVidia offered to AMD to license physx? Unless you know those terms, how do you know that AMD was at fault?
Both companies have their own self interests to protect, so both do things that definitely benefit themselves more than the other company. Neither are innocent.
arbiter
You are forgetting something about PhysX. With PhysX, Nvidia wants the right to have an opinion about what hardware you are using. If there is anything in your PC case that doesn’t say “Nvidia”, PhysX support will be disabled.
In the case of PhysX, Nvidia is punishing its own customers if they are not loyal enough.
We are talking about a company that will not hesitate to be unethical if this means getting an (unfair) advantage over the competition.
Um, unethical? Seriously? Creating proprietary APIs is unethical? Buying the rights to a technology (PhysX) and wanting to license it is unethical? Well geez guys, how about every company just close up shop because apparently breathing is unethical now. Give me a break. It’s not an unfair advantage, it’s called being competitive. Besides, why is PhysX still being brought up? How many games actually ended up using it in any meaningful manner? Not many. If I had to guess, I would say that it is due to the fact that consumers didn’t buy into it enough for developers to justify implementing it.
Normally I don’t fan the flames on posts like this, but the word unethical being thrown around just rubs me the wrong way. Do yourself, and us, a favor, and learn what unethical business practices are. Hint: Enron is a good place to start looking. Lehman Brothers too.
This isn’t what I wrote and you know it.
So, if you have something to say about what I wrote, stop trolling and answer based on what I wrote, don’t put your words into my mouth. It is easier to put words in someone else’s mouth, but it is also meaningless.
You know what’s unethical: if you have both an AMD and an NVidia card in your box, PhysX is still disabled because it can sense you have an AMD card installed, regardless of whether you have paid for and installed NVidia hardware as well.
*drops mic*
It’s even funnier than that. Someone connected a USB display in his system, Nvidia driver thought that the USB display was in fact a graphics card from another manufacturer, so it disabled the PhysX.
Good read, very unbiased. I think it is great amd is doing this now before it gets out of hand. If nvidia is doing what amd is accusing them of, then nvidia is heading down a real slippery slope. As a gamer, I see that as really bad. All we need is amd to do the same thing they are accusing nvidia of. I don’t want to have two systems to run different games because they don’t run properly on both gpu brands.
Well, AMD has been doing the same thing they claim nvidia is doing. Just sad AMD is whining about the same game both sides are playing.
When did amd do the same thing?
Nvidia has been doing stuff like this for a long, long time. Remember DirectX 10.1 and the first Assassin’s Creed?
Ubisoft and Nvidia are in $$$LOVE$$$ for years.
Ain’t it fair for Ubisoft to optimize the PS4/XB1 versions for AMD Jaguar and the PC version for CUDA cores?
It seems a logical route for all future console+PC optimization.
Years ago I liked NVidia, but their piggy proprietary selfishness turned me off. Still does. I like the AMD shaders much better, though the stuttering can be irritating. I’d like to try a dual titan against my Ares 2 but the absurdity of the price makes it impossible. I’m told that Nvidia is “smoother” than AMD, but lacking a side by side comparison of my own I cannot say. I’d like to see AMD prosper given they tend to be more open with their creative process, and I’d like them to invest more in the driver team.
when was the last time you used your Ares II?
have you tried your Ares II with the 14.xx Catalyst drivers? because the stutter on CrossFire has been gone for half a year now; frame pacing in the drivers solved frametimes for AMD, and many games now have more stutter on nvidia than on AMD cards.
why do people still live with 2012 problems? keep up with the change
It could be that he’s using multiple monitors, the stuttering wasn’t entirely fixed and there are still set ups that will have issues.
I believe Ares II drivers are made by ASUS, and not AMD.
We all know by now how Nvidia works. There is nothing new here other than that AMD is talking about this.
Nvidia will do just about ANYTHING to secure its market share; ethical or unethical, it doesn’t really matter to them.
I can understand them, but I can’t support their tactics.
Though Ryan, if you follow devs from many developers on twitter, you will know they are firmly on AMD’s side. That’s something worth your 3rd page, I think.
I honestly think your further thoughts section is where the truth is. In the end, it’s the game developer that matters.
While I don’t like a hardware company making middleware that may benefit their hardware, I’m not sure this is as bad as AMD makes it seem. The fact that all the current generation consoles use AMD GPUs means they aren’t going to get left behind.
One thing that bothers me about the extra licensing option, where developers can get and update the source code, is that at no point does Nvidia take improvements back. This means not only that every developer that tries to use this (it should be a time saver) has to buy the source code if they want it to run well on AMD, but also that they have to optimise it themselves. This model, while able to get AMD cards performing well, still treats AMD badly.
But on the other hand, if AMD wants to, it can take its code examples, build a middleware of its own and open source the lot, producing a competitor that AMD, Nvidia and all the game publishers can contribute to. That way performance can be optimised and every game and vendor benefits as much as possible from the collaborative effort, while reducing the developer time needed to get these effects working. Seems like a simple enough fix. A free competitor like this would eat Nvidia’s lunch and force GameWorks out.
I still remember the advert from AMD that low-end GPUs run Battlefield 4 with Mantle like a GTX 770.
Every company does that. And on the other hand, AMD owns the consoles now, so AMD is not in a bad place…or they care.
Another game that comes with NVIDIA GPU support is “The Witcher 3”; we will see the same topics again when that releases.
Thank god I game with EVGA
LOL what does that have to do with this?
Maybe you meant Nvidia?
It is time for someone with big GigaBucks to step in and help one of the mobile SoC GPU makers re-enter the desktop GPU market, under the condition that all drivers/middleware/APIs be developed as an open source project, or use the current open-standard Khronos APIs/drivers/middleware. The hardware would be sold into the discrete GPU and server markets with no limits other than those stated above. In other words, the Big GigaBucks outfit that loans the GigaBucks can take these discrete GPUs and utilize them as open GPGPU accelerators for their open server-based web, computing, and VR gaming SKUs! This will free the hardware from proprietary software/API/middleware manipulation, and the manufacturer will have to compete on the ability of the hardware alone; the manufacturer (helped with the loan) will also have the right to sell these GPUs on the open market, but the drivers/APIs/middleware will have to be open sourced. No open sourced drivers/APIs/middleware, no low-interest/no-interest development loan.
To Mr. Rory Read and The Incompetent AMD Employee:
Even as an AMD investor, I don’t see AMD having any logical sense on this.
Do you, or are you able to, understand the “building block” approach? When people choose the building block approach, do the building block providers provide all the details of the building blocks, other than specs, how to use them, and the detailed interface?
Do you, or are you able to, understand “IP”? When people choose to use IP, do the IP providers provide all the details of the IP, other than specs, how to use it, and the detailed interface?
Why not split the game console business with nVidia?
The illogical employee needs to be held accountable for damaging AMD’s image.
Stop wasting time on nonsense and go back to work to produce meaningful earnings, not $0.02 a quarter!!!
Ubisoft doesn’t care about the PC market.
There is a pattern here:
-Far Cry 3: still today some micro stuttering (reason: bad engine, unfixable at the Nvidia/AMD driver team level)
-Assassin’s Creed 4 Black Flag: completely unoptimized, no triple buffering. Proof: “Optimizing games to run more efficiently on PC hardware isn’t important, Assassin’s Creed 4 associate producer Sylvain Trottier has claimed”
“It’s always a question of compromise about the effect, how it looks, and the performance it takes from the system. On PC, usually you don’t really care about the performance, because the idea is that if it’s not [running] fast enough, you buy a bigger GPU. Once you get on console, you can’t have this approach.” (hinting you should buy the $1000 Nvidia GTX Titan, the only video card at the time that ran the game maxed out at 60fps @ 1080p)
-Watch Dogs: no official statements yet (I believe?). But reviewers didn’t receive the PC version early, unlike the PS4 version. TotalBiscuit from YouTube stated that he got the PC version 10 hours before launch (instead of a week early). Of course Ubisoft knew how crap the PC version’s performance (and stuttering) was at launch and apparently didn’t want any bad reviews at launch. A lot can be said about Watch Dogs, but it still runs like total crap, even on a $5000 gaming PC.
PC Perspective: first of all, thanks for this great podcast and article! Please get in contact with Ubisoft and ask them what their problem is.
This would go a long way in Nvidia’s favour if they hadn’t played dirty by offering source and non-source versions and had just offered a source code version. No one, especially the tech industry, likes silly free market concepts being abused like this. Does AMD also do this with Mantle?