Mid-June kicked up a storm of poop across the internet when IGN broke the story of AMD's optimizations for Frostbite 3. It was reported that NVIDIA would not receive sample code for those games until after they launched. The article was later updated with a statement from AMD: "… the AMD Gaming Evolved program undertakes no efforts to prevent our competition from optimizing for games before their release."
Now, I assume, the confusion was caused by the then-unannounced Mantle.
And, as it turns out, NVIDIA did receive the code for Battlefield 4 prior to launch. On Monday, the company launched its 331.58 WHQL-certified drivers, which are optimized for Batman: Arkham Origins and Battlefield 4. According to the release notes, you should even be able to use SLI out of the gate. If, on the other hand, you are a Civilization V player, HBAO+ should enhance your shadowing.
They also added a DX11 SLI profile for Watch Dogs… awkwarrrrrd.
Check out the blog post at GeForce.com for a bit more information, read the release notes, or just head over to the drivers page. If you have GeForce Experience installed, it has probably already asked you to update.
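If you would rather confirm the installed driver version from code than wait for GeForce Experience to nag you, NVIDIA's NVML library exposes it. A minimal sketch, assuming the NVML header and library that ship with the driver are available (and, on Linux, linking against libnvidia-ml):

```cpp
// Query the installed NVIDIA driver version via NVML.
// Build example (Linux): g++ check_driver.cpp -o check_driver -lnvidia-ml
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) {
        std::fprintf(stderr, "NVML init failed -- is the NVIDIA driver installed?\n");
        return 1;
    }
    char version[NVML_SYSTEM_DRIVER_VERSION_BUFFER_SIZE];
    if (nvmlSystemGetDriverVersion(version, sizeof(version)) == NVML_SUCCESS)
        std::printf("Installed driver: %s\n", version);  // e.g. "331.58"
    nvmlShutdown();
    return 0;
}
```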
I said it in the past, and I am saying it now: AMD and Nvidia are totally different as companies. AMD will not screw anyone who doesn't use its products, or who is using its products alongside Nvidia's.
On the other hand, we all know how Nvidia works and how much Nvidia respects competition and its customers. It simply DOESN'T.
you mad bro?
Furious :p
How do you know that is what would have happened? If the rumor hadn't come out saying Nvidia was not going to get final code, which HAS happened before (Tomb Raider), it could have ended up that Nvidia did get shafted on the final build. As for you saying Nvidia would have done the same, do you have proof of that? AMD is a company looking to make profits just like anyone else. Mantle is tech that no one else will use; even if it's "open source" as they claim, that's just a PR move, since they know Nvidia or any other makers (if there were any) wouldn't touch it either. I bet Mantle will be even less supported than PhysX is.
How is the performance of Nvidia cards in Tomb Raider? Can you use AMD's TressFX with Nvidia cards? Do I have to say more? Maybe remind you of DX10.1 and Assassin's Creed? How about the locks on PhysX? All the Nvidia fanboys were smiling over CUDA and PhysX. Now, with Mantle, something that isn't even here yet, everyone is worried about the future of PC gaming.
Come on…
I've seen TressFX, and it's not even noticeable when playing Tomb Raider (yes, I do own the game). PhysX, in most games I play, is much more noticeable when you go from it being off to on. As for Mantle, I can already say it won't happen outside of a few games that AMD writes a massive check for, due to the low-level access it provides, which could be a PR nightmare when a game using it has a small bug that crashes machines or even messes up someone's video card.
So you admit that PhysX creates two categories of gamers, with one having a better gaming experience, and that it fragments the PC gaming market. You are also saying that Mantle will not succeed and that TressFX does not change the gaming experience. And you are speculating that Mantle will create the same problems for Nvidia that Nvidia has been creating all these years for AMD.
So, what exactly is your point? That only Nvidia can screw everybody, and only Nvidia fanboys like you can have a better gaming experience?
Mantle ditches DirectX in favor of AMD's own API, which also raises the question of what graphics changes need to be made, since DirectX is pretty much out of the picture.
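Mantle's actual API isn't public yet, so, purely as an illustration of what "ditching DirectX" means, here is a mock of the low-level idea; every name below is invented:

```cpp
// Purely illustrative -- Mantle's real API was not public when this was
// written, so every name below is invented. The point is the shape of a
// low-level API: the application records its own command buffers and decides
// when to submit them, instead of going through DirectX's driver-managed,
// validated-per-draw path.
#include <cstdio>

struct CmdBuffer { int draws = 0; };  // hypothetical command-buffer handle

void cmd_begin(CmdBuffer&)              { std::puts("begin recording"); }
void cmd_draw(CmdBuffer& cb, int verts) { ++cb.draws; std::printf("draw %d vertices\n", verts); }
void cmd_end(CmdBuffer&)                { std::puts("end recording"); }
void queue_submit(const CmdBuffer& cb)  { std::printf("submit %d draws to the GPU queue\n", cb.draws); }

int main() {
    CmdBuffer cb;
    cmd_begin(cb);     // pipeline state would be baked up front, not per draw
    cmd_draw(cb, 36);  // thin call: minimal driver work on the hot path
    cmd_end(cb);
    queue_submit(cb);  // the app, not the runtime, controls when work reaches the GPU
    return 0;
}
```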
You must be having nightmares when you sleep at night, seeing Mantle as a big monster eating your Nvidia graphics card.
Anyway, you keep saying the same stuff, ignoring what I write. That indicates either someone who doesn't know anything and is just repeating what people are saying on Nvidia boards, or a troll.
Have a nice day.
I'm a troll, and I'm here to say that PhysX, TressFX, Mantle: it's all gimmicks. Outside of the unsustainable triple-A titles with multimillion-dollar budgets, nothing will use any of these technologies, ever. All this tech is just speeding the industry toward the inevitable video game crash.
When it comes to what gamers are actually playing, none of this really matters.
Reading what you say, you have no clue what is going on. You should start reading up on things and learn what Mantle is before you start talking. I know what Mantle is; clearly you have no clue.
About the PhysX lockout: IMO, AMD was asking for it. Before the lockout, Nvidia did offer to license the tech to AMD so it could run on their GPUs. I heard a few conspiracy theories here and there, but in the end AMD rejected the offer, since they were working to make Bullet with OpenCL their weapon against Nvidia's PhysX. It has been four years since then, but to this day I still haven't seen a game with GPU-accelerated physics other than PhysX.
AMD was always incapable of promoting anything, with the exception of x86-64, so it's no wonder Bullet Physics is nowhere to be seen. When Nvidia bought Ageia I was happy, because I knew they had the potential to do something good with it. Unfortunately, they locked it.
PhysX would be everywhere today, and many people would be using Nvidia cards as secondary cards, if Nvidia weren't locking it.
Now, about what you are saying: if AMD is not playing the game the way you meant it to be played, you don't punish the customer. And who gives you that right?
For example, let's say that I had an 8800GT and decided to buy an HD 5870 as an upgrade. I am STILL Nvidia's customer. Yes, I just bought an AMD product, but that doesn't change the fact that I am ALSO Nvidia's customer. So why punish me by deactivating PhysX? And who gives Nvidia the right to have an opinion about what my next upgrade should be?
Another example: what would you, and everybody else, say if Seagate announced that "for our own reasons, if a WD hard disk is primary in the system, all Seagate hard disks will run in IDE compatibility mode"?
It is not very different from what Nvidia is doing today. They don't even let you try PhysX as a beta when an AMD card is the primary card. If AMD doesn't play nice, then just don't test PhysX with AMD cards. Keep saying that you don't support it and that you accept no responsibility for problems on AMD hardware. Don't lock it; offer it without support.
What Nvidia does shows that Nvidia does not respect its own customers.
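To be concrete about "offer it without support": a game already has to handle the no-GPU case. A rough sketch against the PhysX 3.x SDK (exact signatures vary by SDK version), where the scene simply falls back to the CPU dispatcher when no valid CUDA context exists:

```cpp
// Rough sketch against the PhysX 3.x SDK -- exact signatures vary by version.
// Try to create a CUDA context for GPU PhysX; fall back to the CPU otherwise.
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // May fail on non-NVIDIA hardware or when GPU PhysX is unavailable.
    PxCudaContextManagerDesc cudaDesc;
    PxCudaContextManager* cuda =
        PxCreateCudaContextManager(*foundation, cudaDesc, NULL);

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);  // always required
    if (cuda && cuda->contextIsValid())
        sceneDesc.gpuDispatcher = cuda->getGpuDispatcher();     // GPU path if present
    // No valid CUDA context? The scene still works -- it just simulates on the CPU.
    PxScene* scene = physics->createScene(sceneDesc);
    return scene ? 0 : 1;
}
```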
Then don't buy their shit.
Haha… yeah, I wasn't expecting to write more than a post here.
"AMD was always incapable of promoting anything, with the exception of x86-64, so it's no wonder Bullet Physics is nowhere to be seen. When Nvidia bought Ageia I was happy, because I knew they had the potential to do something good with it. Unfortunately, they locked it."
Developing and supporting software is NOT free. From a quick Google search, Nvidia did say they would license it to AMD if they were serious about it, but AMD never came forward. Since AMD was the one that refused, any "Nvidia locked AMD out of PhysX" argument is invalid, because it was AMD that refused to license it. I think Ageia wouldn't have survived if Nvidia hadn't bought it. The card was kind of just an extra item, and barely any games even supported it before Nvidia bought it.
That's the problem. Even if Nvidia said so, people would still blame Nvidia, because PhysX is Nvidia software; hence it would be Nvidia's responsibility that the feature works even with an AMD card as the primary card. So they locked PhysX and forced people to deal with hacked drivers to use it. With a hacked driver, they can't blame Nvidia if anything goes wrong.
I guess I'll stay away from these drivers; they bumped my GTX 780's idle VDDC to 1.4 V. Good luck trying to stay within the power target with that, not to mention keeping it cool.
I have a GTX 780 ACX, and no problem like that here: 0.85 V idle and 1.112 V under load at 1,100 MHz.
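For anyone comparing numbers, NVML can report temperature and board power draw from code; core voltage (VDDC) is not exposed there, so a tool like GPU-Z is still needed for that. A minimal sketch, assuming a single GPU at index 0:

```cpp
// Print GPU temperature and board power draw via NVML. Core voltage (VDDC)
// is not exposed by NVML, so GPU-Z or similar is still needed for that.
// Build example (Linux): g++ gpumon.cpp -o gpumon -lnvidia-ml
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {  // assumes GPU 0
        unsigned int tempC = 0, milliwatts = 0;
        if (nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &tempC) == NVML_SUCCESS)
            std::printf("GPU temp: %u C\n", tempC);
        // Power readout may be unsupported on some GeForce boards.
        if (nvmlDeviceGetPowerUsage(dev, &milliwatts) == NVML_SUCCESS)
            std::printf("Power:    %.1f W\n", milliwatts / 1000.0);
    }
    nvmlShutdown();
    return 0;
}
```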