During the course of our review of the new Sapphire Nitro R9 390 8GB card earlier this week, a question came up on driver support. For testing the R9 300-series as well as the Fury X cards, AMD provided a new Catalyst 15.15 beta driver. The problem is that this driver would not install on Radeon R9 200-series cards. That's not totally uncommon with new GPU releases, but it does seem a bit odd considering the similarities between the R9 390 and the R9 290, for example.
That meant that in our review we had to use the Catalyst 15.5 beta for the Radeon R9 290X and R9 290 while using the newer Catalyst 15.15 beta for the Sapphire Nitro R9 390. Eyebrows were raised, as you would expect, since any performance differences between the new cards and the old cards would have to take the driver changes into account as well. But since we couldn't install the new driver on the old hardware, we were stuck, and we published what we had.
Since then, a driver with some INI modifications that allow Catalyst 15.15 to be installed on Radeon R9 290X/290 hardware has been built and posted on the Guru3D forums. Today I installed it on the XFX Radeon R9 290 4GB card used in our R9 390 review and re-ran a few game tests to see what changes we saw, if any. This should help address any concern that the updated driver, rather than the hardware changes, was responsible for performance differences.
(Note: I realize that an INI-hacked driver isn't exactly going to pass QA with AMD, but I think the results we are seeing are close enough.)
First up, let's look at Grand Theft Auto V.
In GTA V we see the average frame rate at 2560×1440 go from 39.5 FPS to 40.5 FPS, an increase of about 2-3%. That's minimal, but it is interesting to see how frame rate consistency changes as we move down the sliding scale; pay attention to the orange and pink lines in the FPS by Percentile graph to see what I am referencing. As you move into the slower frame times in our testing, the gap between the 15.5 and 15.15 drivers begins to widen slightly, indicating slightly better frame time consistency with the 15.15 release.
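(For readers curious how the FPS by Percentile data is built: each point is simply a percentile of the recorded frame times converted back into a frame rate, so the right-hand side of the curve reflects your slowest frames. Below is a minimal sketch of the idea in Python; the sample data and NumPy-based helper are illustrative only, not our actual capture pipeline.)

import numpy as np

def fps_by_percentile(frame_times_ms, percentiles=(50, 90, 99)):
    # Each percentile of the frame-time distribution, reported as an equivalent FPS.
    times = np.percentile(np.asarray(frame_times_ms, dtype=float), list(percentiles))
    return [(p, 1000.0 / t) for p, t in zip(percentiles, times)]

# Example: mostly ~25 ms frames (40 FPS) with a few 40 ms stutters (25 FPS).
sample = [25.0] * 95 + [40.0] * 5
for p, fps in fps_by_percentile(sample):
    print(f"{p}th percentile: {fps:.1f} FPS")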
But what about BF4 or Metro: Last Light?
Continue reading our performance check on Catalyst 15.5 and Catalyst 15.15 drivers!
In Battlefield 4 there is no difference in performance on the R9 290 4GB card when moving from the 15.5 driver to the 15.15 release.
The same is true for Metro: Last Light – performance is essentially identical between the new and older Catalyst beta driver on the Radeon R9 290.
So what's the takeaway? While in Grand Theft Auto V there is some performance delta between Catalyst 15.5 and Catalyst 15.15 on the R9 290, that is not the case in Battlefield 4 or Metro: Last Light. (Note: the 4K results, which I did run, showed behavior identical to the 2560×1440 testing shown above.) It makes sense that GTA V would see some improvement in gaming experience with a newer driver, as it is still a new title getting updates and fixes from both AMD and NVIDIA. BF4 and Metro: LL are quite a bit longer in the tooth, and thus the lack of change is expected.
But, there does not appear to be any kind of smoking gun to point to that would indicate AMD was purposefully attempting to improve its stance through driver manipulation. And that's all I wanted to make sure of with this testing today and this story. I obviously didn't test every game that users are playing today, but all indications are that AMD is in the clear.
Bleh. I guess it’s what everybody expected. I sincerely hope the FuryX delivers.
Dial the core clock and memory to the same speeds and you get the same performance in testing!
Which means absolutely zero improvement. Liar AMD just increased the clocks, fitted higher-spec RAM modules, and called it a new product.
Nvidia LIED about the 3.5 GB 970.
Nvidia LIED about bumpgate.
You were saying?
That you are charlie’s sock puppet.
You are one to talk, as you also have a big arm shoved up your tuchus, look it moves its fingers and your sock mouth moves!
I don't get it, what are you upset about?
There is improvement; many other tests suggest that. Even in these tests the 390 is better than the 290, and bigger gains show up in games using heavy tessellation like The Witcher 3 (basically GameWorks, or more specifically HairWorks), where the 300 series gets a 10% boost on other sites. So AMD didn't lie: the 300 series got faster tessellation, slightly more compute, a 50 MHz clock bump, and double the memory.
What the PCPer test lacks is a GameWorks title using heavy tessellation, like The Witcher 3 or Crysis 3.
PCgameshardware tested the 15.5 and 15.15 drivers with a 290X to look specifically at tessellation differences and found that the 15.15 drivers improved 290X tessellation noticeably. That could explain why the 390 cards do better on Gameworks and tessellation titles.
http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Specials/Radeon-R9-390X-Test-1162303/
And look at the R9 285 now, bitch-slapping them all above x16 tessellation with that "3xx series" beta driver.
So this should probably explain the 40% performance improvement for the 390X (15.15 beta) over the 290X (15.5 beta) in The Witcher 3 seen at HardOCP:
http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/3#.VYYls_mqpFx
The R9 285 has a newer chip design than the R9 390X: Tonga (GCN gen 3) vs. Hawaii (GCN gen 2). Tonga has a more powerful geometry engine than Hawaii and has always had more tessellation performance. It doesn't have much to do with drivers.
Right, although it does benefit the most (by quite a big margin) from the new driver, according to the PCGH tessellation graph.
You are clueless. You get more performance, lower temps, less noise and more VRAM than a reference R9 390. It’s actually a quite different product. No one cares about running it at the same clocks because no customer will do so. The only thing that hasn’t changed is the basic chip design. But there’s nothing wrong with it. Nvidia did the same in the past, e.g. GTX 680 -> GTX 770. It’s more important for AMD to focus on the 14/16nm next generation. Fiji is more than enough for now.
Not really; they haven't even changed the VRAM modules. They simply overclocked them from 5 GHz to 6 GHz. I've seen photos posted around, and the modules they are using are still rated for 5 GHz, just overclocked, leaving buyers with even less overclocking headroom.
Nvidia instead uses 7 GHz modules that we can overclock above 8 GHz. I wonder why AMD doesn't use those? I guess they cost more and AMD wants to lower production costs; nothing else comes to mind.
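(For context on what that memory clock change is worth: on Hawaii's 512-bit bus, the jump from a 5 Gbps to a 6 Gbps effective rate accounts for the headline bandwidth difference between the cards. A quick back-of-the-envelope sketch follows, using the published effective rates for the reference R9 290 and R9 390; the helper below is purely illustrative.)

def memory_bandwidth_gbs(bus_width_bits, effective_rate_gbps):
    # Peak bandwidth in GB/s = (bus width in bytes) * effective per-pin data rate in Gbps.
    return bus_width_bits / 8 * effective_rate_gbps

print(memory_bandwidth_gbs(512, 5.0))  # R9 290/290X: 320.0 GB/s
print(memory_bandwidth_gbs(512, 6.0))  # R9 390/390X: 384.0 GB/s, a 20% increase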
All AMD had to do was say it was a bug, like Nvidia did with the Kepler cards, and everyone would have believed them, right?
When are we going to see the Kepler test?
When Nvidia did it, it was perfectly okay and everyone believed them.
If AMD had done that there would have been OUTRAGE. Mostly from Nvidia fans.
Well when you have the best product people give you more leeway…Right? I don’t think there’s as much fanboyism as many seem to believe.
Yeah, seemingly no one has come to the defense of Kepler owners, whose cards seem to be getting slower by the year. The 780 was supposed to compete with the 290X, but it's lagging behind even the 280X in the latest tests.
Because optimizations for Kepler were not yet present. First comes support for the current tech, then support for the previous generation.
It's the old trade-off between complexity and efficiency. Either you have complex, massive hardware that can somewhat handle new situations but is not that efficient for a particular case, or you build highly efficient, relatively simple hardware that is then dependent on software to handle new situations.
Also, no evidence was ever provided for the alleged crippling by Nvidia. Just a whole lot of noise and stupidity from the ignorant.
I think many Nvidia users with 700-series cards would love to see a driver comparison like the above, comparing a few of the latest GeForce drivers and how they perform with 700-series cards; for example, whether there is a performance decrease, or whether the problems with some GameWorks titles like The Witcher 3 were fixed as promised.
How about you use your Kepler card, do the tests, and share your results?
You do own a Kepler card, right?
Oh! That rage!
I am guessing 12-15 years old. Am I right?
So, no, you don't own a Kepler GPU, or no, you're too lazy to run a few tests yourself, but not lazy enough to ask someone else to run them for you?
lol, it's literally PCPer's business to run tests…literally. I own a GTX 780 and wouldn't run a test for any of you, even if it would save your life.
So, go buy one and you run the test, or are you too lazy? Waffles
"lol, it's literally PCPer's business to run tests…literally."
Don’t think I agree, but I’m sure if JohnGR contracts them to run tests for him, they’ll do just that.
"I own a GTX 780"
Good on you, mate!
“and wouldn’t run a test for any of you, even if it would save your life.”
I’m sure JohnGR is going to be devastated by your disregard for his life.
“So, go buy one and you run the test, or are you too lazy?”
My point exactly.
“Waffles”
Yummy!
Skirts don't look good on men.
I'm not here for JohnGR's health, or yours for that matter.
Driver manipulation seems more like an Nvidia thing, but still, thanks for checking and being thorough.
Nope. (Apart from title-specific fun, which both AMD and Nvidia have had, I never saw any evidence of such a thing.)
Then you have not looked hard enough… they have both been caught.
I just like to say that there is another particular website… and I'll be honest… they have been slacking severely lately. I don't even go there anymore because they are slow with articles, or there are no articles at all. Not sure what's going on over there. It's pretty pathetic for a top-tier tech website.
PCPer has been on point with many reviews and tests, including multiple tests like this one… feeding my need for PC-related tech stuff in this ever-so-fast-moving industry. Keep up the great work. You guys have come a long way and are, IMO, pretty much the best tech website there is now.
" NVIDIA NOT HAVE HDMI 2.0 "
" image quality mode that allows for a 4K@60Hz signal to fit within HDMI 1.4’s 8.16Gbps bandwidth limit. "
No, THIS card does not have HDMI 2.0 without the appropriate chip. They do not have the SiI9777 chip.
http://www.siliconimage.com/Company/…with_HDCP_2_2/
For a video card to talk to a display or a Blu-ray device, each device needs a chip, and that chip needs the HDMI 2.0 protocol. ONLY Silicon Image makes that chip for the HDMI organization.
The Nvidia 960/970/980 do not have the SiI9777 chip:
no HDCP 2.2
no HDR
no 18 Gbps bandwidth
no DCI-P3 color?
This is the 980 Ti card; where is the SiI9777 chip?
https://pcper.com/files/imagecach…VBA9_1_4_l.jpg
Quote:
This is very easy to prove with a simple detailed image for 4:4:4 and only a blind man cannot tell the different between 30 and 60hz on a PC.
Yes, this is very easy to prove: ask NVIDIA for written confirmation that their video cards have:
the SiI9777 chip on the video card
HDCP 2.2
HDR
18 Gbps bandwidth
DCI-P3 color
I tried three times to get an answer from NVIDIA and got no reply!
What do they have to hide?
Quote:
player002
You are bringing up an article on Sony's XBR 55X900A; this TV was made in 2013 when it had only HDMI 1.4 and no chip, so over HDMI 1.4 it can only do 4K at 30Hz. So how can it do HDMI 2.0? How can it have HDCP 2.2? How can it have 18 Gbps? Can you connect this screen to NETFLIX?
http://www.anandtech.com/show/8191/n…upport-kind-of
This TV was manufactured before the SiI9777 chip was introduced in 2014.
http://www.siliconimage.com/Company/…with_HDCP_2_2/
Quote:
Lacking the available bandwidth to fully support 4K@60Hz until the arrival of HDMI 2.0, the latest crop of 4K TVs such as the Sony XBR 55X900A and Samsung UE40HU6900 have implemented what amounts to a lower image quality mode that allows for a 4K@60Hz signal to fit within HDMI 1.4’s 8.16Gbps bandwidth limit.
http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of
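(For what it's worth, the arithmetic behind that quoted 8.16 Gbps figure is easy to check. The sketch below assumes the standard CTA-861 4K@60Hz timing, i.e. 4400 x 2250 total pixels including blanking for a 594 MHz pixel clock, and 8 bits per color component; it is an illustration, not exact HDMI spec math.)

HDMI_1_4_PAYLOAD_GBPS = 8.16        # effective data rate quoted above
PIXEL_CLOCK_HZ = 4400 * 2250 * 60   # 594 MHz for the 3840x2160@60 timing

def signal_rate_gbps(bits_per_pixel):
    # Uncompressed video data rate in Gbps for the timing above.
    return PIXEL_CLOCK_HZ * bits_per_pixel / 1e9

print(signal_rate_gbps(24))  # 4:4:4 at 8bpc: ~14.26 Gbps, well over 8.16, does not fit
print(signal_rate_gbps(12))  # 4:2:0 at 8bpc: ~7.13 Gbps, fits under HDMI 1.4's 8.16 Gbps

That is exactly the "lower image quality mode" the AnandTech article describes: halving the chroma data gets a 4K@60Hz signal under the HDMI 1.4 limit.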
Maxwell, son, Maxwell.
http://oi57.tinypic.com/qy7gcm.jpg
Here you have an attached letter from AMD about HDMI 2.0, unlike NVIDIA's fake claims!
ASK NVIDIA IF THEY HAVE:
HDCP 2.2?
DCI-P3 color?
The SiI9777 chip?
HDR?
18 Gbps bandwidth?
Reviews that don't check the truth are no reviews at all!
WTF are you talking about? That image you posted talks about an ADAPTER from DP 1.2a to HDMI 2.0. Nowhere does it say the AMD card has HDMI 2.0 support natively.
This same adapter can be used with ANY card supporting DP 1.2.
He's right on one point though: Maxwell can't do HDCP 2.2 (except for GM206/GTX 960).
No. There is no HDCP 2.2 on the GTX 960/970/980, so they can't play Ultra HD Blu-ray or NETFLIX.
And the adapter can be used on DP 1.2a, not DP 1.2. It clearly says 1.2a.
MSI is saying their 380, 390 and 390X all have HDMI 2.0, yet AMD admits it is only HDMI 1.4 bandwidth:
http://oi57.tinypic.com/27wsoxt.jpg
Soon
Dear Dave,
Your service request : SR #{ticketno:[]} has been reviewed and updated.
Response and Service Request History:
Active DisplayPort 1.2a-to-HDMI 2.0 adapters are set to debut this summer. These adapters will enable any graphics cards with DP1.2a outputs to deliver 4K@60Hz gaming on UHD televisions that support HDMI 2.0.
Fury X is an enthusiast graphics card designed to provide multi-display 4K gaming at 60Hz.
In order to update this service request, please respond, leaving the service request reference intact.
Best regards,
AMD Global Customer Care
"But, there does not appear to be any kind of smoking gun to point to that would indicate AMD was purposefully attempting to improve its stance through driver manipulation. And that’s all I wanted to make sure of with this testing today and this story. I obviously didn’t test every game that users are playing today, but all indications are that AMD is in the clear."
There are a lot of disappointed Kepler users out there who feel their performance is being negatively manipulated through drivers; are PCPer going to look into that too?
Kepler is not negatively manipulated by Nvidia.
They release a new lineup and they work on it; people who want better drivers need to upgrade.
I think there is a good angle to AMD's rebrands: users get to keep driver support longer 😀 on their cards, since it takes AMD four years to phase them out xD.
No, but seriously, most benchmarks now put the 290X way ahead of the 780 Ti and Titan, which is funny.
The answer is no, they will not look into Nvidia's latest Maxwell drivers causing Kepler issues.
Posit your own reason.
Quote: During the course of our review of the new Sapphire Nitro R9 390 8GB card earlier this week, a question came up on driver support.
The unending whining from AMD fanboys about a great conspiracy gets sugar-coated into "a question".
I don’t think AMD has purposefully tried to nerf 200 series benchmarks, but based on my testing there is variance. This is a different driver where the performance increase varies from nothing (BF4) to margin of error stuff (GTA) to… well, something more.
I could be completely wrong of course, but as a friendly recommendation to PCPer, I’d suggest trying AC Unity at ultra high 1440p with FXAA on both drivers. I see a 15% increase on 15.15 vs 15.5 beta on my R9 290X. A lot of the stutter seems considerably reduced.
Haven’t got the numbers to hand but Far Cry 4 is also worth looking at.
Well, AMD gave the market its new Fury and spent its limited engineering budget on getting a new, competitive performance SKU out there that performs well with 4GB of memory. AMD does not have the revenue to rework its entire product stack top to bottom in a single stroke. AMD has offered some improvements on its older microarchitectures, and it will take some time for the Fury technology to work its way down the product stack. Give AMD time to compete with the limited funds and resources it has for both GPU and CPU engineering; AMD is doing a great job with the little R&D budget it has! If there were a metric for year-to-year innovation per dollar of R&D spent, AMD would be the winner in that category. AMD does more engineering while hanging on by a thread just to stay in business than anyone in the history of the PC market.
Fury is to market, Zen is incoming, and there are some Carrizo-based laptops being offered; they had better get some better screen resolution options, or we'll know the fix was in from a certain CPU/SoC monopoly! AMD's pricing for its Fury line is definitely more competitive, and for the consumer that is one good thing.
P.S. Blender 3D is getting support for the latest AMD GCN graphics hardware, so GPU Cycles rendering support will make Carrizo-based laptops more attractive for Blender rendering on AMD APUs. I know from a mesh-modeling standpoint that AMD's and Nvidia's GPUs handle high-resolution mesh models and scenes much better than Intel's GPU/SoC SKUs with their limited SP/execution unit counts. So I'm going to be looking at laptops with Carrizo APUs, and even laptops with Carrizo APUs combined with AMD discrete mobile GPUs, since once sufficient Blender rendering (ray tracing included) can be done on the GPU, there will be no need for an expensive quad-core i7 laptop for Blender 3D usage and medium graphics workloads.
Ryan, thank you for the testing.
DRIVERS DRIVERS DRIVERS
GTX 970 EVGA or R9 390 MSI??????
GTX 970 EVGA or R9 390 MSI