GPU Comparisons and Testing Setup
Testing Configuration
The specifications for our testing system haven't changed.
Test System Setup

| Component | Configuration |
|---|---|
| CPU | Intel Core i7-3960X Sandy Bridge-E |
| Motherboard | ASUS P9X79 Deluxe |
| Memory | Corsair Dominator DDR3-1600 16GB |
| Hard Drive | OCZ Agility 4 256GB SSD |
| Sound Card | On-board |
| Graphics Card | AMD Radeon R9 Fury 4GB, AMD Radeon R9 Fury X 4GB, AMD Radeon R9 290X 4GB, NVIDIA GeForce GTX 980 Ti 6GB, NVIDIA GeForce GTX 980 4GB |
| Graphics Drivers | AMD: 15.7, NVIDIA: 353.30 |
| Power Supply | Corsair AX1200i |
| Operating System | Windows 8.1 Pro x64 |
What you should be watching for
- Sapphire R9 Fury vs AMD Fury X – How much does the drop from 4096 stream processors to 3584 really affect gaming performance?
- Sapphire R9 Fury vs GeForce GTX 980 – This is the nearest direct competition for the new AMD Fury cards from the NVIDIA lineup. Does the GTX 980 have a chance here even though it sells for $50 less?
- Sapphire R9 Fury CrossFire – How does this new card scale? Is this the best option for enthusiast gamers looking for a multi-GPU solution?
Frame Rating: Our Testing Process
If you aren't familiar with it, you should probably do a little research into our testing methodology, as it is quite different from others you may see online. Rather than using FRAPS to measure frame rates or frame times, we use a secondary PC to capture the output from the tested graphics card directly, then post-process the resulting video to determine frame rates, frame times, frame variance and much more.
This amount of data can be pretty confusing if you attempt to read it without the proper background, but I strongly believe that the results we present paint a much more thorough picture of performance than other options. So please, read up on the full discussion about our Frame Rating methods before moving forward!
While there are literally dozens of files created for each “run” of benchmarks, there are several resulting graphs that FCAT produces, as well as several more that we generate with additional code of our own.
If you don't need the example graphs and explanations below, you can jump straight to the benchmark results now!
The PCPER FRAPS File
While the default versions of the scripts from NVIDIA produce their own graphs, I have modified and added to them in a few ways to produce additional data for our readers. The first file shows a subset of the data from the RUN file (described below): the average frame rate over time as defined by FRAPS, though we combine all of the GPUs being compared into a single graph. This basically emulates the data we have been showing you for the past several years.
The PCPER Observed FPS File
This graph takes a different subset of data points and plots them similarly to the FRAPS file above, but this time we are looking at the “observed” average frame rates, shown as the blue bars in the RUN file described below. This takes out the dropped frames and runts, giving you the performance metric that actually matters: how many frames are being shown to the gamer to improve the animation sequences.
As you’ll see in our full results on the coming pages, a big difference between the FRAPS FPS graph and the Observed FPS graph indicates cases where the gamer is likely not getting the full benefit of the hardware investment in their PC.
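To make the distinction concrete, here is a minimal Python sketch of the idea. This is illustrative only, not our actual FCAT/Frame Rating code: the frame-record format and the 21-scanline runt cutoff are assumptions for the example.

```python
# Hypothetical frame records extracted from a captured video:
# (display_time_ms, visible_scanlines). A dropped frame contributes 0 scanlines.
RUNT_THRESHOLD = 21  # assumed cutoff: frames thinner than this count as runts

def fraps_fps(frames):
    # FRAPS-style average: every rendered frame counts, runts and drops included.
    total_ms = sum(t for t, _ in frames)
    return 1000.0 * len(frames) / total_ms

def observed_fps(frames):
    # Observed average: only frames that actually reached the display count,
    # while the run still spans the same wall-clock time, so the rate drops.
    total_ms = sum(t for t, _ in frames)
    shown = [f for f in frames if f[1] >= RUNT_THRESHOLD]
    return 1000.0 * len(shown) / total_ms

frames = [(8.3, 540), (8.3, 5), (8.3, 540), (8.3, 0)]  # full, runt, full, drop
print(fraps_fps(frames))     # ~120 FPS by FRAPS-style counting
print(observed_fps(frames))  # ~60 FPS actually seen by the gamer
```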
The PLOT File
The primary file generated from the extracted data is a plot of calculated frame times, including runts. The numbers here represent the amount of time each frame appears on the screen for the user; a “thinner” line across the time span represents consistent frame times, which should produce the smoothest animation for the gamer. A “wider” line, or one with a lot of peaks and valleys, indicates much more variance and is likely caused by a lot of runts being displayed.
The RUN File
While the graphs above show combined results for a set of cards being compared, the RUN file shows the results from a single card on a single run. It is in this graph that you can see interesting data about runts, drops, average frame rate and the actual frame rate of your gaming experience.
For tests that show no runts or drops, the data is pretty clean. This is the familiar frame rate over time graph that has become the standard for performance evaluation on graphics cards.
A test that does have runts and drops will look much different. The black bar labeled FRAPS indicates the average frame rate over time that traditional testing would show, counting the drops and runts in the equation, as the FRAPS FPS measurement does. Any area in red is a dropped frame: the wider the band of red, the more colored bars from our overlay were missing from the captured video file, meaning the gamer never saw those frames in any form.
The wide yellow area represents runts: the thin bands of color in our captured video that we have determined do not add to the animation of the image on the screen. The larger the area of yellow, the more often those runts are appearing.
Finally, the blue line is the measured FPS over each second after removing the runts and drops. We are going to be calling this metric the “observed frame rate” as it measures the actual speed of the animation that the gamer experiences.
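As a rough sketch of how that blue line could be derived, here is an assumed reconstruction in Python (not the production scripts); the frame-record format and runt threshold follow the same illustrative assumptions as the sketch above.

```python
from collections import Counter

def observed_fps_per_second(frames, runt_threshold=21):
    """Count frames that survive the drop/runt filter in each one-second bin
    of cumulative display time: the 'observed frame rate' line in a RUN file.
    `frames` is a list of (display_time_ms, visible_scanlines) tuples; the
    runt threshold is an assumed value for illustration."""
    counts = Counter()
    elapsed_ms = 0.0
    for time_ms, scanlines in frames:
        if scanlines >= runt_threshold:  # drops show 0 scanlines, runts too few
            counts[int(elapsed_ms // 1000)] += 1
        elapsed_ms += time_ms
    last_second = int(elapsed_ms // 1000)
    return [counts.get(s, 0) for s in range(last_second + 1)]
```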
The PERcentile File
Scott introduced the idea of frame time percentiles months ago, but now that we have different data using direct capture as opposed to FRAPS, the results might be even more telling. In this case, FCAT is showing percentiles not by frame time but by instantaneous FPS. This tells you the minimum frame rate that will appear on the screen for any given percentage of time during our benchmark run. The 50th percentile should be very close to the average frame rate of the benchmark, but as we creep closer to 100% we see how the frame rate is affected.
The closer this line is to perfectly flat the better, as that would mean we are running at a constant frame rate the entire time. A steep decline on the right-hand side tells us that frame times are varying more and more frequently and might indicate potential stutter in the animation.
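A simplified way to picture the PER calculation (an assumed reconstruction, not FCAT's own code): convert each frame time to an instantaneous FPS, then for each percentile point report the frame rate that is met or exceeded that fraction of the time.

```python
import numpy as np

def fps_percentiles(frame_times_ms, points=(50, 90, 95, 99)):
    """For each point X, return the frame rate the card meets or beats
    X percent of the time, i.e. the (100 - X)th percentile of the
    instantaneous FPS distribution."""
    inst_fps = 1000.0 / np.asarray(frame_times_ms, dtype=float)
    return {x: float(np.percentile(inst_fps, 100 - x)) for x in points}

# A perfectly flat percentile line keeps every point near the average;
# here the 50th point stays near 60 FPS while the 99th drops to ~30 FPS.
print(fps_percentiles([16.7] * 95 + [33.3] * 5))
```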
The PCPER Frame Time Variance File
Of all the data we are presenting, this is probably the one that needs the most discussion. In an attempt to create a new metric for gaming and graphics performance, I wanted to find a way to define stutter based on the data sets we had collected. As I mentioned earlier, we can define a single stutter as a variance level between t_game and t_display. This variance can be introduced in t_game, in t_display, or at both levels. Since we can currently only reliably test the t_display rate, how can we create a definition of stutter that makes sense and that can be applied across multiple games and platforms?
We define a single frame variance as the difference between the current frame time and the previous frame time: how consistent are the two frames presented to the gamer? However, as I found in my testing, plotting the value of this frame variance is nearly a perfect match to the data presented by the minimum FPS (PER) file created by FCAT. To be more specific, stutter is only perceived when there is a break from the previous animation frame rates.
Our current running theory for a stutter evaluation is this: find the current frame time variance by comparing the current frame time to the running average of the frame times of the previous 20 frames. Then, by sorting these variance values and plotting them in percentile form, we can get an interesting look at potential stutter. Comparing the frame times to a running average rather than just to the previous frame should prevent false positives from legitimate performance peaks or valleys found when moving from a highly compute-intensive scene to a lighter one.
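In code, that running theory might look like the following minimal sketch (an assumed reconstruction of the method just described, not our actual tooling); using the absolute difference is one reasonable reading of “variance” here.

```python
import numpy as np

WINDOW = 20  # number of previous frames in the running average

def stutter_percentile_curve(frame_times_ms):
    """For each frame, take the difference between its frame time and the
    running average of the previous WINDOW frame times, then sort the values
    so they can be plotted on a percentile axis (the ISU curve)."""
    times = np.asarray(frame_times_ms, dtype=float)
    variances = [abs(times[i] - times[i - WINDOW:i].mean())
                 for i in range(WINDOW, len(times))]
    return np.sort(np.asarray(variances))  # x-axis: index / len(result)
```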
While we are still trying to figure out if this is the best way to visualize stutter in a game, we have seen enough evidence in our gameplay testing, and in comparing this data to other results generated through our Frame Rating system, to be reasonably confident in our assertions. So much so, in fact, that I am going to call this data the PCPER ISU; beer fans will appreciate the acronym, which stands for International Stutter Units.
To compare these results, you want to see a line that is as close to the 0ms mark as possible, indicating very little frame time variance when compared to the running average of previous frames. There will be some inevitable incline as we reach the 90+ percentile, but that is expected with any gameplay sequence that varies from scene to scene. What we do not want to see is a sharper upward slope, which would indicate higher frame variance (ISU) and could be a sign that the game suffers from microstuttering and hitching problems.
Sapphire always makes sexy-looking cards.
http://images.bit-tech.net/content_images/2014/06/sapphire-radeon-r9-290-vapor-x-oc-review/sap290vxoc-1b.jpg
My baby!
These are better than I expected, but at this point I'm just waiting to see the price and performance of the Nano. With a Fiji XT core on something that tiny and using only 175W-ish, that's gotta be clocked super low.
Crossfire performance is massive. Nvidia stands no chance with their SLI in many games, and the others are close.
I agree that the performance is better in CF, but the problem is that the majority of people only want ONE card. With that said, this does great as a single card.
~AMD fan.
When the Fury X2 comes out I’m sold.
Price/performance is still SHIT.
I see no point in buying anything besides a 970 or 290, tbh.
or using two … if you really think you need it.
Compared to the 980, a 10% increase in price for a 5%-30% increase in performance is not shit. If you think there is no point to these cards, then you obviously don't have to buy one. I am debating getting a Fury to replace my 7950 for 1440p gaming.
Get two 290s/970s for the same price.
I can’t be arsed to mess with Crossfire/SLI, and my case would be a cramped nightmare with two full size cards.
Two 970s are at least $600, and two 290s are $540, so only $10 off of MSRP of the Fury. Do you have any benchmarks to compare, or should I go and find that information too?
Two 290s would spank this card, but you are right, the complications with power and CF are not worth it to me. I'd rather pay the premium for a top end single GPU.
Could be a pretty good option with frame rate control being available in the new 15.7 drivers. Couple that with a FreeSync monitor and you've got butter smooth gaming for the next 2 years.
That is an interesting prospect. The Hawaii GPUs get oddly efficient when downclocked to 900MHz, so running CF with my MG279Q could yield a very nice upgrade for a couple of years at around $240 for the 290.
Any chance to see the GTA 5 benchmarks?
“NVIDIA still has a fighting chance with the GeForce GTX 980 of course: it is $50 less expensive, includes a free game (currently at least) and has the advantage of more frequent and usually more reliable drivers behind it. But AMD has stepped up its game and released a high-end GPU that should make NVIDIA worry. And that’s great for everyone.”
While I agree that Nvidia has had more frequent driver releases, I don’t know if the claim that their drivers are more stable can be substantiated.
AMD drivers will also improve with time, while Nvidia's are not likely to squeeze out better performance since their card has been out for several months.
AMD fans always say this and history has shown it just really doesn’t happen.
That's false. GCN used to be slower than Kepler; right now, GCN has overtaken Kepler in performance. History has shown that, as with all new GPU architectures, performance does increase with time as the drivers mature.
Well, for now that has changed. The Fury dropped to $499 in price on Newegg (in the US) and it includes Star Wars Battlefront.
Fix the jump link to the benchmarks page on page 3; it is directing to AMD's 7990 benchmarks!
Well, due to the fact that you guys don't benchmark using resolutions that 90% of people still use, your benchmarks just got much less useful to me.
Strangely, Anandtech did the same thing. Maybe because they were rushing to get it out first? I am finally on a 1440p monitor so I am satisfied, but overall the Fury X struggled at 1080p compared to the 980 Ti, so I would be curious how this does.
I am hesitant to boost other sites, but since PCPER has no 1080p testing anymore, I can say that Sweclockers does.
And at 1080p, this card gets beaten by the 980 by a few percentage points consistently, and that's without any OC'ing. And we all know about Maxwell's massive overclocking capability.
So at 1440p or above, the Fury is a great card. At 1080p it gets beaten, and as you said, that is where most people still are (and will remain for the foreseeable future). Maxing out Witcher 3 on ultra at 1080p means that even the 980 Ti dips below 60 fps.
And there still is no card that is 4K-ready. Look at GTA V. Even Big Pascal probably won't breach 60 fps on ultra at 4K in GTA V.
Why would they include 1080p benchmarks? These cards aren't meant for 1080p players; it just wouldn't make sense to have a $500+ card and a $100 monitor. I think PCPer is targeting the right audience with these benchmarks.
1080p is still important for the 120/144Hz gamers.
How is it important when it can play all games smoothly at 1080p? Same with the GTX 980. It’s pointless who wins at that resolution when both can do the job.
“Smoothly” isn't 60 fps. It's 144 fps. If you disagree, your eyes either suck or you have never tried true 144 fps gaming.
P.S. Witcher 3 at ultra on 1080p can't even hit 60 fps on the mighty 980 Ti in most benchmarks I've seen, so forget hitting 100+ fps.
If AMD isn't competitive at 1080p on its high-end cards, it's not relevant at all.
Asynchronous monitors like GSYNC are smoother at lower frame rates, so keep that in mind for high-end gaming.
Aiming for 144FPS with good quality settings is always going to be problematic… cards will get better but games will also get more demanding.
For 1080p benchmarks use Techpowerup, though I don't quite agree with the scores, since you should use less AA at higher resolutions to have a more apples-to-apples comparison of visuals. Roughly speaking, 30 to 40% higher frame rates is what I'd estimate (i.e. same settings except 8xMSAA at 1080p vs 4xMSAA at 1440p).
http://www.techpowerup.com/reviews/MSI/GTX_980_Ti_Gaming/24.html
Agreed, 1080p is still very relevant and it's sad they don't test it anymore.
For those saying high end cards like this aren't meant for 1080p, I couldn't disagree more. If you like to crank up the eye candy with loads of AA, 1080p testing is relevant.
Before anyone starts saying “get a higher resolution panel”, a lot of people game in their living room. I have a 1080p projector at 130″ and will never go back to a tiny monitor. And don't even suggest a 4K one, as they are high $$.
Even my 980 SC can be taxed in some high end games at 1080p. Oh, and by the way, my 1080p monitor cost a good bit more than $100.
1080p is still relevant for me.
Great card, but not worth an upgrade from my Sapphire Vapor-X R9 290. I might have to wait for the next iteration… That, and the MG279Q I just bought extends longevity even more! 1440p max settings is pretty attainable at 35fps with the 290.
Like the performance, don't like the price. And as for the cooling solution, there is so much exhaust space at the back, yet this cooler uses next to none of it 🙁
Do wonder who will release a water cooled version or more blower versions…
I agree with the price being too high for me, but exhaust space at the back… Barely any heat ever comes out of those vents on an aftermarket GPU like this, especially considering most use vertically aligned heatsink fins. I think keeping the shorter PCB versus what Asus did was a smart move, because the fan blows the air right through the heatsink and it doesn't get trapped anywhere but in the case.
I thought the big selling point with these Fury cards was that they are small enough to fit in small cases. So having to extend the PCB and/or cooling takes away that advantage and just shows how poor heat/power is on AMD cards.
The Sapphire card looks really dumb with the extended cooling from the PCB.
I think it is a great idea. Straight-through fan cooling with no pointless PCB behind it… think again.
AMD priced this wrong… unless they have very little volume.
At $499, this card would completely overtake the GTX 980.
It would be hard even for an Nvidia zealot to get a GTX 980 knowing they paid the same amount of money for a slower card.
But at $550, AMD just left a huge market for Nvidia.
I can only guess this: AMD doesn't want to anger Nvidia.
Because Nvidia has something like 8 billion in cash in the bank, and they can price their cards so low that AMD would be losing money if they tried to compete…
The Fury is a nice card (super quiet, very low temperature, faster than a GTX 980), but still $50 overpriced.
Now, if the Fury card were $550 and included all 4096 shaders, AMD would totally OWN the gaming GPU market… (until Nvidia dropped the 980 Ti to $500).
AMD, destined to eat the scraps of Intel and Nvidia…
Well, the only reason it's priced as it is was the 980 Ti and its price point. AMD was forced to price this lower than they wanted to.
Look at benchmarks between the 980 Ti and the Fury and you can see why they had no choice.
The 290X has recently been retested and is now beating the 980 in most games. Keep an open mind, as AMD cards age well while the drivers mature. My 7970 GHz Edition Crossfire setup beats a 980 easily.
Why do no reviews take on that IQ issue with Nvidia's cards?
Just giving FPS isn't enough.
You must put everything at maximum IQ and find a way to get the same IQ on Nvidia's cards, because it is already known that Nvidia cripples their cards' IQ to get more FPS.
So those of us who intend to game at 4K, or buy an expensive card, must put IQ over FPS.
If the benchmark is all about FPS, then provide the video of each test on the review pages for us to see the difference, in order to choose fairly.
@Ryan-Shrout
Could you water cool the Fury, from its ~70C down to the ~50C of the Fury X, then check the wattage of the Fury?
From what we heard before the Fury launched, the water cooler was added to keep temps low and also help reduce wattage leakage.
Very good job on the review, liked it 🙂 What it tells me is that 4K gaming is going to take a lot of money, SLI or Crossfire. To me it looks like AMD should let its vendors/partners make an air cooled Fury X at $599.99 lol
Hi Ryan.
With everyone wanting “MOAR” and AMD pushing close to the limit of this architecture to compete, have you considered trying some under-volting? The Asus card on a few sites seems to be drawing 30 to 50 watts less than the Sapphire…
I agree. My big thing for the past 5+ years has been undervolting and finding a chip's sweet spot.
The 290X is a great example, as it shatters all the TDP results we see in reviews… if you don't mind running it at 5% to 10% lower clocks.
AMD has great chips, but doesn't seem to read tech site reviews to at least make one product that won't get destroyed by the press.
“It's too hot,” “uses too much power,” “not recommended.”
Then a few months later an Nvidia card comes out that uses more power and runs hotter, but it receives award after award…
No one seems to be looking at price/performance, where AMD used to be the clear leader.
With this Fury release, AMD has eliminated their only advantage… Who in the world is running this company to be so out of touch with simple concepts?
I totally agree. I feel like NO ONE at AMD really takes advice from normal everyday gamers when it comes to GPUs. AMD's only advantage was, as you said, price/performance, which they still have on the 290/290X. These new cards are great performers, but the price just isn't going to work for AMD.
I’m really surprised you didn’t use Witcher 3 for these tests. That game would be one of the best to test the mettle of these latest cards.
I think the obvious question, as the Sapphire version uses what appears to be a Fury X PCB, is whether it's possible to unlock the core with a BIOS flash. More than likely it's a Fury X with some shaders disabled via the BIOS, and seeing as the card has a dual BIOS, it shouldn't be that hard to test.