Pricing and Testing Process
As you may have seen on the first page, the new GeForce GTX 780 3GB will retail starting at $649. I will discuss the implications of this pricing structure on our last page, but first let's quickly see how it compares.
- NVIDIA GeForce GTX 780 3GB – $649
- NVIDIA GeForce GTX 680 2GB – $439
- AMD Radeon HD 7990 6GB – $999
- AMD Radeon HD 7970 3GB GHz Edition – $449
- NVIDIA GeForce GTX TITAN 6GB – $999
At $649, the new GTX 780 will cost you about $210 more than the GTX 680 and $200 more than the AMD Radeon HD 7970 GHz Edition. However, it finds itself $350 less than the GTX Titan. Interesting company…
Testing Configuration
The specifications for our testing system haven't changed much.
Test System Setup | |
CPU | Intel Core i7-3960X Sandy Bridge-E |
Motherboard | ASUS P9X79 Deluxe |
Memory | Corsair Dominator DDR3-1600 16GB |
Hard Drive | OCZ Agility 4 256GB SSD |
Sound Card | On-board |
Graphics Card | NVIDIA GeForce GTX 780 3GB, AMD Radeon HD 7970 GHz 3GB, NVIDIA GeForce GTX TITAN 6GB, NVIDIA GeForce GTX 680 2GB |
Graphics Drivers | AMD: 13.5 beta; NVIDIA: 320.18 |
Power Supply | Corsair AX1200i |
Operating System | Windows 8 Pro x64 |
What you should be watching for
- GTX 780 vs GTX 680 – How much faster is the new GK110 iteration compared to the GTX 680 whose name it inherits?
- GTX 780 vs GTX Titan – How much SLOWER is the GTX 780 when compared to the GTX Titan that will cost you a cool $350 more?
- GTX 780 vs HD 7970 GHz Edition – Obviously the HD 7970 GHz Edition will be outmatched here, but by how much?
Frame Rating: Our Testing Process
If you aren't familiar with it, you should probably do a little research into our testing methodology, as it is quite different from others you may see online. Rather than using FRAPS to measure frame rates or frame times, we use a secondary PC to capture the output from the tested graphics card directly, then post-process the resulting video to determine frame rates, frame times, frame variance, and much more.
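To make the capture idea concrete, here is a minimal sketch of the extraction arithmetic, assuming a 60 Hz capture of a 1080-line output; the constants and helper name are our own illustration, not the actual FCAT scripts. The overlay tags every rendered frame with a colored bar, so the number of captured scanlines showing a given color is proportional to how long that frame was on screen.

```python
# Illustrative sketch only (assumed 60 Hz capture, 1080-line frames), not the
# actual FCAT extraction code: a rendered frame's on-screen time is
# proportional to how many captured scanlines carry its overlay color.
CAPTURE_HZ = 60           # assumed sample rate of the capture card
LINES_PER_CAPTURE = 1080  # scanlines in one captured frame

def frame_time_ms(scanlines):
    """On-screen time of one rendered frame, given its total scanline count."""
    return scanlines / LINES_PER_CAPTURE * (1000.0 / CAPTURE_HZ)

# A frame filling all 1080 lines was visible for one capture interval (~16.7 ms);
# a 21-line sliver (~0.3 ms) is the kind of frame later classified as a runt.
print(frame_time_ms(1080), frame_time_ms(21))
```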
This amount of data can be pretty confusing if you attempt to read it without the proper background, but I strongly believe that the results we present paint a much more thorough picture of performance than other options. So please, read up on the full discussion of our Frame Rating methods before moving forward!
While there are literally dozens of files created for each “run” of benchmarks, there are several resulting graphs that FCAT produces, as well as several more that we generate with additional code of our own.
If you don't need the example graphs and explanations below, you can jump straight to the benchmark results now!
The PCPER FRAPS File
While the graphs above are produced by the default version of the scripts from NVIDIA, I have modified and added to them in a few ways to produce additional data for our readers. The first file shows a subset of the data from the RUN file above – the average frame rate over time as defined by FRAPS – though we combine all of the GPUs we are comparing into a single graph. This basically emulates the data we have been showing you for the past several years.
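As a rough sketch of how an FPS-over-time line can be derived from per-frame times (our own illustration, not the modified NVIDIA scripts), assuming frame times in milliseconds:

```python
# Rough illustration (not the actual scripts): bucket frame times into
# one-second windows and count the frames in each window to get FPS over time.
def fps_over_time(frame_times_ms):
    fps, count, elapsed = [], 0, 0.0
    for ft in frame_times_ms:
        count += 1
        elapsed += ft
        if elapsed >= 1000.0:            # a full second of wall time has passed
            fps.append(count)            # frames rendered during that second
            count, elapsed = 0, elapsed - 1000.0
    return fps

print(fps_over_time([16.7] * 120))  # two one-second buckets of ~60 FPS each
```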
The PCPER Observed FPS File
This graph takes a different subset of data points and plots them similarly to the FRAPS file above, but this time we are looking at the “observed” average frame rates, shown previously as the blue bars in the RUN file above. This takes out the dropped frames and runts, giving you the performance metric that actually matters – how many frames are being shown to the gamer to improve the animation sequence.
As you’ll see in our full results on the coming pages, a big difference between the FRAPS FPS graph and the Observed FPS graph indicates cases where the gamer is likely not getting the full benefit of the hardware investment in their PC.
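A hedged sketch of that filtering step, assuming we already have per-frame scanline counts from the capture; the 21-scanline runt cutoff here is our illustrative choice, not necessarily the exact threshold used in the scripts:

```python
# Sketch under assumptions: dropped frames occupy zero scanlines and frames
# below a small scanline cutoff are runts; both are excluded before the
# observed frame rate is recomputed.
RUNT_CUTOFF = 21  # illustrative threshold, in scanlines

def observed_frames(scanlines_per_frame):
    """Keep only frames large enough to contribute to the animation."""
    return [s for s in scanlines_per_frame if s >= RUNT_CUTOFF]
```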
The PLOT File
The primary file generated from the extracted data is a plot of calculated frame times, including runts. The numbers here represent the amount of time each frame appears on the screen for the user; a “thinner” line across the time span represents consistent frame times, which should produce the smoothest animation for the gamer. A “wider” line, or one with a lot of peaks and valleys, indicates much more variance and is likely caused by a lot of runts being displayed.
The RUN File
While the two graphs above show combined results for a set of cards being compared, the RUN file shows the results from a single card in a particular test. It is in this graph that you can see interesting data about runts, drops, average frame rate, and the actual frame rate of your gaming experience.
For tests that show no runts or drops, the data is pretty clean: the familiar frames-per-second-over-time graph that has become the standard for performance evaluation of graphics cards.
A test that does have runts and drops will look much different. The black bar labeled FRAPS indicates the average frame rate over time that traditional testing would show if you counted the drops and runts in the equation – as the FRAPS FPS measurement does. Any area in red is a dropped frame – the wider the red area, the more colored bars from our overlay were missing in the captured video file, indicating the gamer never saw those frames in any form.
The wide yellow area represents runts – the thin bands of color in our captured video that we have determined do not add to the animation of the image on the screen. The larger the yellow area, the more often those runts appear.
Finally, the blue line is the measured FPS over each second after removing the runts and drops. We are going to be calling this metric the “observed frame rate” as it measures the actual speed of the animation that the gamer experiences.
The PERcentile File
Scott introduced the idea of frame time percentiles months ago, but now that we have different data, using direct capture as opposed to FRAPS, the results might be even more telling. In this case, FCAT is showing percentiles not by frame time but by instantaneous FPS. This tells you the minimum frame rate that will appear on the screen for any given percentage of time during our benchmark run. The 50th percentile should be very close to the average frame rate of the benchmark, but as we creep closer to 100% we see how the frame rate is affected.
The closer this line is to perfectly flat the better, as that would mean we are running at a constant frame rate the entire time. A steep decline on the right-hand side tells us that frame times are varying more and more frequently and might indicate potential stutter in the animation.
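For readers who want the mechanics, here is a small sketch of that calculation based on the description above (our own code, not FCAT's): each frame time maps to an instantaneous FPS of 1000/t, and sorting from fastest to slowest tells you the minimum rate sustained over any given fraction of the run.

```python
# Sketch of the percentile-by-instantaneous-FPS idea described above.
def fps_percentiles(frame_times_ms, points=(50, 90, 95, 99)):
    inst_fps = sorted((1000.0 / t for t in frame_times_ms), reverse=True)
    n = len(inst_fps)
    # value at percentile p: the slowest rate within the fastest p% of frames
    return {p: inst_fps[min(n - 1, n * p // 100)] for p in points}

# Perfectly constant frame times produce a flat line: every percentile equal.
print(fps_percentiles([16.7] * 100))
```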
The PCPER Frame Time Variance File
Of all the data we are presenting, this is probably the one that needs the most discussion. In an attempt to create a new metric for gaming and graphics performance, I wanted to try to find a way to define stutter based on the data sets we had collected. As I mentioned earlier, we can define a single stutter as a variance level between t_game and t_display. This variance can be introduced in t_game, t_display, or on both levels. Since we can currently only reliably test the t_display rate, how can we create a definition of stutter that makes sense and that can be applied across multiple games and platforms?
We define a single frame variance as the difference between the current frame time and the previous frame time – how consistent the two frames presented to the gamer are. However, as I found in my testing, plotting this frame variance is nearly a perfect match to the data presented by the minimum FPS (PER) file created by FCAT. To be more specific, stutter is only perceived when there is a break from the previous animation frame rates.
Our current running theory for a stutter evaluation is this: find the current frame time variance by comparing the current frame time to the running average of the frame times of the previous 20 frames. Then, by sorting these variances and plotting them in percentile form, we can get an interesting look at potential stutter. Comparing the frame times to a running average rather than just to the previous frame should prevent false positives from legitimate performance peaks or valleys found when moving from a highly compute-intensive scene to a less demanding one.
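In code form, that theory looks something like the following sketch; this is our reading of the description above, not necessarily the exact implementation:

```python
# Compare each frame time to the running average of the previous 20 frames,
# then sort the differences so they can be plotted in percentile form.
WINDOW = 20

def frame_time_variance(frame_times_ms):
    variances = []
    for i in range(WINDOW, len(frame_times_ms)):
        running_avg = sum(frame_times_ms[i - WINDOW:i]) / WINDOW
        variances.append(abs(frame_times_ms[i] - running_avg))
    return sorted(variances)  # flatter and closer to 0 ms means less stutter
```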
While we are still trying to figure out if this is the best way to visualize stutter in a game, we have seen enough evidence in our gameplay testing, and by comparing the above graphic to other data generated through our Frame Rating system, to be reasonably confident in our assertions. So much so, in fact, that I am going to call this data the PCPER ISU – beer fans will appreciate the acronym – for International Stutter Units.
To compare these results, you want to see a line that is as close to the 0ms mark as possible, indicating very little frame variance when compared to the running average of previous frames. There will be some inevitable incline as we reach the 90th percentile and beyond, but that is expected with any gameplay sequence that varies from scene to scene. What we do not want to see is a sharp climb, which would indicate higher frame variance (ISU) and could be a sign of micro-stuttering and hitching problems in the game.
Yay for Mini-Titan! looking forward to live stream!
My golly, AMD CrossFire is still so screwed up. AMD cheats so badly – who can trust such a scamming con artist? No wonder they need to bundle 3 or 4 games with their video cards to even sell a few of them.
Worse yet, poor and blind AMD fanboys invade nearly every forum spewing an endless stream of lies and whining about nVidia prices when they can’t even buy a midrange AMD GPU, which explains why they go insane squealing about prices.
Get a freakin’ paper route or mow a few lawns, AMD crybabies.
No, AMD GPUs are not good. They are total crap compared to nVidia and nVidia’s massive software advantages: newly integrated game settings, upcoming streaming video to the handheld, PhysX, stable drivers, frame rate target, FXAA, far superior SLI, and you name it.
When you still get the AMD corner mouse cursor bug and GSODs – a unique, AMD-only sad crash I’ve had to put up with far too many times – WHY is the question.
I’ve had to waste about 20 days of my life helping idiots who bought AMD cards get the stupid things installed and running half-crapped; then they revert to turdville and the AMD fanboys squeal they didn’t do a thing to destroy stability.
OMG I hate them so much.
I should sue AMD for wasting human lives.
I think it’s odd that you are complaining about ranting AMD fanboys, but then you go off and become a ranting Nvidia fanboy.
I’m not a fanboy of either – I’m only loyal to the almighty dollar (or dollar/performance ratio).
That being said, your argument about how bad AMD is suggests you did not read the article. The HD 7970 GHz Edition is still the best bang for the buck for a single video card (no CrossFire or SLI) and also includes four AAA titles. That’s worth a lot to most people who can’t afford $1000+ of video cards.
the 7970 doesn’t even run half the games properly
How do you have the energy to type out all that hate? I’ve been a loyal Nvidia user for three cards, but couldn’t pass up a 7950 for $229 CAD. I was worried I’d regret it, but it’s posting some serious numbers. Overclocked, it scored 3360 in 3DMark 11 and averaged 42fps in Unigine Heaven. That betters the 670 and almost matches the 680. Considering those cards are going for $300+, I’d say I got a good deal.
You don’t have to pick one chip for life. You just have to try and find the best deal out there for what you’re willing to spend. I won this round with AMD, maybe next time I’ll go back to Nvidia, who knows?
You need to calm the fuck down and just buy what you want and stop shitting on whatever is competing with what you bought. Life is too short to be so angry.
If I just bought a Titan I’d be pissed!
Honestly, if that was the case, then you would have bought the Titan for the wrong reasons.
Why, the Titan is way more powerful and, dollar for dollar, gets more bang for the buck!
I don’t like the price, but it feels like they are forced to $650 because the Titan is $1000 and next year’s 880 will be more powerful than the Titan, which will still cost $1000. I think the ball is in AMD’s court right now, and I think Nvidia won’t budge on price on anything unless AMD steps up big time.
You are exactly right. And this is just what happened in the past with AMD’s video cards. There was a gap between the GTX 580 and GTX 680 when AMD had the HD 6970 (or, for the smart people, the unlocked HD 6950), and for a while AMD’s prices were way too high because Nvidia had nothing to offer.
I was really ticked when the Titan was released, as I had just bought a factory-OC GTX 680 4GB. I would have sunk that money into the Titan instead. But… it’s a good thing I didn’t sink my $ into a Titan. With the federal furlough going on until the end of the year, I’m having to do some belt tightening. Hopefully with the new fiscal year everything will be back to normal. I’ll probably wait until November, unload my GTX 680 for about 30%-40% off what I paid for it, and re-invest my money in a factory-OC GTX 780. Thank you for the news release; I knew it was coming, but it’s nice to see the performance numbers as well! Great job on the review!
Good review guys. Looking forward to your review video!
I’m confused about which drivers you were using for what. According to your test system page it’s 320.18, but you repeatedly mention that the data for the non-780 cards was gathered using older drivers.
If that is the case, you should specify which cards are using which drivers, for clarity if nothing else (were they just 320.14?).
Really, I just keep looking at the FC3 graph where the 780 is shown outperforming the TITAN and going “wtf?”.
In FC3, that's correct – and only because this latest beta driver we were given has special improvements for that game. We didn't have a chance to test Titan or 680 before publication.
Great review. Can we expect some Tri-SLI numbers from you guys?
Only about 3 hrs. until we see Ryan!
It’s getting to the point that we may need the GPUs to be in a separate box with a dedicated power supply/fans, eh?
Nvidiaaaaaa… what is with these outrageous prices? Now when Maxwell comes out, I guess we should be expecting prices in the $2K range. I’m losing my grip with Nvidia, even being an Nvidia graphics card owner. Sigh. AMD, I wonder what will be on your side of things in the coming years.
So I’m looking at Skyrim sGPU, and on the frametime graph I’m seeing those nasty spikes on the GeForces that look like they’re going off the charts, and then I look at the percentile charts, which don’t seem to reflect these spikes. I would think those are the sort of spikes that are noticeable hitches during gameplay, no? Are the percentile charts averaging out the spikes, or what? Am I missing something? Wouldn’t be the first time lol
It would be nice if you’d add bar graphs for ease of reading. I skipped to the end after the first benchmark just to see the conclusion. You need min FPS at least in a chart (or min+avg), as I couldn’t care less about max, which affects nothing for my game. It takes seconds to look at bar graphs; it takes minutes and is frustrating with a bunch of lines. I’m not saying remove those (maybe some like them), but you need a chart where I can quickly see who had the best min/avg.
The current way you show the benchmarks is basically a big mess IMHO. I’m not talking about frame times here; you clearly need lines to show those. I’m talking about the actual FPS charts; those need to be done as bar graphs like most other sites do. They are very quick to read and give a quick picture of who is leading X game. You lose hits from people like me every review. Sure, I can decipher all the lines, but I don’t have 10 minutes to do it for each chart. Instead I read the first page, the temps/noise page, and the conclusion page. The charts are all just a PITA to me 🙁
Don’t get me wrong, I love the site, and when I have time I come back at some point (usually) to read more. It’s just easier to go elsewhere even if I don’t really want to.
I don’t think you understand the FPS graphs, which are pretty easy to read. Each one doesn’t show a single number, but a chart of where the FPS was at each point in the 60-second benchmark. If you look at the lowest point in the graph, that is the minimum; the highest point is the maximum. This method gives you a clearer picture of what is actually taking place, especially with minimums. In a bar graph, you don’t know if the minimum is a small drop at the load of a game or if the game gets near that point often. This method makes it clear.
Can you please tell me the exact settings used in your Heaven benchmark?
The Extreme results do not match up with results I’ve seen if that’s 1920×1080 w/ 8xAA.
I just ordered the 780 and I’m kind of geeking out waiting for it. Lol, just kidding. My question is: everyone says this card is not good for a single 1920×1080 monitor, but what about people like me who have the ASUS 144Hz monitor and want to play games at 120fps or play games in 3D? Wouldn’t this card be perfect for that?
Yeah, I am with you. I’m currently ordering a 780 but debating whether to get a high-res monitor at 60Hz or a 1080p at 120Hz; it’s the last thing I need to complete my build.
Is it faster than all but Titan? Yes. Does it overclock well? Yes, easily going beyond Titan in many games, according to HardOCP. Does it use less power than the 7970GE, and is it quieter? Yes. Is it 35% less than Titan? Yes.
Should it be $60 less? Yes, and it will be soon. Do most high-end buyers give a %@$#$% about 60 bucks? I doubt it.
I don’t think you used a particularly good benchmark run for Skyrim. It takes place primarily in Whiterun, which is more CPU-bound in my experience. A better choice for GPU benchmarking would be the forest areas around either Riften or, preferably, Falkreath. These are the most GPU-bound areas of the game.