GK110 Gets a Lower Price Point
NVIDIA has a new SKU to offer that takes the GK110 GPU, disables some of its cores, and carries a hefty $650 price tag.
If you want to ask us some questions about the GTX 780 or our review, join us for a LIVE STREAM at 2pm EDT / 11am PDT on our LIVE page.
When NVIDIA released the GeForce GTX Titan in February there was a kind of collective gasp across the enthusiast base. Half of that intake of air was from people amazed at the performance they were seeing from a single GPU graphics card powered by the GK110 chip. The other half was from people aghast at the $1,000 price point NVIDIA launched it at. The GTX Titan was the fastest single GPU card in the world, without any debate, but with it came a cost we hadn't seen in some time. Even with the debate between it, the GTX 690 and the HD 7990, the Titan was likely my favorite GPU, cost concerns aside.
Today we see an extension of the GK110: NVIDIA has cut the chip back some and released a new card. The GeForce GTX 780 3GB is based on the same chip as the GTX Titan but with additional SMX units disabled, a lower CUDA core count and less memory. As you'll soon see, the performance delta between it and the GTX 680 and Radeon HD 7970 GHz Edition is pretty impressive. The $650 price tag though – maybe not.
We held a live stream the day this review launched at https://pcper.com/live. You can see the replay that goes over our benchmark results and thoughts on the GTX 780 below.
The GeForce GTX 780 – A Cut Down GK110
As I mentioned above, the GTX 780 is a pared-down GK110 GPU and for more information on that particular architecture change, you should really take a look at my original GTX Titan launch article from February. There is a lot more that is different on this part compared to GK104 than simple shader counts, but for gamers most of the focus will rest there.
The chip itself is a 7.1 billion transistor beast, though a card with the GTX 780 label actually puts much less of that silicon to use. Below you will find a couple of block diagrams that represent the reduced functionality of the GTX 780 versus the GTX Titan:
While the GTX Titan sported 2,688 CUDA cores and 14 SMXs, the GTX 780 will have 12 SMX units for a total CUDA core count of 2,304. NVIDIA has two different options for chip configurations: one disables an entire GPC unit of the GK110 chip while the other disables a single SMX in each of three different GPCs. Performance differences between the two should be negligible, I'm told.
Clock speeds are actually a bit higher on the GTX 780 running at 863 MHz base and 900 MHz boost. The memory is still running at 6.0 GHz on a 384-bit memory bus but the frame buffer itself drops from 6GB to 3GB.
The only other major hardware change is a drop from 224 to 192 texture units to align with the drop in shader performance and count.
Still, compared to the GTX 680 the new GTX 780 offers noticeable performance increases. The GTX 780 has 50% more CUDA cores, a 50% wider memory bus and 50% more texture units. This comes at the expense of clock speed, which is 16% lower on the new GTX 780. Obviously the GTX 780 will have much higher performance than the reference GTX 680 offerings and it will be interesting to see how close the GTX 780 gets to the GTX Titan.
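As a quick sanity check on those ratios, here is a back-of-the-envelope sketch in Python. The GTX 680 figures (1,536 CUDA cores, 1,006 MHz base clock) are reference specs rather than numbers from this article, and peak shader throughput is approximated the usual way as cores × 2 FLOPs (one FMA per clock) × frequency:

```python
# Rough theoretical shader throughput: cores * 2 FLOPs (FMA) * clock.
def peak_tflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

gtx_780 = peak_tflops(2304, 863)   # ~3.98 TFLOPS at base clock
gtx_680 = peak_tflops(1536, 1006)  # ~3.09 TFLOPS at base clock
print(f"GTX 780: {gtx_780:.2f} TFLOPS, GTX 680: {gtx_680:.2f} TFLOPS")
print(f"Theoretical shader uplift: {gtx_780 / gtx_680 - 1:.0%}")  # ~29%
```

In other words, the 50% wider shader array nets closer to a 29% theoretical gain once the lower clocks are factored in, before memory bandwidth and real-world scaling enter the picture.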
If you are one of the consumers that follows a two-year upgrade cycle, NVIDIA claims you will see roughly a 70% increase in performance relative to the GTX 580, its Fermi-based flagship.
Even though AMD has been getting all of the spotlight with its Never Settle game bundles, NVIDIA wants to assure gamers that they are doing what matters most – addressing performance issues with fresh drivers on major game releases.
With the public release of the 320 driver series NVIDIA has added some more performance boosts in games like DiRT: Showdown and Tomb Raider (both AMD Gaming Evolved titles) while also increasing exposure of the GeForce Experience and improving frame time variance even on single-GPU configurations.
New to the GFE application with this (upcoming) release is a feature called "ShadowPlay" that will perform real-time, hardware accelerated capture of your gameplay sessions using the GPU-based H.264 encoder. This will be high quality footage and it will keep as much as a 20 minute buffer so you can catch any awesome events without having to predict them! This is a pretty kick ass feature and will likely promote a lot of YouTube based video sharing – though more integration with a service like Twitch.tv would be cool too.
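To make the rolling buffer idea concrete, here is a hypothetical sketch in Python of the general technique – a fixed-length history of encoded frames where the oldest fall off as new ones arrive. Nothing here reflects NVIDIA's actual implementation, which does its encoding on the GPU's H.264 block:

```python
from collections import deque

class ReplayBuffer:
    """Rolling capture buffer that keeps only the most recent footage.

    Hypothetical model of the ShadowPlay concept: a bounded deque means
    frames older than the window are discarded automatically.
    """
    def __init__(self, window_seconds=20 * 60, fps=60):
        self.frames = deque(maxlen=window_seconds * fps)

    def on_frame(self, encoded_frame):
        # Called once per captured frame; old frames fall off the front.
        self.frames.append(encoded_frame)

    def save_clip(self):
        # Dump the buffer after something awesome happens -- you get up to
        # the last 20 minutes without having had to start recording first.
        return list(self.frames)
```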
Just as we saw with the GTX Titan, GPU Boost gets some additional functionality and flexibility with the GTX 780. This includes better overvolting, display overclocking and higher clocks. NVIDIA has even enabled a new option for software like Precision X to show "fail states" for overclocking with GPU Boost 2.0. When you are hitting a voltage limit, you will get an indicator telling you that is the bottleneck of performance, allowing you to adjust that setting, if you can, to get more performance. If power is the bottleneck instead, you can adjust those sliders as necessary – all in an attempt to get the highest clocks from your GPU.
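Conceptually, the fail-state readout is just a priority check across the card's limits. The sketch below models that logic only; none of these names come from NVIDIA's or EVGA's actual APIs:

```python
# Hypothetical model of GPU Boost 2.0 "fail states": report which limit is
# currently capping the boost clock so the user knows which slider to move.
def boost_limiter(voltage_mv, voltage_limit_mv,
                  power_w, power_limit_w,
                  temp_c, temp_limit_c):
    if voltage_mv >= voltage_limit_mv:
        return "voltage"      # raise the overvoltage setting, if the card allows it
    if power_w >= power_limit_w:
        return "power"        # raise the power target slider
    if temp_c >= temp_limit_c:
        return "temperature"  # raise the temp target or improve cooling
    return None               # headroom remains; the GPU can boost higher

print(boost_limiter(1162, 1162, 235, 250, 78, 80))  # -> "voltage"
```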
With this release NVIDIA is also reworking its fan controller to be more "even toned" in regards to fan speeds. This should prevent some of the oscillating sounds you might hear (whirr, whirr) during changing load patterns in games.
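The classic way to get that behavior is to low-pass filter the fan's target speed so brief load spikes don't whip the RPM up and down. NVIDIA hasn't published its controller design, so treat this as a generic illustration of the technique rather than their algorithm:

```python
# Exponential smoothing: each tick, move only a small fraction of the way
# toward the requested fan speed instead of jumping straight to it.
def smooth_fan_speed(current_rpm, target_rpm, alpha=0.05):
    return current_rpm + alpha * (target_rpm - current_rpm)

rpm = 1200.0
for target in [2400, 2400, 1200, 2400, 1200, 1200]:  # noisy load pattern
    rpm = smooth_fan_speed(rpm, target)
    print(round(rpm))  # ramps gradually instead of oscillating (whirr, whirr)
```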
The GeForce GTX 780 3GB Graphics Card
Look familiar?
The GeForce GTX 780 shares the same design as the GTX Titan and the GTX 690 before it – a clean silver and black design with the only color accent being the green of the GeForce GTX logo up top. I was a fan of the design originally and I continue to be impressed by it today.
Speaking of fans, the GTX 780 also sports the same style of cooler as the GTX Titan, which means QUIET operation – a big plus for NVIDIA over AMD on these recent reference releases.
The GTX 780 requires one 6-pin and one 8-pin power connector, just like the GTX Titan.
External connections remain the same as well: a pair of dual-link DVI connections, one full size HDMI port and a standard DisplayPort…port.
SLI is supported with up to three GPUs – don't believe reports of quad SLI options!
The GeForce GTX logo up top is LED backlit and is controllable via a few not-so-great pieces of software. If you want pulsing GPUs, NVIDIA is the way to go!
Yay for Mini-Titan! Looking forward to the live stream!
My golly, AMD Crossfire is still so screwed up. AMD cheats so badly – who can trust such a scamming con artist? No wonder they need to bundle 3 or 4 games with their video cards to even sell a few of them.
Worse yet, poor and blind AMD fanboys invade nearly every forum spewing an endless stream of lies and whining about nVidia prices when they can’t even buy a midrange AMD GPU, which explains why they go insane squealing about prices.
Get a freakin’ paper route or mow a few lawns, AMD crybabies.
No, AMD GPUs are not good – they are total crap compared to nVidia and nVidia’s massive software advantages: newly integrated game settings, upcoming streaming video to the handheld, PhysX, stable drivers, frame rate target, FXAA, far superior SLI and you name it.
When you still get the AMD corner mouse cursor bug and GSODs – a uniquely AMD crash I’ve had to put up with far too many times – WHY is the question.
I’ve had to waste about 20 days of my life helping idiots who bought AMD cards get the stupid things installed and running half crapped before they revert to turdville, and the AMD fanboys squeal that they didn’t do a thing to destroy stability.
OMG I hate them so much.
I should sue AMD for wasting human lives.
I think it’s odd that you are complaining about ranting AMD fanboys but then you go off and become a ranting Nvidia fanboy.
I’m not a fanboy of either – I’m only loyal to the almighty Dollar (or dollar/performance ratio).
That being said, your argument about how bad AMD is suggests you did not read the article. The HD 7970 GHz Edition is still the best bang for the buck for a single video card (no CrossFire or SLI) and also includes 4 AAA games. That’s worth a lot to most people who can’t afford $1,000+ of video cards.
the 7970 doesn’t even run half the games properly
How do you have the energy to type out all that hate? I’ve been a loyal Nvidia user for three cards, but couldn’t pass up a 7950 for $229 CAD. I was worried I’d regret it, but it’s posting some serious numbers. Overclocked, it scored 3,360 in 3DMark 11 and averaged 42 fps in Unigine Heaven. That betters the 670 and almost matches the 680. Considering those cards are going for $300+, I’d say I got a good deal.
You don’t have to pick one chip for life. You just have to try and find the best deal out there for what you’re willing to spend. I won this round with AMD, maybe next time I’ll go back to Nvidia, who knows?
You need to calm the fuck down and just buy what you want and stop shitting on whatever is competing with what you bought. Life is too short to be so angry.
If I just bought a Titan I’d be pissed!
Honestly, if that was the case, then you would have bought the Titan for the wrong reasons.
Why, the Titan is way more powerful, and dollar per dollar gets more bang for the buck!
I don’t like the price but it feels like they are forced to $650 because the Titan is $1,000 and next year’s 880 will be more powerful than the Titan, which will still cost $1,000. I think the ball is in AMD’s court right now and I think Nvidia won’t budge on price on anything unless AMD steps up big time.
You are exactly right. And this is exactly what happened in the past with AMD’s video cards. There was a gap between the GTX 580 and GTX 680 when AMD had the HD6970 (or for those smart people, the unlocked HD6950) and for a while AMD’s prices were way too high because Nvidia had nothing to offer.
I was really ticked when the Titan was released as I had just bought a factory OC GTX 680 4GB. I would have sunk that money into the Titan instead. But… it’s a good thing I didn’t sink my $ into a Titan. With the federal furlough going on until the end of the year, I’m having to do some belt tightening. Hopefully with the new fiscal year everything will be back to normal. I’ll probably wait until November, unload my GTX 680 for about 30%-40% off of what I paid for it and re-invest my money into a factory OC GTX 780. Thank you for the news release – I knew it was coming but it’s nice to see the performance numbers as well! Great job on the review!
Good review, guys. Looking forward to your review video!
I’m confused about which drivers you were using for what. According to your test system page it’s 320.18, but you make repeated references to the data for the non-780 cards being generated with older drivers.
If that is the case you should specify which cards are using which drivers, for clarity if nothing else (were they just 320.14?).
Really, I just keep looking at the FC3 graph where the 780 is shown outperforming the TITAN and going “wtf?”.
In FC3, that's correct – and only because this latest beta driver we were given has special improvements for that game. We didn't have a chance to test Titan or 680 before publication.
Great review. Can we expect some Tri-SLI numbers from you guys?
Only about 3 hrs. until we see Ryan!
It’s getting to the point that we may need the GPUs to be in a separate box with a dedicated power supply/fans, eh?
Nvidiaaaaaa… what is with these outrageous prices? Now when Maxwell comes out I guess we should be expecting prices in the $2K range. I’m losing my grip with Nvidia even as an Nvidia graphics card owner. Sigh. AMD, I wonder what will be on your side of things in the coming years.
So I’m looking at Skyrim sGPU and on the frametime graph I’m seeing those nasty spikes on the GeForces that look like they are going off the charts, and then I look at the percentile charts, which don’t seem to reflect these spikes. I would think those are the sort of spikes that are noticeable hitches during gameplay, no? Are the percentile charts averaging out the spikes or what? Am I missing something? Wouldn’t be the first time lol
It would be nice if you’d add a bar graph chart for ease of reading. I skipped to the end after the first benchmark just to see the conclusion. You need min FPS at least in a chart (or min+avg), as I couldn’t care less about max, which affects nothing for my game. It takes seconds to look at bar graphs; it takes minutes and is frustrating with a bunch of lines. I’m not saying remove those (maybe some like them), but you need a chart where I can quickly see who had the best min/avg.
The current way you show the benchmarks is basically a big mess IMHO. I’m not talking about frametimes here – you clearly need lines to show those. I’m talking about the actual FPS charts; those need to be done as bar graphs like most other sites do. They are very quick to read and give a quick picture of who is leading in X game. You lose hits from people like me every review. Sure, I can decipher all the lines, but I don’t have 10 minutes to do it for each chart. Instead I read the first page, the temps/noise page and the conclusion page. The charts are all just a PITA to me 🙁
Don’t get me wrong, I love the site, and when I have time I come back at some point (usually) to read more. It’s just easier to go elsewhere even if I don’t really want to.
I don’t think you understand the graphs for FPS, which are pretty easy to read. It doesn’t show a single number, but a plot of where the FPS was at each point in the 60 second benchmark. If you look at the lowest point in the graph, that is the minimum; if you look at the highest point, that is the maximum. This method gives you a clearer picture of what is actually taking place, especially with minimums. In the bar graph charts, you don’t know if the minimum is a small drop at the load of a game, or if the game gets near that point often. This method makes it clear.
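A toy example in Python (the numbers are made up, purely to illustrate) shows why the full trace matters – a single 100 ms hitch barely moves the average but is obvious at the minimum and in the plot itself:

```python
# One second of 60 Hz frames with a single 100 ms hitch in the middle.
frame_times_ms = [16.7] * 59 + [100.0]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)
max_fps = 1000 / min(frame_times_ms)

print(f"avg {avg_fps:.1f} fps, min {min_fps:.1f} fps, max {max_fps:.1f} fps")
# avg ~55 fps looks fine, but min (10 fps) reveals the hitch -- and the full
# trace also shows *when* and *how often* it happens, which a bar cannot.
```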
Can you please tell me the exact settings used in your Heaven benchmark?
The Extreme preset does not match up with results I’ve seen if that’s 1920×1080 w/ 8xAA.
I just ordered the 780 and I’m kind of geeking out waiting for it. Lol, just kidding. My question is: everyone says this card is not good for a single 1920×1080 monitor, but what about people like me who have the ASUS 144 Hz monitor and want to play games at 120 fps or play games in 3D? Wouldn’t this card be perfect for that?
Yeah, I am with you – currently ordering a 780 but debating whether to get a high-res monitor at 60 Hz or a 1080p at 120; it’s the last thing I need to complete my build.
Is it faster than all but Titan? Yes. Does it overclock well? Yes, easily going beyond Titan in many games, according to HardOCP. Does it use less power than the 7970 GHz Edition? Yes. Is it quieter? Yes. Is it 35% less than Titan? Yes.
Should it be 60 dollars less? Yes, and it will be soon. Do most high-end buyers give a %@$#$% about 60 bucks? I doubt it.
I don’t think you used a particularly good benchmark run for Skyrim. It takes place primarily in Whiterun, which is more CPU bound in my experience. A better choice for GPU benchmarking would be the forest areas either around Riften or, preferably, Falkreath. These are the most GPU bound areas of the game.