The new GTX 980 Ti is here and, as we found out, it can match the performance of the TITAN X at a much lower price of $650!
When NVIDIA launched the GeForce GTX Titan X back in March of this year, I knew immediately that the GTX 980 Ti would be close behind. The Titan X was so different from the GTX 980 when it came to pricing and memory capacity (12GB, really??) that NVIDIA had set up the perfect gap in which to place the newly minted GTX 980 Ti. Today we get to take the wraps off that new graphics card, and I think you'll be impressed with what you find, especially when you compare its value to the Titan X.
Based on the same Maxwell architecture and GM200 GPU, with some minor changes to GPU core count, memory size and boost speeds, the GTX 980 Ti finds itself in a unique spot in the GeForce lineup. Performance-wise it's basically identical in real-world game testing to the GTX Titan X, yet it is priced $350 less than that 12GB behemoth. Couple that with a modest $50 price drop on the GTX 980 cards and you have all the markers of an enthusiast graphics card that will sell as well as any we have seen in recent generations.
The devil is in all the other details, of course. AMD has its own plans for this summer, but the Radeon R9 290X is still sitting there at just $329, roughly half the price of the GTX 980 Ti. NVIDIA seems to be pricing its own GPUs as if it isn't even concerned with what AMD and the Radeon brand are doing. That could be dangerous if it goes on too long, but for today, can the R9 290X and its aging Hawaii XT GPU put up enough of a fight to make a value case to gamers on the fence?
Will the GeForce GTX 980 Ti be the next high-end GPU to make a splash in the market, or will it make a thud at the bottom of the GPU gene pool? Let's dive into it, shall we?
GTX 980 Ti Specifications
If my introduction didn't give the whole story away, this table likely will. The GeForce GTX 980 Ti is almost indistinguishable from the GTX Titan X anywhere it matters.
| | GTX 980 Ti | TITAN X | GTX 980 | TITAN Black | R9 290X |
|---|---|---|---|---|---|
| GPU | GM200 | GM200 | GM204 | GK110 | Hawaii XT |
| GPU Cores | 2816 | 3072 | 2048 | 2880 | 2816 |
| Rated Clock | 1000 MHz | 1000 MHz | 1126 MHz | 889 MHz | 1000 MHz |
| Texture Units | 176 | 192 | 128 | 240 | 176 |
| ROP Units | 96 | 96 | 64 | 48 | 64 |
| Memory | 6GB | 12GB | 4GB | 6GB | 4GB |
| Memory Clock | 7000 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 5000 MHz |
| Memory Interface | 384-bit | 384-bit | 256-bit | 384-bit | 512-bit |
| Memory Bandwidth | 336 GB/s | 336 GB/s | 224 GB/s | 336 GB/s | 320 GB/s |
| TDP | 250 watts | 250 watts | 165 watts | 250 watts | 290 watts |
| Peak Compute | 5.63 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 5.1 TFLOPS | 5.63 TFLOPS |
| Transistor Count | 8.0B | 8.0B | 5.2B | 7.1B | 6.2B |
| Process Tech | 28nm | 28nm | 28nm | 28nm | 28nm |
| MSRP (current) | $649 | $999 | $499 | $999 | $329 |
Let's start with the CUDA core count, where the GTX 980 Ti hits 2,816 cores, 256 fewer than the GTX Titan X, a drop of about 8%. Along with that comes a drop in texture units from 192 to 176. The ROP count and 384-bit memory bus remain identical though, providing the same memory bandwidth as NVIDIA's $999 flagship card.
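The derived figures in the table above fall out of simple arithmetic: peak FP32 compute is core count × 2 FLOPs per clock (one fused multiply-add) × base clock, and memory bandwidth is bus width in bytes × effective memory transfer rate. A minimal sanity-check sketch (the function names are mine, not from any vendor tool):

```python
def peak_tflops(cores: int, clock_mhz: int) -> float:
    """FP32 peak in TFLOPS: cores * 2 FLOPs per clock (one FMA) * clock."""
    return cores * 2 * clock_mhz * 1e6 / 1e12

def bandwidth_gbps(bus_bits: int, mem_clock_mhz: int) -> float:
    """Memory bandwidth in GB/s: (bus width / 8) bytes * effective rate."""
    return (bus_bits / 8) * mem_clock_mhz * 1e6 / 1e9

print(peak_tflops(2816, 1000))    # GTX 980 Ti: 5.632 TFLOPS
print(peak_tflops(3072, 1000))    # TITAN X:    6.144 TFLOPS
print(bandwidth_gbps(384, 7000))  # 980 Ti / TITAN X: 336.0 GB/s
print(bandwidth_gbps(512, 5000))  # R9 290X:          320.0 GB/s
```

Note the 290X's wider 512-bit bus can't fully make up for its slower 5 GHz effective memory clock, which is how it lands just under the 384-bit GM200 cards.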
Block Diagram of the GM200 inside the GTX 980 Ti - Two SMMs disabled
You will notice a drop from 12GB of memory to 6GB for this card, which is not surprising. In all honesty, for gamers the 12GB on the Titan X was always fool's gold, never affecting gameplay even in the most extreme configurations; even 4K Surround (triple monitors) would likely be unable to utilize it. At 6GB, the GTX 980 Ti has the highest default memory capacity of any reference card other than the Titan X. For even the most demanding games at the most demanding resolutions, 6GB of memory will be just fine. Our 4K test settings for Grand Theft Auto V, for example, are estimated to use about 4.9GB, and that's the highest of any current title we test. And with all the rumors circulating about AMD's Fiji being stuck at 4GB because of its HBM implementation, expect NVIDIA to tout and advertise that advantage in any way it can.
Even though the rated base clock of the GTX 980 Ti is the same as the Titan X at 1000 MHz, in my experience with the two graphics cards the 980 Ti does seem to hit 40-70 MHz higher Boost clocks over extended gaming periods. As you'll soon see in our benchmarking, this is likely the biggest reason that despite the supposed disadvantage in CUDA core count, the GTX 980 Ti and the Titan X are basically interchangeable in gaming capability. I also suspect that some changes to the GPU power delivery help that a bit, as indicated by a 92C (versus 91C) rated thermal limit for the GTX 980 Ti.
In terms of TDP, the GTX 980 Ti is rated at 250 watts, the same as the GTX Titan X and 40 watts lower than the R9 290X. In reality, through our direct graphics card power consumption testing, the GeForce cards err on the side of caution while the AMD Radeon R9 series swings the other direction, often drawing more power than the TDP would indicate. The GTX 980 Ti uses the exact same cooler design and even the same fan curve (according to NVIDIA), so sound levels and cooling capability should be identical, yet again, to the Titan X.
What about memory segmentation? Does the 980 Ti have its memory split into a faster and a slower pool, just like the 970 with its 3.5GB + 0.5GB?
There is no problem; all 6GB are accessible at the same speed. It would be kind of stupid to think that would happen twice in a row without being disclosed.
Saying that the need for more than 6GB of VRAM is a ways off, when GTA V uses 5GB, seems kind of optimistic.
Also, I didn't think the 295X2 performed that well. Sure, it's a power hog and multi-GPU setups have… spotty performance, but look at those averages!
I suspect that, going forward, memory consumption will actually go down, not up. There are a lot of new features that could allow much more efficient management of resources. The 295X2 does quite well, in my opinion. Right now I would say it is still the best bang for your buck in that price range, unless you play an older game or one that has not been well optimized for multiple GPUs yet. Anyway, who would buy either a 295X2 or a 980 Ti right now with AMD's new part coming soon?
‘going forward, memory consumption will go down’
As I said, it seems optimistic.
DX12 is set up to use a larger number of smaller draw calls, which should reduce the amount of memory needed at any single instant. Also, a lot is being done with tiling and compression to avoid loading unnecessary resources. This also ties into GPUs making use of virtual memory and unified memory, which lets the GPU load more precisely what is needed (the actual working set).
As AMD has stated, current implementations are very wasteful with memory. There may be some situations where the 4GB on AMD's upcoming parts is a bottleneck, but it is a trade-off they had to make. The 980 Ti's performance is almost identical to the Titan X's, even with less hardware but the same bandwidth, which implies it is bandwidth-limited. Fiji has even more powerful hardware, so it would be held back severely by GDDR5. AMD has probably done a lot of work to decrease memory usage, but there may be some current games that push the limits at extreme settings. I don't think this will be much of a problem, though.
Do you think the 980 Ti would have had as good performance as it does, if the 390X from AMD wasn’t just around the corner?
I think the max specs of this architecture (this card) were worked out long before the chips were even fabricated, BUT I am super surprised they didn't wait until AFTER the new AMD flagship. Makes me worry that they (NVIDIA) already know it (AMD's card) can't compete.
I am wondering if AMD will manage to launch more than one HBM part. I would expect a full part and at least one salvaged part with cut down units and/or memory interface. It is possible that the cut down part would compete with the 980 Ti and the full part will be higher priced.
I have to wonder if they had originally planned on a narrower memory interface or something. The fact that they are essentially selling a Titan X at a much cheaper price seems to imply that AMD may have an incredible product. This release sounds like NVIDIA trying to move as many as possible before Fiji comes out. Contrary to the "I'm going to buy this now" trolls, it would be pretty stupid to buy now with a major AMD launch so soon.
I feel like the Titan X has been used very effectively as premium decoy pricing. Don't get me wrong, I'm not anti-NVIDIA and I will probably pick up this card myself, but comparing this card value-wise to the Titan X seems to be falling into the trap a little. To me the Titan X is just an insane reference point.
I still don't understand how NVIDIA can sell the flawed 970 at that price. They should've replaced it with the 980 and put the Ti above that. They owe us that for mucking up the 970.
Um, it's a card that MATCHES AMD's 290X and sells for the same price. And if it weren't for the GTX 970's price, the 290X would sell for $450-500 right now. There is nothing wrong with the 970; just because the specs were revised doesn't mean it's magically a slower card.
Didn't you read the PCPer review? It had issues in 1 of the 2 games they tested. At 1440p even SLI had issues.
You mean the one test where they used out-of-the-ordinary resolution scaling to push the card to its absolute limit (with acceptable results)? Or the test of another game that had numerous issues with VRAM scaling on various NVIDIA and AMD cards and required a .cfg edit to fix (which they likely never knew about)?
Nice review. I have a question.
I have a 4790K @ 4.6GHz, 2×970 in SLI @ 1530/7800, 16GB of RAM and a 1440p monitor.
Do you think it is worthwhile to swap the 2×970 for a 980 Ti?
Maybe you could extend the review with a 2×970 SLI comparison; it would be a good one, especially since the price ceiling is about the same.
This is NVIDIA's way of saying thanks for the cash, Titan X owners. Ouch.
ROFL! Couldn’t have said it better myself!
AMD once again demonstrates that it is best at sucking down the power grid and contributing to Global Warming.
Is it REALLY 6GB this time, or is there some funny business going on like with the 970???
Sad how AMD fans keep going back to that any chance they get, like they expect it to happen again.
Sadder is that you do the exact same thing but somehow don’t find it sad.
double ;/
Hey Ryan,
I didn't see any mention of Mantle on the Battlefield 4 page of the review. Did you get a chance to test it? I'm curious if it would have given the Radeons a performance boost, especially ye olde 290X.
Mantle is dead. AMD killed off development of it. No reason to show results for it anymore.
I hope in your next test setup you replace GRID 2, Skyrim, and BioShock Infinite with The Witcher 3, the upcoming Arkham Knight, and Project CARS. Those are the games that will make a high-end GPU really sweat. Also, please add another graph that shows the average and minimum frame rates; it's much easier to read than a bunch of squiggly lines.
The reason to use older games: the likelihood a game will get an update that changes performance is almost nothing, so it keeps a level playing field when testing new cards.
I intend to change my GPU. After carefully reading your article, I find it illogical to buy the NVIDIA 980 or 980 Ti.
The R9 290X is $300 less and has practically the same performance; it even has more than a teraflop over the NVIDIA 980.
Why would someone use 4x MSAA at 4K in GTA V? It's an fps killer and a VRAM hog, and at that resolution it is NOT needed.
You can get a high framerate at 4K with SLI 980s rather than just the Ti and Titan X. I know this because people do it; they don't waste frames on MSAA.
The advanced graphics settings are also excessive and just a waste of fps.
Hi guys,
I have an SLI system with GTX 670 FTW 4GB cards. Will I see a big difference in performance if I replace the SLI with a 980 Ti?
Also, I read the specs of the GTX 670 FTW and the maximum digital resolution listed is 2560×1600, yet I actually run games at 4K. Is that because of the SLI configuration I have?
Thank you in advance!
Sorry, for the first line I meant to say:
I have an SLI system with GTX 670 FTW 4GB cards; will I notice a big difference in performance if I replace the SLI configuration with a single 980 Ti?
The GTX 690 is a great card for those who bought it in its day. Regarding the frame buffer and the aging tech, how does it stack up in these benchmarks? What are the trade-offs ("it's better in this area but not in that one")?
Since the 980 Ti has two compute units disabled, does it have the same memory nuance as the 970?
Nice review. It is really funny that a few months ago everyone was yelling at me for buying a GPU with 4GB of memory; everyone was saying "who needs 4GB of memory?"
8GB of graphics memory should already be the norm; 2160p really needs much more than 4GB. Also, hopefully games will start loading all the textures for a frame at once instead of using the ugly texture-streaming method most games rely on because of the 3GB limit on most cards (thanks NVIDIA).
Good review; as a 680 owner I quite appreciate the comparison to my particular card. Please include a poor, ancient 680 in 18 months when the GTX 1080 or whatever they name it is released.