Flagship Performance Gets Cheaper
NVIDIA has lowered the cost of entry for GP102-class gaming performance. What does $699 get you today?
UPDATE! If you missed our launch day live stream, you can find the replay below:
It’s a very interesting time in the world of PC gaming hardware. We just saw the release of AMD’s Ryzen processor platform, which shook up the processor market for the first time in a decade; AMD’s Vega architecture has been given its official branding; and anticipation for the first competitive high-end part from AMD since Hawaii continues to grow. AMD was seemingly able to take advantage of Intel’s slow pace of innovation on the processor side, and it is hoping to do the same to NVIDIA on the GPU. NVIDIA’s product line has dominated the mid-range and high-end gaming market since the 900-series, with the 10-series products further cementing that lead.
The most recent high end graphics card release came in the form of the updated Titan X based on the Pascal architecture. That was WAY back in August of 2016 – a full seven months ago! Since then we have seen very little change at the top end of the product lines and what little change we did see came from board vendors adding in technology and variation on the GTX 10-series.
Today we see the release of the new GeForce GTX 1080 Ti, a card that offers only a handful of noteworthy technological changes but still manages to shake up the market: it resets pricing to make flagship performance more appealing while lowering the price of everything else.
The GTX 1080 Ti GP102 GPU
I already wrote about the specifications of the GPU in the GTX 1080 Ti when it was announced last week, so here’s a simple recap.
| | GTX 1080 Ti | Titan X (Pascal) | GTX 1080 | GTX 980 Ti | TITAN X | GTX 980 | R9 Fury X | R9 Fury | R9 Nano |
|---|---|---|---|---|---|---|---|---|---|
GPU | GP102 | GP102 | GP104 | GM200 | GM200 | GM204 | Fiji XT | Fiji Pro | Fiji XT |
GPU Cores | 3584 | 3584 | 2560 | 2816 | 3072 | 2048 | 4096 | 3584 | 4096 |
Base Clock | 1480 MHz | 1417 MHz | 1607 MHz | 1000 MHz | 1000 MHz | 1126 MHz | 1050 MHz | 1000 MHz | up to 1000 MHz |
Boost Clock | 1582 MHz | 1480 MHz | 1733 MHz | 1076 MHz | 1089 MHz | 1216 MHz | – | – | – |
Texture Units | 224 | 224 | 160 | 176 | 192 | 128 | 256 | 224 | 256 |
ROP Units | 88 | 96 | 64 | 96 | 96 | 64 | 64 | 64 | 64 |
Memory | 11GB | 12GB | 8GB | 6GB | 12GB | 4GB | 4GB | 4GB | 4GB |
Memory Clock | 11000 MHz | 10000 MHz | 10000 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 500 MHz | 500 MHz | 500 MHz |
Memory Interface | 352-bit | 384-bit G5X | 256-bit G5X | 384-bit | 384-bit | 256-bit | 4096-bit (HBM) | 4096-bit (HBM) | 4096-bit (HBM) |
Memory Bandwidth | 484 GB/s | 480 GB/s | 320 GB/s | 336 GB/s | 336 GB/s | 224 GB/s | 512 GB/s | 512 GB/s | 512 GB/s |
TDP | 250 watts | 250 watts | 180 watts | 250 watts | 250 watts | 165 watts | 275 watts | 275 watts | 175 watts |
Peak Compute | 10.6 TFLOPS | 10.1 TFLOPS | 8.2 TFLOPS | 5.63 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 8.60 TFLOPS | 7.20 TFLOPS | 8.19 TFLOPS |
Transistor Count | 12.0B | 12.0B | 7.2B | 8.0B | 8.0B | 5.2B | 8.9B | 8.9B | 8.9B |
Process Tech | 16nm | 16nm | 16nm | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm |
MSRP (current) | $699 | $1,200 | $599 | $649 | $999 | $499 | $649 | $549 | $499 |
The GTX 1080 Ti looks a whole lot like the TITAN X launched in August of last year. Based on the 12B-transistor GP102 chip, the new GTX 1080 Ti has 3,584 CUDA cores with a 1.60 GHz Boost clock. That gives it the same processor count as the Titan X but with a slightly higher clock speed, which should make the new GTX 1080 Ti faster by at least a few percentage points; it holds a 4.7% edge in base clock compute capability. It has 28 SMs, 28 geometry units, and 224 texture units.
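As a quick sanity check on the peak compute numbers in the table above, FP32 throughput for these parts works out to cores × 2 FLOPs per clock (one fused multiply-add) × clock speed. A minimal sketch, using base-clock values from the spec table:

```python
# Peak FP32 throughput = CUDA cores × 2 FLOPs/clock (fused multiply-add) × clock.
# Core counts and base clocks are taken from the spec table above.
def peak_tflops(cores: int, clock_mhz: int) -> float:
    return cores * 2 * clock_mhz * 1e6 / 1e12

print(round(peak_tflops(3584, 1480), 1))  # GTX 1080 Ti → 10.6
print(round(peak_tflops(2560, 1607), 1))  # GTX 1080    → 8.2
```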
Interestingly, the memory system on the GTX 1080 Ti gets adjusted – NVIDIA has disabled a single 32-bit memory controller to give the card a 352-bit wide bus and an odd-sounding 11GB memory capacity. The ROP count also drops to 88 units. Speaking of 11, the memory clock on the G5X implementation on the GTX 1080 Ti will now run at 11 Gbps, a boost available to NVIDIA thanks to a chip revision from Micron and improvements to equalization and reverse signal distortion.
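The 484 GB/s figure in the table follows directly from the narrower bus and the faster memory; a quick sketch of the arithmetic:

```python
# Memory bandwidth = (bus width in bits / 8 bits per byte) × per-pin data rate.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(352, 11))  # GTX 1080 Ti: 352-bit @ 11 Gbps → 484.0 GB/s
print(bandwidth_gb_s(384, 10))  # Titan X (Pascal): 384-bit @ 10 Gbps → 480.0 GB/s
```

The faster 11 Gbps memory more than makes up for the missing 32-bit controller, which is why the 1080 Ti edges out the Titan X in total bandwidth despite the narrower bus.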
The move from 12GB of memory on the GP102-based Titan X to 11GB on the GTX 1080 Ti is an interesting one, and evokes memories of the GTX 970 fiasco, where NVIDIA disabled a portion of that chip's memory controller but left the memory that would have resided on it ON the board. Shipping a card where 3.5GB of memory ran at one speed and 500MB at another was the wrong move to make, though releasing the GTX 970 advertised with "3.5GB" of memory would have seemed odd too. NVIDIA is not making the same mistake here, instead building the GTX 1080 Ti with 11GB out of the gate.
NVIDIA spent time at the tech day detailing its tiled caching rendering method (that it has been using without disclosure since the 900-series launch) in an attempt to demonstrate the ability of GDDR5X at high speeds with high compression. Part of the reasoning for finally divulging this information is to counter the idea that HBM2 will have a fundamental advantage over G5/G5X technologies. With Vega coming mid-year and highly touting the inclusion of HBM2 and its HBCC, NVIDIA is getting ahead of the curve on messaging.
The TDP of the new part is 250 watts, identical to the Titan product. The cooler has been improved compared to the GTX 1080, offering quieter fan speeds and lower temperatures when operating at the same power envelope. Performance estimates from NVIDIA put the GTX 1080 Ti about 35% faster than the GTX 1080, the largest "kicker performance increase" that we have seen from a flagship Ti launch.
Pricing is going to be set at $699 so don't expect to find this in any budget builds. But for the top performing GeForce card on the market, that's what we expect. Though we don't want to belittle a cost like this, the truth is NVIDIA undercut what many in the industry expected this card would cost in an attempt to prepare for the battle with Vega later in the summer. The question that will need to be answered by AMD is whether its new product line will be able to match or beat the best that NVIDIA puts forward.
The GeForce GTX 1080 Ti Graphics Card
To the surprise of no one, the primary design of the GTX 1080 Ti shares a lot with the GTX 1080 and Titan X cards launched last year in style, function, and cooling capability. The Founders Edition, as NVIDIA continues to call its reference/early card builds, remains in place but with a very important change: the base MSRP of the card and the Founders Edition price will be the same. This means the complaints we had with the 10-series launch and the debut of the Founders Edition pricing structure should be put to rest.
From a cooling and power design perspective, the GTX 1080 Ti shares more in common with the latest Titan X than with the GTX 1080. This includes a larger vapor chamber for improved cooling performance at the same or lower noise levels, and a 7-phase power design with 2x dual-FET components capable of supplying 250 amps of current.
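To put the 250 amp figure in context: GPU cores run at roughly 1 V, so by P = V × I the VRM's current capability lines up with the card's 250 watt TDP. A rough sketch, with the ~1 V core voltage being our assumption rather than a published spec:

```python
# P = V × I: at a core voltage near 1 V (an assumed typical value, not a
# published spec), 250 A of VRM capability corresponds to roughly the
# card's 250 W TDP.
core_voltage_v = 1.0
vrm_current_a = 250
print(core_voltage_v * vrm_current_a)  # → 250.0 W
```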
The back plate and base plate on the card are functional for cooling, and also add structural integrity for systems that are moved after the card is installed.
This does mark the first time we have seen a consumer GeForce graphics card from NVIDIA released without a DVI connection. NVIDIA decided instead to use that space for additional airflow out of the chassis from the radial fan on the blower design. As frequent users of the DVI connection in our internal testing, we find that a letdown, but NVIDIA did include a DisplayPort to DVI adapter in the box to help alleviate any issues. (It’s a single link adapter, limited to 1920×1200…)
Power connectivity requires an 8-pin and a 6-pin connection from the power supply, enough headroom to reach and exceed the 250 watt product TDP.
Small remark on page 11 (Detailed Power Consumption Testing):
Last Light testing at 4K with the Titan X running at a +150 MHz offset (should be 1080 Ti, instead of Titan X).
Other than that – thanks for review.
Saw that too, copy pasta!
interesting power draw from 6-pin & any slow down once we hit peak of 11 GB VRAM?
weird no DOOM, For Honor, Sniper Elite 4 etc benchmark?
Doom isn’t much of a stress test for anything, I think that’s been demonstrated very well by many testing outlets. For Honor and Sniper Elite 4 are pretty damn new, hard to castigate them for not using them
Actually running with Vulkan is good for testing CPU OC stability.
2% faster than GTX 1080 in Grand Theft Auto V and Gears of War at 1440p.
Woooooww!
Those two titles are heavily CPU dependent.
Gears of War isn’t
You know it’s not GOW4 right?
Unreal engine isn’t cpu dependent
IB4 Ryzen testing requests.
Thanks for the 1440p 980 Ti benchmark comparisons for those of us thinking of upgrading from that GPU. Looks like a pretty compelling upgrade.
My one question is the reference PCB layout. Is it identical to Titan X pascal? I like to watercool my GPUs to get the most out of them
yes, it is the same PCB, see the pictures here
250 amps? I’d like to see that power supply!
At 3V, 5V, 12V… to name a few! 🙂
Hopefully, it is at 1V or below:) It was mentioned in the launch live stream too, there must be a mix up somewhere.
“the NVIDIA GeForce GTX 1080 Ti is going to be selling for $699 starting today from both NVIDIA and it’s partners”
But.. but.. it’s not selling today anywhere..
Yeah, looks like NDA date was today, actual launch is tomorrow.
Any word on the 980ti step up to 1080ti?
A bigger chip off the GP102 block, so more performance through more resources. It will be interesting to see what AMD’s Vega will offer in performance, with all of the new features of the Vega micro-arch, relative to the Ti’s more of the same with slightly higher clocks. That $699 price point will not be too difficult for AMD to beat, and Vega will have that NCU, HBCC, and other all-new features to take into account. So it’s the Pascal refresh versus Vega’s new features for AMD’s latest flagship offering when Vega is released in Q2 of 2017.
AMD does have its own more-of-the-same in the mainstream, with the RX 500 series Polaris refresh offerings arriving in April, and Vega following after. So hopefully by the summer Vega will be here.
I hope that games are being tuned for Vega in advance, unlike some of the games that were/are still not tuned for Ryzen and its extra cores at affordable prices.
You realize that by the time Vega is out (may), we will be looking at preliminary Volta information, right?
AMD missed this generation entirely on the high-end.
Good more Volta news to force AMD to get its Navi to market on time! But with Zen/Ryzen and Zen/Naples making some mad revenues for AMD there will be no more excuses for AMD to be late with any flagship SKUs! So the competition moves from the two Ps in the mainstream market to the two Vs in the Flagship market. AMD does not have a flagship P SKU but some dual RX 480 prices deals will be had by some when the RX 500 series P refresh get here for AMD’s mainstream P SKUs along with that V flagship from AMD shortly after.
That Ps and Vs competition will lead to some N and whatever comes after competition as always.
Vega is coming to desktop in the next few months. Desktop Volta is now 1H next year. So at minimum it will be *at least* six months behind Vega, and possibly as much as a year.
Also, AMD will be launching Navi next year. So it won’t be Vega that Volta finds itself up against.
Hate to break it to you, but it’s AMD who are going to leap frog ‘Paxwell’ with Vega this year, and unless Volta is something really special, Navi will keep them ahead.
I’m not really sure how you managed to get things *completely* the wrong way round, but trust me, you have.
march 2017. amd gives out “vega” tshirts. nvidia releases a new enthusiast card. and drops the price on another. #gofigure
you have an error in the summary, “Whether or not they can deliver on those promises has yet to be proven.”, should say ‘deliver on any promises has never been proven’.
Still very upset about Ryzen.
PC Master Race, lets see those Breath of the Wild & Horizon Zero Dawn benches. LOL reviews 2 weeks before launch suckas
You have a knack for scrutiny. Prepositions aren’t sentunzez. Phreiz is 4 needing, dog cat barf, pooped em’ XD
http://www.whatinthehellisthislinkandwhyisitsolongandwhydidyouactuallyspendthetimetoreadallofitXD.biz
Seems there is a typo on the OC page:
‘NVIDIA is slightly aggressive with target clock speeds and voltages ath the rated power targets, and that results in the variance that you see here.’
I think you should do an article on just how much of a lie TDP figures can be.
250W part? Might spike to 20% higher than that rating.
I think this is something that has real consequences. For supercomputers and other systems that have to last and be run at full load with minimal amounts of failures, this is unacceptable because it shortens the bathtub curve somewhat unpredictably.
Consumer GPUs with boost features are made to squeeze every last bit of performance out of them, rather than last long and be reliable. Hopefully the board makers compensate, and hopefully their EDA, electromigration simulations and emulation when designing the chips treat the TDP as a complete lie as well.
Power virus games with shitty ScaleformUI menus could very well have a GPU like this sitting above its TDP long enough to probably cause some electromigration. It’s really no wonder that larger GPU VRMs are a common failure.
I think the key to TDP is that it is THERMAL DESIGN POWER. Not Total Input Power. As long as the short term average power stays below the rated TDP, then substantial brief spikes above the TDP are ok since the thermal cooling capability is buffered by the thermal mass and dissipation power of the cooler.
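The distinction this comment draws, brief spikes versus the sustained average the cooler must handle, can be illustrated with a toy example; the sample values below are made up purely for illustration:

```python
# Hypothetical instantaneous power samples (watts) over a short window.
# Individual samples spike well past the 250 W TDP, but the average the
# cooler must actually dissipate stays within it.
samples_w = [230, 310, 215, 290, 220, 235, 300, 200]
tdp_w = 250

avg_w = sum(samples_w) / len(samples_w)
print(max(samples_w) > tdp_w)  # → True: brief spikes exceed TDP
print(avg_w <= tdp_w)          # → True: the average stays at or below TDP
```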
Why not Doom test?
awesome
108% faster than Fury’s in AMD sponsored Hitman 4k. Wow!!!!!
37% better than 1080 in GTA5 4k.
26% faster than 1080 Gears of War 4k.
Performance at 1440 may be the result of both the Ti and the 1080 hitting engine limits, not that the Ti is only 2% faster.
This card is designed for 4K; isn’t this what AMD users are always bragging about?
Anyways I’m looking for a great 4k card and this looks like a winner.
Good review PCPer.
great review.
WHY in your testing results do you not show any results for multiple monitors? With all the connectors on the GPU, why don’t you do any testing using them?
FCAT incompatibility unfortunately. We are working on it with vendors, so hopefully we will be able to do this soon as we do want to provide that info in reviews.
I am very envious, looks amazing
Seems to be the best card ATM! For quality and price, gtx 1080 ti FTW!