Flagship Performance Gets Cheaper
NVIDIA has lowered the cost of entry for GP102 graphics performance for gaming. What does $699 get you today?
It’s a very interesting time in the world of PC gaming hardware. We just saw the release of AMD’s Ryzen processor platform, which shook up the CPU market for the first time in a decade; AMD has confirmed that the Vega codename will carry over as the brand name for its upcoming consumer GPUs; and anticipation for the company's first competitive high-end graphics part since Hawaii continues to grow. AMD was seemingly able to take advantage of Intel’s slow pace of innovation on the processor side, and it is hoping to do the same to NVIDIA on the GPU side. NVIDIA’s product line has dominated the mid-range and high-end gaming market since the 900-series, with the 10-series products further cementing that lead.
The most recent high-end graphics card release came in the form of the updated Titan X, based on the Pascal architecture. That was WAY back in August of 2016, a full seven months ago! Since then we have seen very little change at the top end of the product stack, and what little change we did see came from board vendors adding technology and variation to the GTX 10-series.
Today we see the release of the new GeForce GTX 1080 Ti, a card that offers only a handful of noteworthy technological changes but still manages to shake up the market: it makes flagship performance more affordable and pushes down the price of everything below it.
The GTX 1080 Ti GP102 GPU
I already wrote about the specifications of the GPU in the GTX 1080 Ti when it was announced last week, so here’s a simple recap.
| | GTX 1080 Ti | Titan X (Pascal) | GTX 1080 | GTX 980 Ti | TITAN X | GTX 980 | R9 Fury X | R9 Fury | R9 Nano |
|---|---|---|---|---|---|---|---|---|---|
| GPU | GP102 | GP102 | GP104 | GM200 | GM200 | GM204 | Fiji XT | Fiji Pro | Fiji XT |
| GPU Cores | 3584 | 3584 | 2560 | 2816 | 3072 | 2048 | 4096 | 3584 | 4096 |
| Base Clock | 1480 MHz | 1417 MHz | 1607 MHz | 1000 MHz | 1000 MHz | 1126 MHz | 1050 MHz | 1000 MHz | up to 1000 MHz |
| Boost Clock | 1582 MHz | 1480 MHz | 1733 MHz | 1076 MHz | 1089 MHz | 1216 MHz | – | – | – |
| Texture Units | 224 | 224 | 160 | 176 | 192 | 128 | 256 | 224 | 256 |
| ROP Units | 88 | 96 | 64 | 96 | 96 | 64 | 64 | 64 | 64 |
| Memory | 11GB | 12GB | 8GB | 6GB | 12GB | 4GB | 4GB | 4GB | 4GB |
| Memory Clock | 11000 MHz | 10000 MHz | 10000 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 500 MHz | 500 MHz | 500 MHz |
| Memory Interface | 352-bit | 384-bit G5X | 256-bit G5X | 384-bit | 384-bit | 256-bit | 4096-bit (HBM) | 4096-bit (HBM) | 4096-bit (HBM) |
| Memory Bandwidth | 484 GB/s | 480 GB/s | 320 GB/s | 336 GB/s | 336 GB/s | 224 GB/s | 512 GB/s | 512 GB/s | 512 GB/s |
| TDP | 250 watts | 250 watts | 180 watts | 250 watts | 250 watts | 165 watts | 275 watts | 275 watts | 175 watts |
| Peak Compute | 10.6 TFLOPS | 10.1 TFLOPS | 8.2 TFLOPS | 5.63 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 8.60 TFLOPS | 7.20 TFLOPS | 8.19 TFLOPS |
| Transistor Count | 12.0B | 12.0B | 7.2B | 8.0B | 8.0B | 5.2B | 8.9B | 8.9B | 8.9B |
| Process Tech | 16nm | 16nm | 16nm | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm |
| MSRP (current) | $699 | $1,200 | $599 | $649 | $999 | $499 | $649 | $549 | $499 |
The GTX 1080 Ti looks a whole lot like the TITAN X launched in August of last year. Based on the 12B-transistor GP102 chip, the new GTX 1080 Ti has 3,584 CUDA cores with a 1.60 GHz Boost clock. That gives it the same CUDA core count as the Titan X but a slightly higher clock speed, which should make the new GTX 1080 Ti a few percentage points faster, with roughly a 4-5% edge in base-clock compute capability (the quick math is below). The GPU is built from 28 SMs, with 28 geometry units and 224 texture units.
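For anyone who wants to check that claim, here is a quick back-of-the-envelope calculation using the standard convention of 2 FLOPs (one fused multiply-add) per CUDA core per clock; the results land right next to the Peak Compute figures in the table above.

```python
# Peak FP32 throughput = CUDA cores x 2 FLOPs per clock (FMA) x clock speed
def peak_tflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

gtx_1080_ti = peak_tflops(3584, 1480)  # ~10.6 TFLOPS at base clock
titan_x_p   = peak_tflops(3584, 1417)  # ~10.2 TFLOPS at base clock

print(f"GTX 1080 Ti: {gtx_1080_ti:.2f} TFLOPS")
print(f"Titan X (P): {titan_x_p:.2f} TFLOPS")
print(f"Base-clock edge: {(gtx_1080_ti / titan_x_p - 1) * 100:.1f}%")  # ~4.4%
```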
Interestingly, the memory system on the GTX 1080 Ti gets adjusted: NVIDIA has disabled a single 32-bit memory controller, giving the card a 352-bit wide bus and an odd-sounding 11GB memory capacity. The ROP count also drops to 88 units. Speaking of 11, the GDDR5X memory on the GTX 1080 Ti now runs at 11 Gbps, a boost made possible by a chip revision from Micron and improvements to equalization that cut down on signal distortion.
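The numbers fall out of simple arithmetic: eleven active 32-bit controllers give the 352-bit bus, each controller is backed by 1GB of G5X, and 11 Gbps per pin across that bus works out to the quoted bandwidth.

```python
# GDDR5X bandwidth/capacity math for the trimmed GP102 memory system
active_controllers = 11                        # one of GP102's twelve 32-bit controllers is disabled
bus_width_bits     = active_controllers * 32   # 352-bit aggregate bus
data_rate_gbps     = 11                        # 11 Gbps per pin on the revised Micron G5X

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps   # bits -> bytes, then scale by data rate
capacity_gb    = active_controllers * 1                # 1GB of G5X behind each controller

print(f"Bus width: {bus_width_bits}-bit")        # 352-bit
print(f"Bandwidth: {bandwidth_gb_s:.0f} GB/s")   # 484 GB/s
print(f"Capacity:  {capacity_gb} GB")            # the odd-sounding 11GB
```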
The move from 12GB of memory on the GP102-based Titan X to 11GB on the GTX 1080 Ti is an interesting one, and it evokes memories of the GTX 970 fiasco, where NVIDIA disabled a portion of that chip's memory controller but left the memory that would have resided on it on the board. The result, 3.5GB of memory at one speed and 500MB at another, was the wrong move to make, though releasing the GTX 970 with "3.5GB" of memory would have seemed odd too. NVIDIA is not making the same mistake twice, instead building the GTX 1080 Ti with 11GB out of the gate.
NVIDIA spent time at its tech day detailing the tiled caching rendering method it has been using, without disclosure, since the 900-series launch, in an attempt to demonstrate how capable GDDR5X is at high speeds when paired with aggressive compression. Part of the reason for finally divulging this information is to counter the idea that HBM2 has a fundamental advantage over G5/G5X technologies. With Vega coming mid-year and heavily touting the inclusion of HBM2 and its High Bandwidth Cache Controller (HBCC), NVIDIA is getting ahead of the curve on messaging.
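To illustrate the idea in the abstract (this is a conceptual sketch, not NVIDIA's actual hardware implementation), tiled caching bins geometry into screen-space tiles and shades one tile's worth of pixels at a time, so a tile's color and depth data can stay resident in on-chip L2 cache instead of bouncing off DRAM for every triangle:

```python
# Conceptual sketch of a tiled/binned rasterizer (illustrative only)
TILE = 16  # tile size in pixels; real hardware picks its own dimensions

def tiles_touched(bbox, screen_w, screen_h):
    """Yield (tx, ty) coordinates of every screen tile a bounding box overlaps."""
    x0, y0, x1, y1 = bbox
    tx0, ty0 = max(0, x0) // TILE, max(0, y0) // TILE
    tx1, ty1 = min(screen_w - 1, x1) // TILE, min(screen_h - 1, y1) // TILE
    for ty in range(ty0, ty1 + 1):
        for tx in range(tx0, tx1 + 1):
            yield (tx, ty)

def render_batch(triangles, screen_w, screen_h, shade_tile):
    # Pass 1 (bin): record which triangles touch each tile.
    bins = {}
    for tri in triangles:
        for tile in tiles_touched(tri["bbox"], screen_w, screen_h):
            bins.setdefault(tile, []).append(tri)
    # Pass 2 (shade): process tile by tile, so each tile's framebuffer chunk
    # is reused repeatedly while it is still cached, and DRAM sees roughly
    # one write per tile rather than one read-modify-write per triangle.
    for tile, tris in bins.items():
        shade_tile(tile, tris)
```

The practical upshot is fewer trips to GDDR5X, which is exactly the argument NVIDIA is making against raw HBM2 bandwidth figures.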
The TDP of the new part is 250 watts, identical to the Titan product. The cooler has been improved compared to the GTX 1080, offering quieter fan speeds and lower temperatures when operating at the same power envelope. Performance estimates from NVIDIA put the GTX 1080 Ti about 35% faster than the GTX 1080, the largest "kicker" performance increase we have seen from a flagship Ti launch.
Pricing is set at $699, so don't expect to find this card in any budget builds, but for the top-performing GeForce card on the market, that's what we expect. And while we don't want to downplay a cost like this, the truth is that NVIDIA undercut what many in the industry expected this card would cost, in an attempt to prepare for the battle with Vega later in the summer. The question AMD will need to answer is whether its new product line can match or beat the best NVIDIA puts forward.
The GeForce GTX 1080 Ti Graphics Card
To the surprise of no one, the primary design of the GTX 1080 Ti shares a lot with the GTX 1080 and the Titan X cards launched last year in style, function, and cooling capability. The Founders Edition, as NVIDIA continues to call its reference/early card builds, remains in place but with a very important change: the base MSRP of the card and the Founders Edition price will be the same. That means the complaints we had with the 10-series launch and the debut of the Founders Edition system and pricing should be put to rest.
From a cooling and power design perspective, the GTX 1080 Ti shares more in common with the latest Titan X than with the GTX 1080. This includes a larger vapor chamber for improved cooling performance at the same or lower noise levels, and a 7-phase power design with doubled-up dual-FETs per phase that is capable of delivering 250 amps.
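Some rough napkin math on that power design (the ~1.0 V core voltage below is an assumption for illustration, not a figure from NVIDIA, and actual voltage varies with clocks and load):

```python
# Rough VRM headroom estimate for the 7-phase, 250 A design
phases         = 7
max_current_a  = 250    # quoted capability of the dual-FET power design
core_voltage_v = 1.0    # ASSUMPTION for illustration only

print(f"Current per phase: {max_current_a / phases:.0f} A")   # ~36 A per phase
print(f"Deliverable power: {max_current_a * core_voltage_v:.0f} W at {core_voltage_v} V")
```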
The back plate and base plate on the card are functional for cooling as well as for structural rigidity in systems that are moved after the card is installed.
This does mark the first time we have seen a consumer GeForce graphics card from NVIDIA released without a DVI connection. NVIDIA decided instead to use that space for additional airflow out of the chassis from the radial fan on the blower design. As frequent users of the DVI connection in our internal testing, that's a letdown, but NVIDIA does include a DisplayPort to DVI adapter in the box to help alleviate any issues. (It's a single-link adapter, limited to 1920×1200…)
Power connectivity requires an 8-pin and a 6-pin connection from the power supply, enough to cover the 250 watt product TDP with room to spare.
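The connector choice checks out against the PCI Express power specifications: 75 watts from the slot, 75 watts from the 6-pin, and 150 watts from the 8-pin add up to a 300 watt budget, roughly 50 watts of headroom over the board's TDP.

```python
# PCI Express power-budget math for the 6-pin + 8-pin configuration
SLOT_W      = 75    # PCIe x16 slot, per spec
SIX_PIN_W   = 75    # 6-pin PEG connector, per spec
EIGHT_PIN_W = 150   # 8-pin PEG connector, per spec

budget_w = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
tdp_w    = 250

print(f"Spec power budget: {budget_w} W")            # 300 W
print(f"Headroom over TDP: {budget_w - tdp_w} W")    # 50 W for boost and overclocking
```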
That power draw is much better distributed than the mess that was the stock RX 480 at launch. Still, I’d wait for a dual 8-pin version from an OEM!
I’m curious how you measure power through PCI-E. Measuring voltage is easy enough, but do you use a precision resistor and an op-amp to measure the current?
Hey Ryan, just wondering if you could share the exact settings you used for the Unigine Heaven bench? It says 1440p Ultra, but what about tessellation (I imagine that’s maxed out, of course) or, more importantly, antialiasing? Only asking because, compared to your 980 score, mine is a few fps lower with 8xAA, though my card is heavily OC’d and scores significantly higher in 3DMark. Just trying to get as perfect a comparison as possible 🙂
Any word on whether that DisplayPort adapter is active or passive?
Get a new monitor that has DisplayPort 1.2 or higher. Problem solved. Why would you waste money buying an adapter for an older, inferior monitor?
I have 4 displays, I’d like to use all of them and am wondering whether I’d need to buy another adapter or use the one supplied with the card. What kind of pleb only runs one display? Do you even know how to PC Master Race?
Meh, my Titan X performs a bit better as it will hit 2100 MHz on the GPU. The only real advantage here is the price point. For me, it’s a ‘no thanks’. I’ll wait for Volta to be released this fall 😉
I’m running an 8700K at 4.9 GHz + 16GB RAM + ASUS GTX 1080 Ti Turbo (OC) at 2560×1440.
Same settings as the review.
The in-game benchmark always stays at around 90 FPS and no more.
Any idea why?