A Beautiful Graphics Card
It was a surprise when it was announced, and a surprise when it showed up at our door. How about a review?
To the surprise of nearly everyone, on July 21st NVIDIA announced the new Titan X graphics card, based on the brand new GP102 Pascal GPU. Though it shares a name, for some unexplained reason, with the Maxwell-based Titan X launched in March of 2015, this card is a significant performance upgrade. Using the largest consumer-facing Pascal GPU to date (with only the GP100 used in the Tesla P100 exceeding it), the new Titan X is going to be a very expensive, and very fast, gaming card.
As has been the case since the introduction of the Titan brand, NVIDIA claims this card is for gamers that want the very best in graphics hardware as well as for developers that need an ultra-powerful GPGPU device. GP102 does not integrate improved FP64 / double precision compute cores, so we are essentially looking at an upgraded and expanded GP104 Pascal chip. That's nothing to sneeze at, of course, and you can see in the specifications below why we expected (and can now show you) that the Titan X (Pascal) is a gaming monster.
| | Titan X (Pascal) | GTX 1080 | GTX 980 Ti | TITAN X | GTX 980 | R9 Fury X | R9 Fury | R9 Nano | R9 390X |
|---|---|---|---|---|---|---|---|---|---|
| GPU | GP102 | GP104 | GM200 | GM200 | GM204 | Fiji XT | Fiji Pro | Fiji XT | Hawaii XT |
| GPU Cores | 3584 | 2560 | 2816 | 3072 | 2048 | 4096 | 3584 | 4096 | 2816 |
| Rated Clock | 1417 MHz | 1607 MHz | 1000 MHz | 1000 MHz | 1126 MHz | 1050 MHz | 1000 MHz | up to 1000 MHz | 1050 MHz |
| Texture Units | 224 | 160 | 176 | 192 | 128 | 256 | 224 | 256 | 176 |
| ROP Units | 96 | 64 | 96 | 96 | 64 | 64 | 64 | 64 | 64 |
| Memory | 12GB | 8GB | 6GB | 12GB | 4GB | 4GB | 4GB | 4GB | 8GB |
| Memory Clock | 10000 MHz | 10000 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 500 MHz | 500 MHz | 500 MHz | 6000 MHz |
| Memory Interface | 384-bit G5X | 256-bit G5X | 384-bit | 384-bit | 256-bit | 4096-bit (HBM) | 4096-bit (HBM) | 4096-bit (HBM) | 512-bit |
| Memory Bandwidth | 480 GB/s | 320 GB/s | 336 GB/s | 336 GB/s | 224 GB/s | 512 GB/s | 512 GB/s | 512 GB/s | 320 GB/s |
| TDP | 250 watts | 180 watts | 250 watts | 250 watts | 165 watts | 275 watts | 275 watts | 175 watts | 275 watts |
| Peak Compute | 11.0 TFLOPS | 8.2 TFLOPS | 5.63 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 8.60 TFLOPS | 7.20 TFLOPS | 8.19 TFLOPS | 5.91 TFLOPS |
| Transistor Count | 12.0B | 7.2B | 8.0B | 8.0B | 5.2B | 8.9B | 8.9B | 8.9B | 6.2B |
| Process Tech | 16nm | 16nm | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm |
| MSRP (current) | $1,200 | $599 | $649 | $999 | $499 | $649 | $549 | $499 | $329 |
GP102 features 40% more CUDA cores than GP104, at slightly lower clock speeds. The rated 11 TFLOPS of single precision compute of the new Titan X is 34% higher than that of the GeForce GTX 1080, and I would expect gaming performance to scale roughly in line with that difference.
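As a quick sanity check, peak single precision compute is just cores × 2 FMA operations per clock × clock speed. A minimal sketch of the math (the 1531 MHz figure is NVIDIA's rated boost clock for the new Titan X, which is what the 11 TFLOPS rating assumes):

```python
# Peak FP32 throughput: cores x 2 ops/clock (fused multiply-add) x clock speed
def peak_tflops(cores: int, clock_mhz: float) -> float:
    return cores * 2 * clock_mhz * 1e6 / 1e12

titan_xp = peak_tflops(3584, 1531)  # ~11.0 TFLOPS (1531 MHz rated boost)
gtx_1080 = peak_tflops(2560, 1607)  # ~8.2 TFLOPS (rated clock from the table)

print(f"Titan X (Pascal): {titan_xp:.1f} TFLOPS")
print(f"GTX 1080:         {gtx_1080:.1f} TFLOPS")
print(f"Difference:       {(titan_xp / gtx_1080 - 1) * 100:.0f}%")  # ~33-34%
```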
Titan X (Pascal) does not utilize the full GP102 GPU; the recently announced Quadro P6000 does, however, which gives that card a CUDA core count of 3,840 (256 more than Titan X).
A full GP102 GPU
The new Titan X effectively gives up about 7% of the complete GPU's compute capability (3,584 of 3,840 cores), although that cut likely helps increase available clock headroom and yields.
The new Titan X features 12GB of GDDR5X memory, not the HBM2 found on the GP100-based Tesla P100, so this is clearly a distinct chip with its own memory interface. NVIDIA claims 480 GB/s of bandwidth on a 384-bit memory interface running at the same 10 Gbps as the GTX 1080.
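That bandwidth figure falls straight out of the interface width and data rate; a minimal sketch:

```python
# Memory bandwidth: bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gb_s(384, 10.0))  # Titan X (Pascal): 480.0 GB/s
print(bandwidth_gb_s(256, 10.0))  # GTX 1080: 320.0 GB/s
```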
Other than these changes, and the corresponding increases in texture unit and ROP counts, there really isn't anything architecturally different in the Pascal-based Titan X compared to a GeForce GTX 1080. Just more, better, and faster. If you are new to NVIDIA's latest Pascal architecture, its product features, and what the move to 16nm nets them, you should definitely read our GeForce GTX 1080 review, which covers all of that!
What will you be asked to pay for this performance? $1,200. The card goes on sale today, and only on NVIDIA.com, at least for now. Considering the prices of GeForce GTX 1080 cards with such limited availability, the $1,200 price tag might not seem so insane. Still, it is higher than the $999 starting price of the Maxwell-based Titan X in March of 2015, so the claims that NVIDIA is artificially raising prices in each segment will continue, it seems.
The NVIDIA Titan X (Pascal) Graphics Card
Our time with the new Titan X was short, as our team prepares for a three-week whirlwind of events, but we wanted to get a quick review of this beast out the door ASAP.
The new Titan X features the same design language introduced with the GTX 1080, a riff on the now-aging design of NVIDIA reference products. This includes a blower-style cooler with an illuminated GeForce GTX logo along the top of the card (interestingly, one of only a few places I see referencing GeForce on this product) and a window to see the heatsink under the shroud.
Rotating the card around to the back, we find a full-cover backplate on the Titan X, with a removable segment on the back half that you can take off to improve airflow to an adjacent graphics card in SLI. The backplate even has a custom Titan X stamp on it.
Though the shroud design is shared with the GTX 1080, the Titan X goes with a blacked-out color scheme and a chrome "TITAN X" logo along the front.
Display connectivity remains unchanged: three full-size DisplayPort connections, one HDMI 2.0b, and a dual-link DVI connection for legacy displays.
With a 250 watt TDP, the card includes both a 6-pin and an 8-pin external power connection. Combined with the 75 watts available from the PCI Express slot, that is a 300 watt budget (75 + 75 + 150), more than enough to hit 250 watts while still allowing the card to draw as much as 300 watts when overclocked.
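A quick check of that ceiling, using the standard PCI Express power limits:

```python
# Standard PCI Express power limits, in watts: slot, 6-pin, 8-pin
slot, six_pin, eight_pin = 75, 75, 150
print(slot + six_pin + eight_pin)  # 300 W available in total when overclocked
```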
Titan X (Pascal) includes a pair of SLI fingers to support the new high-bandwidth SLI HB bridges, though still only in 2-way SLI officially.
Please consider testing ARK
Please consider testing ARK: Survival Evolved in the future.. it is now in the top 5 most-played games on Steam, and with its Unreal Engine 4 it crushes video cards.. even an overclocked 980 Ti won't maintain 60 fps at 1920×1080 at max settings.
Considering that its early
Considering that it's early access (probably forever) and totally broken, it's not a good benchmark.
It is a good power virus that doesn't get throttled, though. It also maintains its arbitrary 100% GPU load even when Alt-Tabbed.
Fair — Yes Early access
Fair — yes, early access isn't ideal.. and it's not perfectly optimized.. but it is a very popular next-generation title that people are playing for hundreds and even thousands of hours. They've also been working with Nvidia, AMD, and Microsoft a lot to get DX12 up and running (which has not happened yet, unfortunately).
It IS a much more complete game than almost all other survival games out there, despite being early access..
I wouldn't say it's broken, though. If you have a good (fast) server and good PCs to play on, you don't experience the hitching or other issues. The 100% GPU load just means it's rendering in the background. If you VSYNC it and your video card is faster than 60 fps, you'll see less than 100% load.
I’d like to see it since it
I'd like to see it, since it's one of the few games I consistently play besides GTA V and Kerbal. Sure, it can be a bit buggy, but it looks fantastic on the Unreal 4 engine, and like John said above, DX12 is coming soon. It swamps my 970 at ultra, so I drop the settings down a bit, taking some advice from Sapphire Ed. Plus, if they wanted to, they could just host a local game and not involve any networking.
Pretty sure its just that
Pretty sure it's just that the Autodesk Scaleform UI is a power virus, actually.
is it hard and big work for
Is it that much work to just show the FPS for each GPU? Those line graphs alone aren't a good way to show it. So why not show FPS as well??
There not gonna be a whole
There's not gonna be a whole Titan X video episode with Tom? :p
Doesn’t appear so…I’m sure
Doesn't appear so… I'm sure you were just fishing for a contest to win a free one? 🙂
Busted!
No seriously, it’s
Busted!
No seriously, it's always nice to talk with Tom; he is really technically prepared, unlike most of the PR guys. He has been a microprocessor designer and owns some IP too.
I always enjoy the PCPer interviews
Why are there no 5K tests
Why are there no 5K tests, please?
Because 5K is totally
Because 5K is totally irrelevant and is never going to be widely adopted. Rec. 2020 covers only 4K and 8K.
In my opinion the price is a
In my opinion the price is a bit steep for a crippled GP102. My guess is that, depending on improved yields and, more importantly, what AMD brings to the table, we're gonna see a Titan X Black or similar (if history is anything to go by) with a fully enabled GP102, and possibly a somewhat higher core frequency than the current one. I dunno if it's just me, but when I "know" there is a fully working GP102 with more performance out there, and they didn't use it for their premium gaming card, it just puts me off. I would never consider getting one.
Amd has a clock problem, even
AMD has a clock problem; even if they released a high-end GPU on 14nm, they would still be way behind Nvidia because their architecture cannot hit those high clocks. HBM2 will not solve this problem; AMD's only hope is the new APIs. This is coming from someone with an R9 390X CrossFire setup. AMD needs a Hail Mary card, and they needed it yesterday.
High clocks do not equate to
High clocks do not equate to high performance; just look at AMD's DX12/Vulkan benchmarks at those lower clocks on the RX 480 compared to the GTX 1060. Also look at HBM's low memory clock and its much higher effective bandwidth than any GDDR DRAM clocked at 16 times HBM's rate, or 8 times HBM2's! AMD's Polaris hardware is doing more with its lower clocks relative to what Nvidia is doing with its much higher clocks. Nvidia is throwing more clocks at its hardware for a reason, and that reason is the lack of certain in-hardware resources compared to AMD's Polaris micro-arch.
Why don’t review site’s don’t
Why don't review sites do any 1920×1080 benchmarks anymore? People that have a 120Hz or 144Hz monitor are interested in those benchmarks!
And you guys should redesign the website for a 1920×1080 format; these benchmark images look super tiny and are irritating to read on a 1920×1080 screen. I know you can use the zoom option, but most of the benchmark graphs look blurry then! I wonder how 1440p and 4K monitor people are reading this review.
Steam hardware survey July 2016 http://i.imgur.com/nO2CfrI.png
http://store.steampowered.com/hwsurvey?platform=pc
37% of people on Steam still use 1080p screens
31% use 4K
26% are still using 1366×768
1.31% for 1440p!!!
So please consider 1080p benchmarks for the future.
31% use 4k????? I will bet
31% use 4K????? I will bet anybody that number is wrong. There are likely at least ten 1080p screens for every 4K one.
No he is correct for Feb
No, he is correct for Feb 2015 - July 2016. It includes multi-monitor setups, though.
http://store.steampowered.com/hwsurvey?platform=pc
More than twice as many of the video cards in use were Nvidia: 57% to 25%.
3 Intel processors to every 1 AMD: 77% to 23%.
48% on Win 10 and DX12. The top Vulkan card is the GTX 970 at 10%. Surprise. Guess Nvidia users aren't going to let a few fps and a lack of async support bother them.
I’m not doubting he read the
I'm not doubting he read the number correctly. I'm saying the number is not at all a representative sample of the population of Steam users, much less all gamers.
The GPU and CPU numbers sound accurate to me.
Why am I not in any way
Why am I not in any way surprised that you took a comment thread about common gaming screen resolutions and injected your own personal Nvidia fanboy jab?
All the complaining you do about AMD fans, and yet you’re the one trying to start things.
This is a comment section for
This is a comment section for the Nvidia Pascal Titan X; why are there dozens of AMD fanboys commenting here? Indeed. AMD fanboys started trashing the Titan X as they do with any Nvidia product. AMD fanboys are always starting things.
It is incorrect. They are
It is incorrect. They are mixing data from two different categories. Out of ALL MULTI-MONITOR users, 31% use 3840×1080. That's a completely different category from the single primary monitor category. I'm sure it would be 1-3% at best.
Yeah you can click on the
Yeah, you can click on the primary monitor category and it opens up with percentages. I'm guessing "Other" includes 4K at around 2%.
A lot of laptops are in use, apparently. A lot of resolutions under 1080p; 1366×768 is almost 27%.
I am guessing here, but Steam probably counts only a true 4K monitor as the primary display, and only if Windows is running at 4K resolution. A 4K monitor may be set lower, such as 1080p or 1440p, and multi-monitor setups would count as 1080p or 1440p primary displays. So 4K is underrepresented: a guy with two 1440p monitors or three 1080p monitors is still pushing about the same number of pixels as 4K, and while that doesn't count as a 4K primary display, it is still 4K's worth of pixels.
4K 144Hz is outrageously expensive; you can get a multi-monitor setup much cheaper.
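For what it's worth, the "same amount of pixels" arithmetic in that comment roughly checks out:

```python
# Total pixels pushed by common multi-monitor setups vs. a single 4K display
setups = {
    "1x 3840x2160 (4K)": 3840 * 2160,      # 8,294,400
    "2x 2560x1440":      2 * 2560 * 1440,  # 7,372,800
    "3x 1920x1080":      3 * 1920 * 1080,  # 6,220,800
}
for name, pixels in setups.items():
    print(f"{name}: {pixels:,} pixels")
```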
No one that buys a $1200
No one that buys a $1,200 video card is running at 1080p; at least, they shouldn't be.
I did use 1080p for the GTX 1070, GTX 1060 and RX 480 reviews.
Unless they’re using it for
Unless they're using it for VR, in which case it's 19×10 at 90 fps minimum. Elite Dangerous and ARK could use this card in VR 🙂
I am talking about 1920×1080
I am talking about 1920×1080 @ 120Hz & 144Hz. Both the GTX 1070 & GTX 1080 were dipping below 120 FPS in those 1080p benchmarks.
People that have a 1080p 120Hz or 144Hz monitor are interested in this video card. How could you not see this coming?
Yeah I game 2560x1080p with
Yeah, I game at 2560×1080 with my 970, but realize the Titan X is really for people with 2+ screens or 1440p-4K. Even the 1070 is almost overkill for 1080p. I'm probably going to grab a second 970 once they drop below $200, since that's more than enough for 1080p.
You are aiming for gaming on
You are aiming for gaming at 60 FPS; yes, I agree it's easy for the GTX 1070 & GTX 1080 to handle that. But I am talking about 120Hz and 144Hz monitors. Can you hit 120 FPS or 144 FPS with a GTX 1070 or GTX 1080 with everything on ULTRA settings at 1080p!????
And SLI is not that easy; most games don't support SLI, and when they do, all kinds of graphics bugs come with it. You are better off with a single card than with SLI.
OLD POST I know, but that’s
OLD POST, I know, but that's NOT talking about 4K monitors. It says "3840×1080", meaning there are two 1920×1080 monitors.
The AMD R9 card’s memory
The AMD R9 cards' memory clock is just 500 MHz?!?! O__o
On the Fury cards, yes.
On the Fury cards, yes.
That’s for HBM, and HBM gets
That's for HBM, and HBM gets 1024 bits of data channel per die (each die's 1024-bit allotment can be treated as smaller independent channels) for the HBM die stacks on the Fury cards.
So that's why all the HBM2 production is going to the server/HPC/workstation SKUs from AMD and Nvidia. It's 500 MHz for HBM and 1 GHz for HBM2, with far more effective bandwidth than any GDDR5/5X, which eats power clocked at 6-8 GHz! The HPC/server/workstation market will get most of the HBM2, with both AMD and Nvidia only using HBM2 in their flagship consumer SKUs. Those markets carry the largest margins, so both companies will reserve HBM2 mostly for that demand, and the power-conscious server market will eat up HBM2 for the power savings alone, not to mention the high effective bandwidth it provides.
Also, the JEDEC standard only defines what is needed for one HBM/HBM2 stack to function, so server/HPC/workstation accelerator SKUs could potentially be built with 6 or 8 HBM2 die stacks; that is up to the GPU makers that currently utilize HBM2. So even more HBM2 production will go toward the server/HPC/workstation side of the market, which can afford to pay the higher margins.
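The bandwidth arithmetic behind that comparison: HBM's 500 MHz clock is double-pumped, so each pin moves 1 Gbps, and the 4096-bit bus more than makes up for the low clock. A minimal sketch:

```python
# Effective bandwidth: bus width (bits) x per-pin data rate (Gbps) / 8
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gb_s(4096, 1.0))  # Fury X HBM (500 MHz DDR): 512.0 GB/s
print(bandwidth_gb_s(384, 10.0))  # Titan X (Pascal) GDDR5X:  480.0 GB/s
```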
Poor amd:( the only release
Poor AMD :( The only release this year is a ~$200 card and last year's 4GB (in need of 8GB) 28nm Fury. Maybe in spring/summer '17 they can get a 1080 competitor out? I dunno; they need something on the GPU side. I mean, Nvidia's running away with this thing.
Yes but the market for the
Yes, but the market for $200 cards is bigger than the market for $600-$1,000 cards, and the Titan X is getting some extra "Founders Edition"-style pricing from your pockets directly into JHH's large account! Nvidia is running away from some "thing", and that thing is lower pricing, as the Founders Edition pricing shows. Nvidia has FOUND a way to take more of your dollars and run to the bank to stash the cash!
Yeah but a good highend card
Yeah, but a good high-end card helps sell the lower-end ones.
Yeah and AMD is so benevolent
Yeah, and AMD is so benevolent as well. Charging $850+ for the Rx 9590 because it was the first 5 GHz processor, even though its performance was around that of a $340 Intel 4770K. LOL
How about the Fury X? It was $650 at release but couldn't match a 980 Ti at the same price; it matched up well with the 980 at $500. New tech with HBM, so it's still worth it, right? Wrong. Nvidia could have charged $800 at the time but didn't. AMD wanted to release the Fury X at $800-900, but Nvidia beat them to market and set the price. You can thank Nvidia for getting you your Fury X at a lower price.
It's called capitalism, and AMD does it too. AMD doesn't set prices low for your benefit or as charity; it's basically all the product is worth. They'd charge more if they thought they could get more. End of story.
Autocorrect changed my fx to
Autocorrect changed my fx to Rx, lol. But seriously, the same FX 9590 is available on Newegg for $190 now. AMD was overcharging heavily for it to begin with; that's a 75% price drop. Nvidia and Intel don't have cuts that deep. You still can't get an original Titan for even $250; even used, it's $500. Good resale values are the hallmark of a better or more desired product.
https://www.amazon.com/EVGA-GeForce-Dual-Link-Graphics-06G-P4-2790-KR/dp/B00BIUKH04
Or the hallmarks of salty,
Or the hallmarks of salty, stingy sellers.
Maybe in some cases people
Maybe in some cases people are greedy, but if said product remains unsold, usually the price will come down. Some money is better than none, especially if it's just going to be sitting around.
AMD usually has a fire sale on their new products months later. Why would anyone buy their cards at premiere when you know they will be about $50 or 25% cheaper a few months down the road, especially when Nvidia/Intel releases a new product or it isn't selling well, etc.?
Nvidia doesn't do many discounts except when their new line of cards comes out. I used to buy at that point, but what you're really getting is old tech that isn't as good as the new stuff. You're really taking a double step backwards; as new games come out supporting the tech on new cards, you'll start to regret your decision.
I wonder why the clock speed
I wonder why the clock speed is lower. If they wanted to go beastly, why not go all out and improve every area?
Well when oververclocked it’s
Well, when overclocked it's drawing 300W, so I think they kept the clocks there because of that.
more cores = less OC room.
More cores = less OC room and more power consumption. I kinda figured they would have to drop the clocks on the bigger chips.
Thank you for the review…
Thank you for the review…
And the shitshow that is the
And the shitshow that is the comments section of a graphics card review has begun.
This never happens with reviews of any other computer parts.
it’s just the passion
It's just the passion, man. Thanks to the no-show from AMD on the CPU side this decade, most comments are going to be on GPUs. On the other hand, the comment section here is quite nice compared to, say, WCCFTech.
I long for a CPU battle
I long for a CPU battle again. I may be in the minority, but I liked the CPU wars way more than the GPU wars. I remember the good old days, when you could "feel" the upgrade as soon as you booted into Windows…
They are still happening with
They are still happening with supercomputers, although it's more like x86 vs ARM vs SPARC vs GPU vs ASICs and DSPs, etc.
It's way more exciting now than it has ever been, too. There are actually 2.5D chips and silicon photonics that work now. Exascale computing is a new golden age.
Uh-huh ok..
Uh-huh ok..
It’s that gaming has a Floyd
It's that gaming has a Floyd R. Turbo problem, what with all the blood-and-gore games and the sports games attracting folks with the mindset that GPU companies are sports teams! Even the GPU makers' marketing departments are always pushing the team concept to sell their products. It's only going to get worse with the PC market shrinking every year, since the GPU makers can no longer rely on growth to take the edge off of market share fluctuations. The PC market is getting smaller year over year until it settles at the replacement level of sales, where new PCs/GPUs are purchased mostly to replace older systems.
So with more people using tablets and phones, and fewer using PCs/laptops, expect even more dogfights among the fanboys and the hired astroturfers/marketing that egg them on. There is also plenty of change happening with the new graphics APIs coming online, offering games developers a way to manage any make or model of GPU plugged into a PC, and ditto for benchmarking software, which needs to work with the latest APIs and games to properly test new GPU hardware. Things are about to get even worse for the next year, until the testing/benchmarking software is fully optimized for the new APIs, software, and hardware.
Add VR gaming into the mix and the fighting cats are going to be thrown in with the fighting dogs, along with Peter and that chicken going at each other; it looks like a 100% chance of fisticuffs for at least the next year and a half. More dog versus dog, cat versus cat, cat versus dog, and one big fat bloke versus one freakishly large chicken trading licks tit for tat, with no end in sight.
You guys are missing the
You guys are missing the whole point of this card.
You need this kind of horsepower, and possibly even more, for playing graphically dense games in high-end VR (e.g. Vive, Rift) at ultra quality/oversampling without frame drops.
Nvidia seems to be pushing on
Nvidia seems to be pushing the limits of Pascal at this point. They've shown all their cards until Volta; a 1080 Ti with fewer cores could clock higher and do a little better than this Titan X, but not by much.
It's now a question of what Vega will bring with HBM2 and its new architecture. If it doesn't clock higher than what we saw with the 480, I doubt it will beat a Titan X. It's going to come down to how many cores they can fit in there; 6,144 would be needed to reach the magical 14 TFLOPS number at 1200 MHz. Then they have to convert that to actual performance, which would be aided massively by DX12 and Vulkan, which should be in more AAA games by then. Anything less, and it will come down to price/perf.
Which, ultimately, is all that matters. A Titan X for $1,200 makes no difference to me if I can get 80-90% of the performance for $550. Sure, no performance crown, but this is a business and we are consumers. None of us is competing for a gold medal here. Make money, get the most for your money; that's all there is.
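The commenter's core-count math checks out, assuming the same 2 FMA ops per clock per core as current GCN and Pascal parts (the 6,144-core, 1200 MHz Vega configuration is their hypothetical, not an announced spec):

```python
# Hypothetical Vega configuration from the comment above (not an announced spec)
cores, clock_mhz = 6144, 1200
print(cores * 2 * clock_mhz * 1e6 / 1e12)  # ~14.7 TFLOPS
```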
Double post – sorry!
Double post – sorry!
Actually, Nvidia hasn’t shown
Actually, Nvidia hasn't shown all their cards. Case in point: the Tesla P100 has HBM2 rather than GDDR5X (and thus a different memory controller), yet uses a very similar Pascal GPU (without the two disabled SMs). I really wish they had done that with the Titan X, but then it wouldn't be $1,200 anymore… lol.
Amd is unable to compete at
AMD is unable to compete at the high end; a month before the Fury X dropped, Nvidia rolled out the 980 Ti, which, when overclocked, left the Fury X null and void. I am a big AMD homer and just hope they can roll out something in the $500 range that can compete with the 1080, even if it's a dual-GPU card. What say you, AMD?
$1,200 for a harvested
$1,200 for a harvested 471mm² die, LOL.
And what is this fake story about it being a bet between Huang and an engineer? Talk about blowing smoke up our asses.
The bet was for a dollar. I
The bet was for a dollar. I seriously doubt it was that big of a deal, but it was fun to use for marketing. I wouldn't worry too much about whether or not it really happened.
The fact remains, however, that they crammed a lot of transistors into a 16nm FinFET die, and the teraflops it produces in a marketable consumer product was the point of the bet.
Seems to me it would be something an engineer would say to a CEO who is close to the tech, but not close enough to really know exactly what they can achieve in a given project.
C'mon Ryan, beat the world record 😀
http://www.evga.com/articles/01042/evga-hardware-breaks-world-records/
Very rarely do I see trolls
Very rarely do I see trolls come out for a motherboard, RAM, or PSU review. GPUs and CPUs, though? It's like a US political debate in 2016 (wait, that's much worse).
Not that much worse, no.
Not that much worse, no.
I wonder if GPU reviews will
I wonder if GPU reviews will lose the endless trolls commenting if/when AMD goes out of business. That, or we’ll start hearing about how Intel’s 2022 IGP will crush NVIDIA. ;D
AMD leading the pack with
AMD leading the pack with those HIGH frame times, as always….. What gives with that shtuff…?! It is literally double the frame time compared to Nvidia. I have typical frame times of about 6-15 ms (depending on the game), and that is with quad OG TITANs.