A powerful architecture
NVIDIA announced the TITAN Z in March and finally put it on sale in late May. We got our hands on one and put it through its paces to see how a $3000 graphics card performs.
In March of this year, NVIDIA announced the GeForce GTX Titan Z at its GPU Technology Conference. It was touted as the world's fastest graphics card thanks to its pair of full GK110 GPUs, but it came with an equally stunning price of $2999. NVIDIA claimed it would be available by the end of April for gamers and CUDA developers to purchase, but the release slipped slightly, and the card finally went on sale at the very end of May for the promised price of $2999.
The specifications of the GTX Titan Z are damned impressive: 5,760 CUDA cores, 12GB of total graphics memory, and 8.1 TFLOPS of peak compute performance. But something happened between the announcement and the product release that perhaps NVIDIA hadn't accounted for. AMD's Radeon R9 295X2, a dual-GPU card with full-speed Hawaii chips on-board, was released at $1499. AMD took some chances NVIDIA clearly didn't expect, including going the route of a self-contained water cooler and blowing past the PCI Express recommended power limits to offer a ~500 watt graphics card. The R9 295X2 was damned fast, and I think it caught NVIDIA a bit off-guard.
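For the record, that 8.1 TFLOPS figure isn't marketing magic; it falls straight out of the shader count and the 705 MHz base clock, since each CUDA core can retire one fused multiply-add (two floating point operations) per clock. A quick back-of-envelope check:

$$5760 \ \text{cores} \times 2 \ \tfrac{\text{FLOPS}}{\text{clock}} \times 705 \ \text{MHz} \approx 8.1 \ \text{TFLOPS}$$

Any GPU Boost headroom above the base clock pushes the real-world peak slightly higher.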
As a result of that surprise, the GeForce GTX Titan Z release was a bit quieter than most of us expected. Yes, the Titan Black card was released without sampling the gaming media, but that card was nearly a mirror of the GeForce GTX 780 Ti, just with a larger frame buffer, and the performance of that GPU was well known. For NVIDIA to release a flagship dual-GPU graphics card, admittedly the most expensive one I have ever seen with the GeForce brand on it, and NOT send out samples, was telling.
NVIDIA is adamant though that the primary target of the Titan Z is not just gamers but the CUDA developer that needs the most performance possible in as small a space as possible. For that specific user, one that doesn't quite have the budget to invest in a lot of Tesla hardware but wants to be able to develop and use CUDA applications with a significant amount of horsepower, the Titan Z fits the bill perfectly.
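From that developer's point of view, the Titan Z should simply show up as two CUDA devices behind the card's on-board PCIe bridge, so existing multi-GPU code picks it up unchanged. A minimal sketch of what enumeration looks like (the printed values are illustrative, not measured on our sample):

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    // A single Titan Z should enumerate as two separate GK110 devices.
    int count = 0;
    cudaGetDeviceCount(&count);

    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // Each GPU owns half of the card's 12GB of GDDR5.
        printf("Device %d: %s, %d SMs, %.1f GB\n",
               i, prop.name, prop.multiProcessorCount,
               prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}
```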
Still, the company was touting the Titan Z as "offering supercomputer class performance to enthusiast gamers" and telling gamers in launch videos that the Titan Z is the "fastest graphics card ever built" and that it was "built for gamers." So, our interest piqued, we decided to review the GeForce GTX Titan Z.
The GeForce GTX TITAN Z Graphics Card
Cost and performance notwithstanding, the GeForce GTX Titan Z is an absolutely stunning looking graphics card. The industrial design that started with the GeForce GTX 690 (the last dual-GPU card NVIDIA released) and continued with the GTX 780 and Titan family lives on with the Titan Z.
The all-metal finish looks good and stands up to abuse, keeping the PCB straight even with the heft of the heatsink. There is only a single fan on the Titan Z, center mounted, with a large heatsink covering the two GPUs on either side of it. The GeForce logo up top illuminates, as we have seen on all similar designs, which adds a nice touch.
Titan Z cards will take up three slots in your system, which means that many mini-ITX cases will not be able to accept the Titan Z. The R9 295X2 has a similar problem with its length and, of course, the added radiator. The NVIDIA Titan Z will likely fit in cases built for micro-ATX motherboards without issue, though.
The rear of the card has a very thick and strong back plate that serves both as a heatsink and as a design element. It protects the back side components while adding strength to the unit for installation and even shipping.
Output options on the Titan Z are identical to other reference NVIDIA designs and include a pair of dual-link DVI connections, a full-size HDMI port, and a full-size DisplayPort. There is plenty of room for air exhaust as well, which is important for moving as much hot air OUT of the chassis as possible.
I do think it's odd that NVIDIA chose to continue with just a single DP connection as this limits the support for 4K panels to one per Titan Z. If you were planning on picking up a set of three of the new ASUS PB287Q monitors for example, you need one NVIDIA card per monitor!
With a TDP of 375 watts, well under that of the R9 295X2, the NVIDIA GeForce GTX Titan Z requires a pair of 8-pin power connections. Now, the R9 295X2 also required only a pair of 8-pin connections, but it drastically exceeded the recommended PCIe power draw specification.
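The connector math explains that 375 watt figure: the PCI Express slot itself is rated to deliver 75 watts and each 8-pin connector 150 watts, so the Titan Z sits exactly at the specification's ceiling:

$$75 \ \text{W (slot)} + 2 \times 150 \ \text{W (8-pin)} = 375 \ \text{W}$$

The ~500 watt R9 295X2 pulls far more through those same connectors than they are rated for.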
Hey look, an SLI connection! That's right, for those of you that are feeling REALLY froggy, you can get a pair of Titan Z cards and run them in parallel, giving you four GK110 GPUs worth of performance!
I have heard people calling the Titan Z a 2.5 slot card, and from this image you can see where that line comes from. The cooler itself only takes up about 2.5 slots worth of space, but with the bracket, you are going to be using up all three slots.
Also worth noting is that the back plate on the Titan Z is thicker than others we have seen, and it actually caused the card to make contact with the very bottom of the IO panel on the ASUS Rampage IV Extreme X79 motherboard. It did not cause any specific issues, but it is worth noting.
Under that shroud you'll find that single fan and a large array of heatsink fins that are more than capable of keeping the two GK110 GPUs cool during gaming.
Without its clothes, the Titan Z shows the immense technical complexity that mashing a pair of huge GK110 GPUs onto a single card requires. You have a PCIe bridge chip in the middle, power management hardware resting between the two GPUs, and 12GB of GDDR5 memory running at 7.0 GHz.
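Each GK110 talks to its own 6GB half of that memory pool over a 384-bit bus, so per-GPU bandwidth works out the same as on the Titan Black:

$$7.0 \ \tfrac{\text{Gbps}}{\text{pin}} \times \frac{384 \ \text{bits}}{8 \ \text{bits/byte}} = 336 \ \tfrac{\text{GB}}{\text{s}} \ \text{per GPU}$$

or 672 GB/s in aggregate across the card.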
Now, let's see how it all stacks up to a pair of GeForce GTX 780 Ti cards in SLI and an AMD Radeon R9 295X2!
Great article. I see nvidia trolls incoming 😀
No man, I am a dedicated nvidia buyer and even I can see that this card is a huge rip-off. At the price of this card you could install two Titan Blacks in your PC and get higher performance while saving money.
Nonsense! This thing is only about $600-700 new now, and it wipes the floor with any Titan, Titan Black, or Titan Z. The newer nVidia cards are probably better, and I am satisfied with my GTX 970. But I'm curious to see what AMD is going to pull with their upcoming 300 series.
I was there when this was benched!!
I can't believe how much faster the 295X2 is vs. not only the Titan Z but even 780 Ti SLI. Good job AMD. Now drop the price on the 295X2 a bit so normal people can afford them. 😛
Have to say the Titan Z looks amazing with the shroud off. Lovely heatsink design.
I would love a price drop, but they never will until Nvidia do something. If you had a product that was better than your only competitor's and at half the price, you would keep it there knowing it would sell as long as you could. AMD are still a company, not as price-gougey as NVIDIA, but they still need to turn a profit 🙂
This card makes nonsense of NVIDIA's commitment to PC gaming.
If this video card is for computing, why not put it into the Quadro brand? Is that something hard to do?
I hope that PCPer publishes some benchmarks against Quadro cards.
Because Quadro implies a lot more than just high compute performance, at least from the driver side of things. Things like quad-buffered stereoscopic 3D in OpenGL professional applications, strict driver certification, and so forth. This is much closer to the Tesla line, except that Tesla implies no video outputs. The GeForce line is a way to get it out to the masses without making it a full-fledged Quadro part.
I can see why NVIDIA put it under their gaming brand, but that doesn't mean I think it is a good purchase, especially for gamers, when an equivalent number of Titan Blacks (or 780 Tis, if full-speed 64-bit computation isn't required) is so much cheaper. You are paying a pretty big premium for the design and build quality of the Titan Z.
Though Quadro does imply the higher compute and a high level of support for professional development apps, doesn't that kind of reinforce the fact that the entire Titan lineup has never been "Gamer Friendly"?
Again, I think the Titan lineup should be branded Quadro or Tesla, as those are the target audiences. NVIDIA may sell a fair number of these Titan cards, but it is somewhat of a slap in the face to call these "Gamer Cards" as NVIDIA does.
Can't be called Quadro without the driver support and certification that cost big bucks to develop and maintain.
The Quadro brand is out of the question; these GPUs are mostly for number crunchers that will use them round the clock and will pay the higher cost. Due to the Titan Z's lower power usage, the Z still costs less than getting a Quadro and not using the expensive drivers, and the stock traders and scientific users don't need the certified-for-professional-graphics drivers. 10 or more of these things running in a stock trading boiler room, crunching away on stock derivatives and millisecond stock trading, will save enough in power to pay back the extra initial cost, and it beats having to buy the even higher priced Quadros. They can be used for gaming with gaming drivers, and they do use less power; it just takes a single gamer longer to recoup the extra cost in power savings, compared to a scientific user that may have 100 or more of these running 24/7 in a cluster.
What Nvidia sells to gamers, even their standard gaming GPUs, is mostly paid for by the business and scientific customers who buy Nvidia's GPUs for GPGPU accelerators and HPC/supercomputing, and who pay the higher prices that actually fund the R&D that winds up in the consumer gaming SKUs. The same goes for the enthusiast CPUs that Intel sells. If gamers had to pay all of the R&D costs that the professional and business users cover, without any other markets subsidizing the R&D, then most gamers would not be able to afford most mid-range GPUs by today's standards.
Uncle Sam spends uber buttloads of cash funding supercomputers, and Nvidia's R&D budget is funded indirectly, and sometimes directly, by these gigabucks projects, whose technology R&D filters down to the consumer divisions and into the consumer products. It has always been thus for Nvidia, Intel, AMD, IBM... Gamers may let the marketing convince them that it is all about gaming, but the reality is that technology filters down to the consumer market, even more so for high tech than other products.
Slap in the face? Let gamers pay the real costs of R&D, instead of government, scientific, and business users, then see what a real slap is. The drug companies will buy Titan Zs by the truckload for their protein folding clusters; of course they'll get a bulk discount. The Titan Z will come down in price when the next generation of GPUs hits the market.
Quadro is a line of products for 3D design; that is its origin, not the GPGPU market.
For that, the Tesla cards exist.
And then there are the GeForce cards, nvidia's 3D gaming and "all-for-one" solution.
The Titan brand is a hybrid between the GeForce and Tesla cards; the Titan cards don't have the driver optimizations to accelerate special functions for professional 3D applications.
So the suggestion that the Titan should be included in the Quadro line is nonsense.
It's more a Tesla card, but not completely: its driver is a GeForce one without the optimizations, and its GPGPU capabilities are almost the same as the Tesla cards', BUT this card disables some advanced features related to running in networks of Tesla cards (it is more a solution for a workstation, not for building supercomputers or networks of GPGPU systems).
It is madness to buy one of these only for gaming, but people are sometimes very crazy.
Quadro is the professional drivers, and not so much of a hardware difference; maybe a slightly different BIOS and some clock tweaking to minimize bit errors in the rendering, but not much difference in the hardware. That, and the millions of extra man (or woman) hours that go into getting the Quadro brand's graphics drivers certified to work as flawlessly as possible with the major third-party graphics software applications. The Titan Z is not too different from the GPUs utilized in the Quadro line; maybe less clock tweaking and BIOS quality control. The Quadros probably have better power distribution/filtering circuitry and higher-end capacitors for more error-free operation, and extra error correction for the GDDR5 memory. The Titan Z is engineered for a different market than just the gaming market, but if some gamers have the cash, why not try and sell some there also; those scientific users and stock traders will certainly spend the bucks, and recoup the higher initial cost in power bill savings.
With Quadro/FirePro/WhateverPro, it's the graphics driver and graphics driver maintenance guarantee that you are buying; it's like a service contract for software. It costs money to pay the software engineers, millions of dollars are at stake, and people's jobs (graphics professionals') are made or broken on deadlines, serious deadlines; the show must go on, trade show or whatever else needs the graphics.
finally, someone that understands that titan is nvidia's cheap version of tesla instead of quadro. i've seen many people compare titan with quadro, saying titan (well, the original one) is a good card for 3d artists without the need to spend a couple of thousand dollars on quadro, and in the end the same people did not understand why nvidia would want to undercut their own Quadro lineup with Titan. the truth is it doesn't matter how good titan's price/performance is compared to quadro; for people/companies that need the certified driver for their pro application, titan was never a choice. titan is for starter developers interested in developing their applications using CUDA. once their software is fully developed they will substitute titan with tesla and have full pro support from nvidia.
No, Titan Z is not just for developers (CUDA or otherwise), and it's not for pro 3D graphics artists that do not want to starve and need the certified graphics drivers. Some of those big ad graphics take hours to run on multiple Quadros, and a few dropped pixels will ruin a render and show up big time on a large tradeshow graphic or ad. The Quadro brand is not about the hardware; it is about the professional graphics drivers, which cost as much as the hardware to develop and maintain. Titan Z, note the "Z", as that is the indicator of the different branding: it can be used for gaming, but it's out there more for the scientific users and stock traders, whose jobs and livelihoods do not depend on spotless rendering. You cannot Photoshop or Gimp an artifact out of a 20-hour render with fancy AA and AO, ray tracing, and reflections inside of reflections; you have to rerun the render, so the Quadros and FirePros are worth every red cent for those costly drivers, that and ECC memory. Titan Z is for number crunching and lower-power 24/7 use; it can be used for gaming with installed gaming drivers, but it is for a different market segment than pro graphics or just gaming alone. They are not undercutting Quadro; Quadro is the pro drivers! Titan Z is for scientists and stock traders and such; gaming is an afterthought for the Titan Z, if you can afford the bucks up front. If you are running 10 or more of these Titan Z cards, the power savings will pay for the extra cost over their usable lifetime at 24/7 use.
Pro graphics users work on deadlines, just like news reporters; miss too many deadlines because of a bad render, and find another way to make a living PDQ!
I don't know much about professional GPUs, but the pro cards also have ECC memory, don't they?
I think the entire Titan lineup should have been branded Quadros or Teslas and be done with it, since they really are intended for CUDA development/professional development.
NVIDIA could/should? release a GTX 790 with two GTX 780 Tis (GK110, 3GB each, etc.) at a $1500 price point that is really targeted at gamers.
With all the caveats to the Titan Z, I can't help but be impressed with the engineering of this card. A single-fan cooling config and quiet performance with two GPU cores makes me as impressed as I was seeing all those memory lanes on a single PCB with the 295X2.
I suppose if you're spending this amount of money on a PC, what's an extra £150 or so for a full-cover waterblock.
Incomplete review, Ryan.
You left out the temperature comparison and cold vs. hot benchmark runs. You emphasized them in your last few video card reviews. Curious why you left them out this time.
The R9 295X2 does not suffer from the hot/cold issue at all. This was addressed in our launch review of that product.
In the video, comparing the 780 Ti SLI and Titan Z, you mention:
"When the GPUs are heavily loaded the clocks have to come down some."
Is throttling an issue with prolonged loading?
Also in the video you point out that heat is one of the 295X2's flaws.
You didn't provide any data for either temp comparisons or clock behavior.
I have some data here on the Titan Z. I'm going to try playing with some overclocking options on the card and will make a post tomorrow.
It'd be interesting to know how much cooler the Titan Z operates than the 295X2, given its flaw. Most sites report the 295X2 under 70C in FurMark, so the Titan Z must run really cool.
Can't wait to see those numbers.
Temperatures at load, non-overclocked:
295X2 = 65°C
Titan Z = 82°C
What the …. ?
Ryan said the 295X2's flaw was its temperature. How can the Titan Z be running hotter?
I hope he clarifies this with data, because it's not good if the 295X2 runs cooler than the Titan Z and he is calling it a flaw.
Heat output and temperatures can vary, I guess. However, it is unfortunate that even when AMD has a superior product, they still won't sell as many as nvidia does at 2x the price. Incompetence or just bad luck?
AMD couldn't keep the 295 cool on air, so they latched a water cooler on, meanwhile also being able to clock it higher. If the Titan Z were water cooled, we would have another story.
Wouldn't 2 Titan Blacks be faster than the 780 Tis and maybe the Titan Z in gaming?
Thanks for no help. I found out that 2 Titan Blacks in SLI are faster than 1 Titan Z.
For a $3000 card (costing twice as much as the R9 295X2), it loses to the R9 295X2 in most benchmarks? I can say the GTX TITAN Z is a failure as a gaming card, but as a workstation card? I hope PC Perspective will try to benchmark this against $3000 Quadro cards in 3D applications.
this article lacks overclocking results for the Titan Z, but hey, still good to finally see some benchmarks. Nvidia handled the defeat of the Titan Z poorly; they could have just supplied samples instead of this idiotic reaction they had.
i hope they bring a 790 out, as ryan said, without double precision but at a cut price, so that AMD drops too and we have some affordable dual-GPU cards this season.
I don't understand Nvidia's mentality with this GPU.
It's a nice card but, compared with the 295X2, it's not as good.
It also suffers from stuttering whereas AMD's GPU does not; how the tables have turned.
So AMD's solution is just better, and yet Nvidia want to charge twice as much for this failed solution?
Do they think we are all dumb?
I would compare this Titan Z to 2x 780 Ti and 1 Tesla K40. Do you know what the peak double precision and peak single precision floating point TFLOPS are on the Z? Is the GDDR5 capable of ECC? If this thing beats out a Tesla K40, then $3000 is much cheaper than a $4000 K40 that is capable of no gaming, or a K6000 that can only game as much as a 780.
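For reference, GK110 in the Titan line runs double precision at one third of the single precision rate (when the full-speed FP64 mode is enabled in the driver), so the Titan Z's peak should work out to roughly:

$$8.1 \ \text{TFLOPS FP32} \times \tfrac{1}{3} \approx 2.7 \ \text{TFLOPS FP64}$$

comfortably above the Tesla K40's ~1.4 TFLOPS FP64 on paper, though unlike Tesla, the Titan's GDDR5 does not offer ECC.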
It's a shame you didn't do any 780 Ti SLI overclocking to compare.
Boost both to at least 1 GHz clock speeds (making them equal to the 295X2) and they will easily surpass the 295X2 at all resolutions, unless VRAM really becomes a limiting factor.
google this: 780 ti sli vs 295×2
simple fact is 780Ti SLI doesn't beat it.
I'm always at a loss for the "overclock them and they will win" statements. no Nv card runs in-game at its base clock. they all boost… all of them. so this isn't 780Ti SLI @ 876MHz vs 295X2 @ 1018MHz; more likely 780Ti SLI @ 1000MHz vs 295X2 @ 1018MHz.
and for all the "295X2 is hot as hell" folks… the 780Ti SLI will be 10C hotter at those clocks.
TLDR: the cards (290X/780Ti) are clock-for-clock equals and the 295X2 is the fastest single graphics card…
Nope, clock for clock the 780 Ti is BETTER than the 290X. In fact, even at stock the 780 Ti is better than the 290X at stock, with lower clocks.
Agreed. Increasing clocks on the 780 Ti yields a massive increase in performance when compared to a similarly overclocked 290X.
Most benchmarks I've seen show the 780 Ti SLI beating the 295X2 at every res in most games, except 4K.
The boost clock is only 926MHz; add another 100 or get 2 of the GHz editions to make it a fairer comparison, and 780 Ti SLI is a guaranteed win.
And let's not ignore the fact that you can overclock the 780 Tis even further, to 1200MHz stable gaming on air; you can barely overclock the 295X2, as has been proven in all the other reviews.
I’d like to see the benchmarks run after the cards are warmed and the clocks have settled. nVidia cards can maintain pretty high boosts for a minute or two, but tend to drop off pretty substantially after that until the temps settle.
I'm hoping Nvidia releases a GTX 790.
The vast majority of gamers aren't interested in double precision.
now this would be sweet. but at what price? $1500 and it's awesomeness for days, and it would be THE card for mITX builds, assuming 2-slot compatibility.
i wish we could all stop arguing about the titan line. it is true that there are people who would benefit from this in-betweener, and that it would be more suited under, for instance, the tesla line.
but nvidia made the choice to put this sub-line under its gaming line and actively market it at gamers, which means that this is how it should be judged.
A couple of Z’s with SLI and that basically kisses goodbye to the use of any of your other slots, or am I missing something?
2 Titan Blacks for the price of 3 is the only way to describe the Titan Z.
Actually, 2 Titan Blacks in SLI will OC better and run cooler.
You would have to be an idiot to get this.
While the DirectX 11 performance comparison is appreciated, I think we would also like to see more OpenGL comparisons for balance's sake. AMD has been known not to support OpenGL standards well, and a consumer would want to know that a card will run the games they wish to play. This is also important for DirectX 9 games, which many of the most popular games still use.
When testing games like Crysis 3 and others, could you also post the settings so we know if any NVidia-specific features such as tessellation or PhysX are enabled?
I do not favor one brand over another, but currently choose NVidia cards because they support features in the games I want to play while AMD does not at this time.
On the other hand, if I wanted a card for mining purposes I would definitely choose an AMD card, so maybe you should cover that kind of performance too. I am sure it is VERY important to some consumers to know that AMD is the CLEAR winner in that category of use in consumer grade cards.
I appreciate your testing methods, but they could still use some improvements by covering a wider range of uses and features each manufacturer provides.
When you test games like Skyrim, you should perhaps mod the game (with the top graphical mods) the way most users still playing it would, so you can ascertain performance in that scenario rather than in the vanilla game, which few if any are still playing in that form.
Well, still showing bias regardless. Microsoft stopped giving DX9 library support about 3 years ago, and fewer than 10% of games released use DX9, because by the time you push DX9 to its limits you will be far more CPU-bound than GPU-bound thanks to the terrible overhead this API has. There are a couple of games that have a DX9 fallback path, but it's only useful if you don't have DX11/DX10-capable hardware or don't have the performance to run them; there is no point caring much about DX9 because today's hardware is fast enough to run any DX9 game. Heck, I was even able to max STALKER with the Complete mod, which uses DX9 and is one of the most demanding DX9 applications ever, and it ran over 65fps most of the time on my GTX 560M. AMD with the latest driver supports OpenGL 4.4, which is no problem; they are slower to adopt OpenGL than nVidia, but just as capable. Crysis 3 has tessellation on by default and you can even turn it off manually if you want, but there is no point disabling it, especially when the game is playable on most high-end hardware with no issues. PhysX is a gimmick which will eventually die, as only one game or two that use it are released every year, or every two years maybe. I love the technology, but seeing how underutilized it went, I ended up selling the secondary GTX 650 that I had for PhysX processing.
I think they would have gone with a Palit or EVGA 780 Ti.
And Titan Z partner companies will OC it soon,
'cause the 295X2 is overclocked.