GK110 Makes Its Way to Gamers
NVIDIA is launching the GeForce GTX TITAN based on GK110 this week and we have all the details for you!
Our NVIDIA GeForce GTX TITAN Coverage Schedule:
- Tuesday, February 19 @ 9am ET: GeForce GTX TITAN Features Preview
- Thursday, February 21 @ 9am ET: GeForce GTX TITAN Benchmarks and Review
- Thursday, February 21 @ 2pm ET: PC Perspective Live! GTX TITAN Stream
Back in May of 2012, NVIDIA released information on GK110, a new GPU the company was targeting at the HPC (high performance computing) and GPGPU markets, which are eager for more processing power. Almost immediately the questions began about when we might see the GK110 part make its way to consumers and gamers, in addition to finding a home in supercomputers like Cray's Titan system, capable of 17.59 petaflops.
Watch this same video on our YouTube channel
Nine months later we finally have an answer – the GeForce GTX TITAN is a consumer graphics card built around the GK110 GPU. With 2,688 CUDA cores, 7.1 billion transistors, and a die size of 551 mm^2, the GTX TITAN is a big step forward (both in performance and physical size).
From a pure specifications standpoint, the GeForce GTX TITAN based on GK110 is a powerhouse. While the full GPU sports a total of 15 SMX units, TITAN has 14 of them enabled for a total of 2,688 shaders and 224 texture units. Clock speeds on TITAN are a bit lower than on GK104, with a base clock of 836 MHz and a Boost Clock of 876 MHz. As we will show you later in this article, though, the GPU Boost technology has been updated and changed quite a bit from what we first saw with the GTX 680.
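For a rough sense of where those specifications put TITAN, you can estimate theoretical peak single precision throughput from core count and clock alone. This is a back-of-envelope sketch, not a benchmark; the GTX 680 figures (1,536 cores at a 1,006 MHz base clock) are the published GK104 specs, and real performance depends on boost behavior and workload:

```python
# Theoretical peak single-precision throughput:
# CUDA cores x 2 ops per clock (one fused multiply-add) x clock rate.
def peak_sp_gflops(cuda_cores, clock_mhz):
    return cuda_cores * 2 * clock_mhz / 1000.0

titan_base = peak_sp_gflops(2688, 836)    # ~4494 GFLOPS at base clock
gtx680_base = peak_sp_gflops(1536, 1006)  # ~3090 GFLOPS at base clock
```

At base clocks that works out to roughly a 45% advantage for TITAN over the GTX 680, before GPU Boost enters the picture.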
The bump in memory bus width is also key: feeding that many CUDA cores definitely required a boost from 256-bit to 384-bit, a 50% increase. Even better, the memory is still running at 6.0 GHz effective, resulting in total memory bandwidth of 288.4 GB/s.
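The bandwidth math is simple to check yourself: bus width in bytes times effective transfer rate. Note that the quoted 288.4 GB/s implies the exact effective rate is 6.008 GT/s rather than a flat 6.0, which is the figure this sketch assumes:

```python
# Memory bandwidth = bus width in bytes x effective transfer rate (GT/s).
def mem_bandwidth_gbs(bus_width_bits, rate_gtps):
    return bus_width_bits / 8.0 * rate_gtps

titan = mem_bandwidth_gbs(384, 6.008)   # ~288.4 GB/s
gtx680 = mem_bandwidth_gbs(256, 6.008)  # ~192.3 GB/s
```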
Speaking of memory – this card will ship with 6GB on-board. Yes, 6 GeeBees!! That is twice as much as AMD's Radeon HD 7970 and three times as much as NVIDIA's own GeForce GTX 680 card. This is without a doubt a nod to the super-computing capabilities of the GPU and the GPGPU functionality that NVIDIA is enabling with the double precision aspects of GK110.
These are the very same single precision CUDA cores we know from the GK104 part, so you can guess at performance based on clocks and core counts – and you'll have to do just that for a couple more days still. (NVIDIA is asking us to hold off on our benchmark results until Thursday, so be sure to check back then!)
While the GeForce GTX 680 (and the family of GK104/106/107 GPUs) was built for single precision computing, GK110 was truly built with both single and double precision computing in mind. That is why the die size and transistor count are so much higher than GK104's: the double precision units that give TITAN its capability in GPGPU workloads are absent in the GK104 part. And while most games today don't take advantage of double precision workloads, we cannot discount the potential for future GPGPU applications and what GK110 would offer over GK104.
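The double precision gap is where GK110 really separates itself: it executes DP at 1/3 its single precision rate, while GK104 is limited to 1/24. A rough base-clock sketch of the theoretical ceilings (in practice the shipping TITAN lowers its clocks when the full DP rate is enabled, so NVIDIA's quoted figure is closer to 1.3 TFLOPS):

```python
# Peak double-precision throughput as a fraction of the SP rate:
# GK110 runs DP at 1/3 of SP; GK104 is capped at 1/24 of SP.
def peak_dp_gflops(cuda_cores, clock_mhz, dp_to_sp_ratio):
    return cuda_cores * 2 * clock_mhz / 1000.0 * dp_to_sp_ratio

titan_dp = peak_dp_gflops(2688, 836, 1.0 / 3.0)     # ~1498 GFLOPS
gtx680_dp = peak_dp_gflops(1536, 1006, 1.0 / 24.0)  # ~129 GFLOPS
```

That order-of-magnitude difference is exactly what makes TITAN interesting for GPGPU work in a way the GTX 680 never was.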
I asked our very own Josh Walrath for some feedback on the GK110 GPU release we are seeing today, in particular how it relates to process technology. Here was his response:
The GK110 is based on TSMC's 28 nm HKMG process. This is the same process used for the other Kepler based products that are out today. The 28 nm process first saw the light of day in graphics with the AMD HD 7970 back in December 2011. NVIDIA followed some months later with the GK104 based GTX 680. 28 nm has been a boon to the graphics market, but it seems that we are at a bit of a standstill at the moment. There have been few basic improvements to the process since its introduction in terms of power and switching speed, though obviously yields have improved dramatically over that time. This has left AMD and NVIDIA in a bit of a bind. With no easy updates due to process improvements, both companies are stuck with thermal and power limits that have remained essentially unchanged for well over a year.
The GK110 is a very large chip at 7.1 billion transistors. It is likely approaching the maximum reticle limit for lithography, and it would make little sense to try to make a larger chip. So it seems that GK110 will be the flagship single GPU for quite some time. Happily for NVIDIA, they have made some interesting design decisions about the double precision units, keeping them from affecting TDPs when running single precision applications and games (which are wholly single precision). There is a slight chance for these products to move to a 28 nm HKMG/FD-SOI process, which would improve both leakage and transistor switching properties. Apparently the switch would be relatively painless, but FD-SOI has not gone into full production at either TSMC or GLOBALFOUNDRIES.
We still have quite a bit more to share with you on the next pages though including details on GPU Boost 2.0, new overclocking and overVOLTING options for enthusiasts and even a new feature to enable display overclocking!
Yeah, but will it run Crysis?
We'll find out on Thursday. 🙂
no
Actually, the better question is: will it blend?
Crysis 3, more like. 🙂 Can’t wait to see the performance results!
Might the GTX Titan review be the first to use your new frame-rating performance metric? Can’t wait! 🙂
I can tell you right now that it won't be the full version of our Frame Rating system, but we'll have some new inclusions in it for sure.
I am really looking forward to seeing how this card will perform with Crysis 3
I think I’ll get one. I have no need for it, but it looks cool. So, from someone who knows very little about computers or gaming, I take it this would be “overkill” to have my buddy build me a new system around this?
Hope to see some GPGPU testing and comparison to the 680. I can assist with Blender3D as it supports GPGPU rendering on CUDA cards if you guys want.. Need to find out if I need to start putting coins away for another purchase 😉
Point me in the right direction!
+1 for blender render times! Go to blender.org to download blender (looks like it is down as I post this, probably for v2.66 release).
blendswap.com could be used to find stuff to render. That way the files would be publicly available to anyone who wanted to compare your times to their own. Just make sure the files use the cycles rendering engine (not blender internal).
Instructions at http://www.blenderguru.com/4-easy-ways-to-speed-up-cycles/ can be followed to swap between CPU and GPU rendering.
I’m looking to replace three GTX 580 3GB cards, mainly for rendering. With 6GB of memory and the massive number of CUDA cores I’m really looking forward to Blender Cycles speeds, but also Octane and VRay-RT. This must be one of the usage scenarios for the Titan, rather than pure gaming.
Download blender from:
http://www.blender.org/download/get-blender/
Windows 64bit install is the simplest.
The most common test for GPGPU rendering is to use the link below. “The unofficial Blender Cycles benchmark”
http://blenderartists.org/forum/showthread.php?239480-2-61-Cycles-render-benchmark
And then use the information provided by Riley:
http://www.blenderguru.com/4-easy-ways-to-speed-up-cycles/
*shrug* I guess it's fun for the reviewer to get something new in the office, but this might as well still be a myth as 99.99% of users will never even think about getting something this expensive.
Give us something new that regular people can actually afford…
i agree with above. one of these or two 670s? i hope we see that comparison in your review.
670s are too cheap – you'll see two GTX 680s though.
ah the nvidiots will gladly put out $1000.00 for a graphics card that is nothing more than a 680 on roid rage… i know im not touching this card unless i win the lotto and get hit in the head with a brick making me mentally incapacitated.
If PcPer will have the time and the blessing of NVIDIA's higher-ups and Geekbox Gods, make a review with a system comprised of these cards in SLI and dual Intel Xeons and/or AMD Opterons.
Sweet mother of GOD!
I want! I can’t wait to see the benches. They oughta make the Titan version the norm for future generations to bring down the x80 series lol.
It would be nice if someone would test this card with Blender 3D to see how it rates for rendering against the Quadro GPUs that cost way more! Does anyone know of websites that benchmark open source software using GPUs? Most of the benchmarking sites use Maya and other high priced software, and little open source free software for their tests.
ALL that POWER from one card, shame about the price. This is my Adobe Premiere Pro and After Effects dream card, plus gaming! Too bad for the price.
Anyhow, a lot of great features, can't wait for the review. Great informative video Ryan!
I wish NVIDIA had been a little more aggressive on the price but I still couldn’t have afforded it. Just upgraded both my and my wife’s system to 7970 GHzs because the bundle deal was too good to pass up.
That said, for all the people who say cards like this are overkill, my current Skyrim install would give this card a run for its money even at 1080p60.
Can I mount a 4-way Titan SLI?
I don't know, that sounds uncomfortable.
AMD, here is your competition. What do you have in mind?
“NEXT GENERATION” gameplay, AMD!
This graphics card is going to pack a punch, people!!!
What is this? Lucky i got a ps3 last week! it looks better and pre ordered crysis 3 and it has blu ray so its better graphics!
Lol… the sad part is there are some people who actually believe that.
The sad part is people who go from SD to HD and see no difference.
i will take out a loan to buy this card! 🙂
Damn, I wish I could afford it. :'(
Can you try running Doom 1 on that system when you get the chance? 🙂
Man I hope they are readily available. I would like to at least get two of these for SLI. I'm kind of worried about the limited run comments..?? I am running three 680 SCs on three 27" monitors for 6000×1200, hmmm… guess we'll find out what you guys think.