GM200 Specifications
NVIDIA has let loose the GM200 GPU upon the world, in the form of the GTX TITAN X. It has 12GB of memory…12GB!!
With the release of the GeForce GTX 980 back in September of 2014, NVIDIA took the lead in single-GPU graphics card performance. The GTX 980 and GTX 970 were both impressive options: the GTX 970 offered better performance than the R9 290, as did the GTX 980 compared to the R9 290X, and both did so while running at lower power consumption and while including new features like DX12 feature level support, HDMI 2.0 and MFAA (multi-frame antialiasing). Because of those factors, the GTX 980 and GTX 970 were fantastic sellers, helping to push NVIDIA's market share over 75% as of the fourth quarter of 2014.
But in the back of our minds, and in the minds of many NVIDIA fans, we knew the company had another GPU it was holding on to: the bigger, badder version of Maxwell. The only question was WHEN the company would release it and sell us a new flagship GeForce card. In most instances that decision is based on the competitive landscape, such as when AMD might finally update its Radeon R9 290X Hawaii family of products with the rumored R9 390X. Perhaps NVIDIA is tired of waiting, or maybe the strategy is to launch before Fiji GPUs make their debut. Either way, NVIDIA officially took the wraps off of the new GeForce GTX TITAN X at the Game Developers Conference two weeks ago.
At the session hosted by Epic Games' Tim Sweeney, NVIDIA CEO Jen-Hsun Huang arrived just as Tim lamented needing more GPU horsepower for UE4 content. In his hands was the first TITAN X, and he revealed only a couple of specifications: the card would have 12GB of memory and would be based on a GPU with 8 billion transistors.
Since that day, you have likely seen picture after picture, rumor after rumor, about specifications, pricing and performance. Wait no longer: the GeForce GTX TITAN X is here. With a $999 price tag and a GPU with 3072 CUDA cores, we clearly have a new king of the court.
GM200 GPU Specifications
The basis for the new GeForce GTX TITAN X is NVIDIA's GM200 GPU. This large, beastly part is based on the same Maxwell architecture we know and love from the GTX 980, GTX 970 and GTX 960, and it is built on the same 28nm process technology those GPUs use. NVIDIA is not moving to another process node quite yet, despite rumors that AMD will migrate to 20nm for its next flagship GPU.
The GM200 includes 3072 CUDA cores, 192 texture units and a 384-bit memory bus. Clearly this GPU isn't bluffing; it has some power.
| | TITAN X | GTX 980 | TITAN Black | R9 290X |
|---|---|---|---|---|
| GPU | GM200 | GM204 | GK110 | Hawaii XT |
| GPU Cores | 3072 | 2048 | 2880 | 2816 |
| Rated Clock | 1000 MHz | 1126 MHz | 889 MHz | 1000 MHz |
| Texture Units | 192 | 128 | 240 | 176 |
| ROP Units | 96 | 64 | 48 | 64 |
| Memory | 12GB | 4GB | 6GB | 4GB |
| Memory Clock | 7000 MHz | 7000 MHz | 7000 MHz | 5000 MHz |
| Memory Interface | 384-bit | 256-bit | 384-bit | 512-bit |
| Memory Bandwidth | 336 GB/s | 224 GB/s | 336 GB/s | 320 GB/s |
| TDP | 250 watts | 165 watts | 250 watts | 290 watts |
| Peak Compute | 6.14 TFLOPS | 4.61 TFLOPS | 5.1 TFLOPS | 5.63 TFLOPS |
| Transistor Count | 8.0B | 5.2B | 7.1B | 6.2B |
| Process Tech | 28nm | 28nm | 28nm | 28nm |
| MSRP (current) | $999 | $549 | $999 | $359 |
Essentially, the GTX TITAN X's compute structure is exactly a 50% boost over the GeForce GTX 980, with just a slight reduction in clock rates at stock settings. You get 50% more processing cores, 50% more texture units, 50% more ROP units, a 50% larger L2 cache and a 50% wider memory bus. Dang – that's going to provide some impressive computing power, resulting in a peak theoretical throughput of 6.14 TFLOPS single precision.
During the keynote at GTC, NVIDIA's CEO quoted single precision compute performance of 7.0 TFLOPS, which differs from the table above. The 6.14 TFLOPS figure is based on the base clock of the GPU, while the 7.0 TFLOPS number assumes a "peak" clock rate. Just for reference, at the rated Boost clock the TITAN X comes in at 6.60 TFLOPS.
A unique characteristic of the TITAN X is that it does not have an accelerated configuration for double precision compute, something both the original TITAN and the TITAN Black offered. Double precision performance remains at a 1/32 ratio relative to single precision, which gives the TITAN X a DP compute capability of just 192 GFLOPS. For reference, the TITAN Black is rated at 1707 GFLOPS of DP performance, a 1/3 ratio of that GPU's 5.12 TFLOPS single precision capability. It does not appear that NVIDIA is simply disabling the double precision capability on GM200, hiding it and saving it for another implementation; based on the die size, shader count and transistor count, it looks like GM200 just doesn't have it. NVIDIA must have another solution up its sleeve for Maxwell double precision compute.
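As a quick sanity check on those numbers, peak theoretical throughput is typically estimated as CUDA cores x 2 FLOPs (one fused multiply-add) per clock, with double precision scaled down by the DP ratio. Below is a minimal sketch of that arithmetic; the ~1140 MHz "peak" clock is simply back-calculated from the 7.0 TFLOPS figure quoted at GTC, not an official specification.

```c
#include <stdio.h>

/* Theoretical throughput: cores * 2 FLOPs (one FMA) per clock * clock rate. */
static double tflops(double cores, double clock_mhz, double ratio)
{
    return cores * 2.0 * clock_mhz * 1e6 * ratio / 1e12;
}

int main(void)
{
    const double cores = 3072.0;  /* GM200 / GTX TITAN X */

    printf("SP @ 1000 MHz base : %.2f TFLOPS\n", tflops(cores, 1000.0, 1.0));  /* ~6.14 */
    printf("SP @ 1075 MHz boost: %.2f TFLOPS\n", tflops(cores, 1075.0, 1.0));  /* ~6.60 */
    printf("SP @ ~1140 MHz peak: %.2f TFLOPS\n", tflops(cores, 1140.0, 1.0));  /* ~7.0  */
    printf("DP @ base, 1/32    : %.0f GFLOPS\n",
           1000.0 * tflops(cores, 1000.0, 1.0 / 32.0));                        /* ~192  */
    return 0;
}
```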
Oh, and yes, before you ask in the comments below: I directly asked NVIDIA about memory configuration concerns in light of the GTX 970. The TITAN X does not have any divided memory pools, and the 384-bit memory bus is not sectioned off into sub-groups that run at slower throughput. So, there's that.
The GM200 is built from 24 Maxwell SMMs, and this is the full GPU implementation, so you should not expect another variant to show up down the road with anything higher than 3072 CUDA cores unless the company spins a completely new GPU revision. Because the GPU is still built on TSMC's 28nm process and packs 8 billion transistors, this is not a small component: measured die size was 25mm x 25mm, or 625mm2. The previous largest GPU we had seen, the GK110, used 7.1 billion transistors and had a die size of around 561mm2. (Note that NVIDIA lists the die size at 601mm2 based on measurements of 24.66mm x 24.38mm.)
Clock speeds on the TITAN X are lower than the GTX 980 at stock, as you would expect, but the differences aren't drastic. The base clock is 1000 MHz with a rated Boost clock of 1075 MHz, while the GTX 980 reference clocks are 1126 MHz and 1216 MHz respectively, putting the GTX 980 roughly 12-13% higher. The memory clock remains 7.0 GHz, the same speed as the GTX 980 and even the previous GTX TITAN Black. The wider memory interface delivers 50% more peak bandwidth, hitting 336.5 GB/s, which edges out the 320 GB/s rated for AMD's flagship Radeon R9 290X.
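The bandwidth figure works the same way: peak GDDR5 bandwidth is just the effective data rate multiplied by the bus width in bytes. A small sketch of that math, using the card's published 7.0 GHz effective memory speed and 384-bit interface:

```c
#include <stdio.h>

int main(void)
{
    /* Peak bandwidth = effective data rate (transfers/s) * bus width (bytes). */
    const double data_rate = 7.0e9;   /* 7.0 GHz effective (GDDR5)      */
    const double bus_bits  = 384.0;   /* TITAN X memory interface width */

    printf("Peak bandwidth: %.0f GB/s\n", data_rate * bus_bits / 8.0 / 1e9); /* ~336 GB/s */
    return 0;
}
```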
Speaking of memory, the TITAN X ships with 12GB on board. Gulp. That is 3x the memory found on the GTX 980 and 2x that of the GTX TITAN Black, which seemed crazy (at 6GB) when it shipped in February of 2014. I can already hear the debate, and no, 12GB of memory is not going to be beneficial for gamers for quite some time – likely a span even longer than the useful life of this GPU, to be fair. In the recent debate around 3.5GB and 4GB frame buffers, we saw that most games available today, even when pushed with settings intended to increase memory usage, rarely go beyond 4GB. If you want to think really crazy, assume you are planning on 4K Surround gaming with three displays; you might be able to stretch that to 7-8GB if you try really hard. Of course, there is a secondary audience for this card that focuses on GPGPU compute rather than gaming, where that 12GB of memory could become useful much more quickly.
The only negative change on GM200 (compared to GM204) is its rated TDP. With a listing of 250 watts, the GTX TITAN X is clearly going to run hotter and draw more power than the GTX 980 and its 165 watt TDP. (Of note, that ~51% increase in TDP is roughly in line with the other specification changes.) Keep in mind that the Radeon R9 290X has a TDP of 290 watts, though we have measured it higher than that in our power testing several times. Both the new TITAN X and the R9 290X require a 6-pin plus 8-pin power connection, so it will be interesting to see how real-world power consumption varies between these two GPUs.
The new GTX TITAN X shares the same feature set and capabilities as the GTX 980. That includes support for MFAA, VXGI acceleration, Dynamic Super Resolution, VR Direct and more. For more information on those features and what they bring to the table, check out the links below.
Now, let’s dive into the design of the GTX TITAN X graphics card itself!
Not too bad. I really like the look of it. 4k performance is pretty good, but still not capable enough, not for $1000.
Seems 4K is still very much the domain of SLI/Crossfire configurations. For high settings 60 fps at least.
All depends on managing expectations. I ran 4K on a single 760 for a while. Fell on its face if you enabled MSAA, but on low-medium settings without, you could mostly maintain 60fps (to be clear, I bought the monitor for the clarity while running multiple VMs in my day-job, not for gaming, and anyone doing a lot of work on the desktop, I’d recommend it). With 4K, lines are sharp enough that losing MSAA isn’t the annoyance it is at lower resolutions. I’m now on a single 780, and I can run FXAA/MSAA with High-Ultra on most games. It looks pretty damn good. You just need to understand that 2xMSAA or 4xMSAA is going to make things unplayable fast.
You make good points that 4K is playable with lower cards. The only issue I have is that I value Ultra settings over 4K. I would rather see all of the visual eye candy that was developed with the game engine than run at 4K; at 1440p or 1080p that can be had nowadays with any single card at $350 and below. I just don't think 4K is worth it right now.
What's the probability of a GTX 980 Ti that uses the GM200?
I’d put money on a 6GB 980Ti based on this chip right around when AMD comes out with the 390X later this spring/early summer. I’ve read rumors of a ~$700 price for that. That would be in line with the 780Ti launch. I hope we see $500-550 980Tis by the holidays.
I guess anybody that can afford this will not mind that Pascal, which Huang says will significantly outpace Maxwell, will be released in 2016.
I was thinking about a year ago that GM200 would be 20nm and would be a similar improvement over GK110 as GK110 was over GF100.
Instead, Nvidia kept GM200 on 28nm and decided to completely omit DP performance.
GM200 is a compromise from the start. It's basically Nvidia's Haswell refresh.
The fact that GK210 and the K80 exist is further evidence of this.
I guess they need to release something between now and 2016 though. In reality this GPU seems to be the successor to the 780 Ti 6GB and not the Titan at all.
I'm usually called an Nvidia fanboy and I can see these things selling well, but I don't think it's nearly as impressive as the Titan Black. I'm actually glad I got such a good deal on my 780 when I did. This GM generation of GPUs isn't very impressive by comparison, even though they're faster.
GP100 should be the next GK110.
Great card! Great review! AMD could learn a thing or two on power consumption from Nvidia. I’m going to hate seeing AMD’s 600mm2(ish) card. 🙁
Not really at all. This has been explained so many times over years and years of reviews of GPUs etc.
The higher the VRAM bit rate, the more power it needs. And this is also why AMD does so much better in 4K gaming benchmarks than Nvidia.
Google VRAM to get a lot more tech details on this. So stop complaining.
According to leaks, the 390X will outperform the Titan X and pull only 20 watts more power. These are the same sites that leaked accurate Titan X specs many months ago, so the 390X leaks are most likely legit. Now what were you saying?
We are saying that you are a lying bitch, because you said "months ago" when we didn't know the dang thing existed until recently. Plus, the specs on new high end GPUs are easy to guess. Also, one correct estimate does not mean a site is credible at all, and the claim involves different card vendors; what if they only have an Nvidia source but not an AMD source? That would make anything AMD-related on their site look like it was pulled out of their asses.
Comparing the Titan X to the 295X is like comparing apples and oranges! No matter what, the 295X is a dual GPU and the Titan X is ONE GPU!! Now take 2 Titan X cards against the 295X and there will be a big difference!!
There will be a big difference in price also.
The X almost matches the 295×2 @4k when overclocked. Pretty impressive if you ask me.
If price is a problem then get out of pc gaming and buy a console.
Oh please, you should probably get off the internet with that lame comment.
It looks like you're another one that needs a console; go dump your pc.
You’re an idiot. The conscientious buyer has always ruled the PC, not the brain dead moron who thinks bigger price = better. Don’t bring the console vs PC crap into this shitstorm of idiocracy you’re pushing.
Doesn't matter if the Titan X is a single-card, single-GPU design if it costs 30% more than a single-card, dual-GPU solution that matches its performance. Charging +80% over the previous card for a product that gets +30-40% more performance definitely forces this into an apples-to-apples argument.
It has always been the case that the price-to-performance ratio gets significantly worse at the high end. Look at Intel's CPUs: the highest end consumer part is over $1000. Is it twice the performance of a $500 part?
i AGREE with you.
The little whining gasbags are wailing about the most expensive toppest end consumer gaming gpu in the entire world…
None of the whiners are ever going to buy one, for any reason.
It’s like taking a t-shirt and worn jeans 99%er out of their flop tent in protest park because they demand to go to the car show with you paying for it all and driving, then they get there and wail and moan the Lambo is overpriced…
This is what it’s like, they do it every time, every time they wail and moan on it.
yeah… but the 295×2 costs less and still performs better
It only performs better because it has 2 GPUs; if it had 1 it would not perform worth crap.
Then it would be a 290X then, wouldn't it? Great job informing us, though; AMD might have slipped that one past us if you weren't on the job.
Yeah, compare $2k to $699, great idea. The downsides of the 295X2 due to it being a dual GPU have been fully covered in the article.
when the 295×2 came out it was a lot more than $699, get over it
Any word on the compute performance vs titan black?
Wouldn't be good, because it no longer has double precision like the old Titans. But what do I know? They came out with 4-way Titan X compute boxes built by nvidia.
Confusion – Typo or badly signaled joke?
is this supposed to be a joke?
“So much in fact that I am going to going this data the PCPER ISU”
did you mean
“So much in fact that I am going to call this data the PCPER ISU”
or did you mean it to be a joke by stuttering in the sentence and just didn’t signal that it was a visual joke?
I’m not certain, hence the confusion and this question.
another small typo in the conclusion
“The Titan X is built of gamers”
presumably
“The Titan X is built for gamers”
sorry to be a pendant – my spelling and gramme are often far, far worse
sorry to be a pendant – my spelling and gramme are often far, far worse.
Now there is what I call a giant typo that left the writer hanging.
The GTX 780 Ti also pulled higher than 300 watts in some situations, and its power consumption was very near the 290X. Why not compare the Titan X to the 780 Ti/Titan Black instead of getting your jabs in on AMD? Change your name to Nvidia PcPR and it'll make more sense. You've already got the green in your logo and everything.
“Of course there is a secondary audience that focuses on gpgpu compute”: this card is hardly capable of GPGPU without the double precision… at least no more so than what we already have on the market, and most of those are far cheaper than the Titan X. Why does this get a gold award? Because it looks pretty and costs a lot, so it must be worth it, I guess. I would buy this at $799; there is no reason for this to cost $1000.
Agreed. $750-800 tops.
“GM200 based GTX Titan Z… with 12 Gigs of memroy.”
That’s what the front page cover says. Must be pretty tired of playing with awesome hardware, Ryan.
Whoops, thanks!
My compliments on getting a dig in about the price in your live video about the Titan. Didn’t think you would mention it. Very diplomatic…
I left a note in another place about the competition, but before we had been given the secret word I was flagging the typos here.
Have I stopped myself from being included?
If so, that's life, but could you flag that one shouldn't leave comments until you have the secret word, in any future competition?
And yes, I know what the secret word is, and it isn't red, cause that would be far too AMD.
hey ryan, i saw ur live stream for the titan x giveaway. came here from youtube since chat was disabled, so im new to ur website
Dead-on-arrival. Pointless. Useless. Dead.
Especially for that money.
You’ll be a completely retarded idiot to buy this now, even if you’re a heavy pro-nGreedia brainwashed fanboy.
4GB R9 390X already beats both it and the upcoming 980 Ti, not even mentioning the upcoming 8GB 390 X. 4GB R9 390X beat both the Titanic X and the 980 Ti even without new optimized drivers and other bonuses (like Vulkan/DirectX 12/Mantle, HSA, or stacked memory), while also obviously costing less money.
Titanic X, as well as the 980 Ti, are complete and absolute failures. It's very apparent and obvious now that TSMC screwed nGreedia truly hard, much harder than we initially thought. Even though nGreedia has more dough/a better stock than AMD, it's very clear that they're now seriously at least one whole year behind AMD, both node- and technology-wise. And it's very easily understandable that AMD will milk each and every last drop out of this opportunity that has been presented to it. 2015~2016 is completely the Radeon time now. You'll be a total moron if you actually, in all seriousness, try to deny this fact in any way whatsoever, and you'll also be a complete idiot if you actually, in all seriousness, buy this dead-on-arrival Titan X crap.
Also, please remember this:
FOUR 980s BARELY beat one 4GB 390X. That’s how much more powerful 390X is. And it costs less money. There’s literally no reason for Titan X and 980 Ti existence. Absolutely. Completely. Totally. Undoubtedly.
What about for non-gaming uses, like GPU computing, or pure rendering? I’d say this is a great card for those needs. So not exactly “DOA”
There's one slight HUEG problem with that, though. Go look up how the absolute majority of noVideots treat Huang's Titanics, and how nGreedia ITSELF promotes and advertises this crap.
It would've been absolutely fine if these were purely workstation offerings. But they're seriously pushing them as gaming cards for gamers. Do you see the problem? Do you see it, huh? I surely do. And it's a HUEG one.
This is actually a big win for nVidia, seeing as they will sell some to gamers as well as the compute oriented people who would be buying them anyways.
Um…I’m not afraid to stomp on your balls, so I’m telling you this right now: it turned out this piece of trash is not even good for work. Because 0.2TF of Double Precision. This shit’s not good for ANYTHING, lel. Not good for games, not good for computing. This is literal DEAD.
Why don't you use correct spelling and grammar in an argument, instead of looking like a 10 year old who just discovered the internet on his dad's computer?
STFU. lamer.
Who would want a crap 4 gig piece of junk amd oldcore ?
NO FUTURE PROOFING !
Now there’s no 3.5 gig whines !
12 gigs and no slant there !
BWHAHAHHAAHHA amd is dead.
Nice try, you paid Nvidia shill! Trying to claim that these “GTX 980” and “Titan X” products are somehow real! Charlie Demerjian has stated that Nvidia will actually never be able to launch these so called ‘products’ because they’ll have no money and will therefore be finished for good:
http://semiaccurate.com/forums/showpost.php?p=130385&postcount=38
AMD 390X has ALREADY launched with 4TB of ram and 5000x faster than your so called ‘titan x’ video card that certainly does not exist. This will also use less than 3W of power obviously.
Now how does it feel, being called out for the paid Nvidia shill that you are?!
Thanks for the laughs, d00d. No, seriously, that was a pretty good one.
BTW FYI: I never used any offerings from nGreedia in my personal PC builds (which I have six of, in my house at this right moment) pretty much all the way since the GODLIKE GTX 285, the currently last GTX card I’ve bought without feeling like I’m gonna vomit. GTX 285 (the very first Twin Frozr) was the last truly great, absolutely worthwhile Nvidia card. Anything and everything else nGreedia shat out after that, was not decent enough for me to be interested in it highly enough. Yes, I’ve been using Radeons for a long time now, but that doesn’t make me “shill” or a “fanboy” as you might think there, because I’ve had enough practice with using products from both of the sides. Before the GODLIKE GTX 285, I’ve been using GS 7100, and before that – GeForce 2 MX 400 (and before that – some VooDoos). After the GODLIKE GTX 285 my route went HD 4730 (TUL) -> HD 6850 (Gigabyte Windforce) -> HD 7870 (MSi HAWK) -> R9 270X (TUL Devil) -> R9 290X (Sapphire Tri-X Vapor-X 8GB). And I feel absolutely great about it. Because nGreedia really let me down after GTX 285, while Radeons worked absolutely fine for so many years since the day-one I’ve started using them. I’d gladly try any new Huang card IF THEY WERE TRULY WORTH OF MY ATTENTION, but, so far, there was nothing there but just one massive letdown after the other from the green side.
That 4870X2 was a real winner and 6870 Crossfire was even better! Glad I bought them. AMD/ATI has never let me down.
I really don’t know if I’m supposed to take this as a sarcastic remark or not.
It was absolutely awesome paying twice the price for the same performance of a single GPU. I would sit wondering why frame rates looked good but performance was a stuttering mess.
So you ARE a troll after all, huh…good I’ve clarified. Now I know how to treat you.
Yes; If telling you about a real experience is trolling.
I dare say I have owned far more ATI/AMD cards than you.
Perhaps I am just trolling a troll?
My deepest condolences to your family/relatives, kiddo.
If I am trolling, the question is: who am I trolling for, given the cards I have owned?
Voodoo 2 12mb SLI
Riva TNT
GeForce Pro
GeForce 2 Ultra
GeForce Ti 200
GeForce 4 4400
Radeon 9700 Pro
Radeon 9800 Pro
Radeon X850 XT
Radeon 1800XT
Radeon 1900XT
GeForce 8800GTX SLI
Radeon 4870X2 + 4870
GeForce GTX 570 SLI
GeForce GTX 680 SLI
GeForce GTX 980
Feeling old..
Master Chen….You are a turd. You must be paid by NVIDIA to sound like a total idiot AMD fan. As an AMD supporter myself I am ashamed to be on the same “side” as you.
I’m on neither side, you lame stupid schmuck.
Cool! Where can I buy a 390X?
May/June. Across the world. Pricing? ~$800 depending on region and model.
It would be awesome if AMD can one-up these greedy motherfuckers. But $800 for a flagship is just as obscene to me. Of course I say that because I cannot afford them, and because, unlike great audio equipment that always sounds great, these things become middle of the pack in a few years as games become more demanding and monitors become more dense. But I do appreciate and agree with your sentiments.
That "~$800" was actually meant to point out the price for the 8GB version, though.
That's too long to wait.
The R9 390X will be DOA since Pascal will be out not long after that and will have at least 5 times the performance.
Hold your panties, Crysisboy.
LOL, i love it, and I thought i was an AMD Fan Boi.
Did you read what I’ve said previously?
+1 for your knowledge about nVidia propaganda (it's not marketing after the nV 970 lie/fiasco 😀 )
Still a good Fermi Rev.6!! And really nothing more.
I know cuz I'm an engineer (I know what's inside of this so-called new tech: Fermi/Kepler/Maxwell etc. is all based on the same OLD architecture -> the tessellators, cache and block layout are the same, of course with major tweaks 😉
I think AMD is better with the superb GCN -> and the next-gen R9 390X (I will have it 100% 😀 ) will be a real game changer -> read: truly for 4K and beyond gaming, on one R3xx with HBM.
So the nGreedia trolls can go to their cave now 😉
I find it VERY laughable that only NOW, when the 970 3.5GB scam opened up, noVideots were "shocked" and "confused" out of their sorry marketing-victim asses, even though nGreedia did EXACTLY the same crap in the past with the 6xx series.
Yeah, the infamous 660 Ti lie/fiasco lol
But I have a great nV card on my shelf -> the GTX 285 1.2GB, a 512-bit DX10 monster!! That was a "mess" 🙂 And before that I had a GF 6600 for NFS Underground 2 with the latest DX9.0c SM 3.0! ATI was a mess in those times, only the 9800 with old 9.0b 🙁
But in the end AMD/ATI is my favorite GPU maker these days.
The 3850! The 7970 GHz, now a 280X at 1075/1650 1.2v, whooa.
And now I'm waiting for the true power -> R9 390X HBM WC 8GB, the king of the hill. The 4K & 5K mania will begin shortly (maybe 1 or 2 months)
VooDoo was in'da'house with the Virge 4MB 🙂
Yeah my bro, we are old dogs 😀 We know what was created for gaming and what is not meant to be played (nGreedia) but meant to be milked to tha last drop….
Yeah, maybe you can tell the other gaming brothers why Green games are a MESS 😉 The GameWorks DLLs fiasco!
Batman and the heavily tessellated coat (64MB in a DLL for a shader, only to look good in benches compared to ATI 🙁 )
The tessellated blocks (cubes lol) in Crysis 2, and of course the invisible tessellation in that game too (disabled on nV cards lol)
And many more bad deeds from nGreedia…. Makes me PUKE
Only $$$$$
G-Sync (fiasco)
GameWorks (fiasco), even on nV cards lol
CUDA, PhysX (death to AGEIA, no respect for the creators)
And another interesting feature from nG
You know what the nV 970 really is?
He he, the good old 780 Ti: 3GB GDDR5 + 1GB DDR3! (not 0.5GB my bro)
Cuz in some scenarios the problem was seen when VRAM exceeded 3GB!
Some sooner, some later, depends on the scenario 😉
So we have the "Maxwell" 970 -> but its real name is 780 Ti with 1GB DDR3 added, yet on the BOX and in the shop you have a LIE!
It says 4GB GDDR5 🙁
But Fermi is too old for me ( -_- ), whatever the pseudo-marketing from nG tells you…
Now a new brainwash with the Titan X
But it's made from unusable Quadro chips lol
New box, new propaganda and here we go: the "best" GPU (only $1k 😉 )
With cut-down power needs to maintain the perf/watt propaganda.
But electricity is cheap…
So the 290X Sapphire or XFX with 8GB is for the gamers NOW!
Wanna play 4K? Go buy a 295X2 or wait for the R9 390X HBM…
That's all folks
Um…since when is there an 8GB XFX? I don't remember that. If my memory is right, Sapphire is the only one making 8GB 290Xs right now.
XFX R290X Black 8GB
Maybe it's on sale already, dunno, but it exists, 100%.
Found it on Newegg. Wow, that’s a first. I seriously didn’t know about this one up until now.
You mean like the AMD fanboys who got worse performance with Crossfire than with a single GPU for years, until AMD's lies were exposed?
I was always highly against the multi-card approach because I know everything about the problems this method causes, regardless of whether that's CFX or SLI. You really shouldn't particularize – they're both crap. I've always used only single-card solutions in my personal PC builds, so even if the HD 6xxx line had some problems with CFX scaling or frame pacing (I think that's what you're referring to?), I wouldn't know. The only 6xxx series card I've ever had was the HD 6850 Windforce OC from Gigabyte, and it was THE best 6850 out there, out of all the others present on the market; it performed like an absolute BEAST, so I wasn't in any need of going HD 6870 or even HD 69xx, all the way until the HD 7870 HAWK was released.
I always try to get the very best one out there, you see. I thoroughly examine offerings from all manufacturers and, within each particular manufacturer, I compare the different offerings of their different models (if any) before I decide to buy any particular one. And, usually, I tend to get the best of them all. For the HD 6850 line that was the Gigabyte Windforce OC, for the HD 7870 line that was MSi's HAWK, for the R9 270X line it's TUL's Devil and for the R9 290X line it's Sapphire's Tri-X Vapor-X 8GB. That's what I've been getting lately. And I never got it wrong even once, so far. I got the best single-card performance out there, all the time, so I was in no need of going CFX. And now I plan to get the best 8GB R9 390X available out there, after it gets released and quality nonreference third-party solutions start flowing in (unless the reference offering turns out to be just that much better; I'll always get nonreference), so that I'll be getting the best single-card performance out there once again. CFX became pretty good recently, at least that's what I've heard from people using it, but I'm personally not ready yet to use it in my personal rigs. I can build it for others, and I did that in the past already, but for me personally, in my own machine, that won't be anytime soon. I'm just against the approach itself.
I'm talking about the issues covered on both PCPer and Tech Report for the last few years.
SLI has hardware-level frame time metering and actually works.
Even XDMA Crossfire doesn't work with DX9, and Crossfire only got software frame pacing after PCPer and Tech Report demonstrated that Crossfire gave worse performance than a single card in almost every game tested.
I don't like multi-GPU either, but AMD was told for years that Crossfire didn't work, and it wasn't until customers complained, after PCPer and Tech Report exposed their lies, that they decided to fix it.
I hope that you DO know that several of the recently released batches of Radeon drivers (mainly Omega and various WHQLs) improved CFX quality significantly?
Not nearly as well, since it's just software-implemented frame metering. And DX9 games still don't work with Crossfire.
You knew that, right?
Keep telling yourself that, I guess. Nobody would take that away from you. Just remember: DefectX 9 (just as the 11-th is) is a piece of shit itself by default. CFX/SLI has little to do with it.
Keep telling myself what?
This website and Tech Report, as well as pretty much every other tech website that followed their lead, have proven that Crossfire is inferior to SLI in every way.
Crossfire was only partially patched with software implemented frametime metering, which doesn’t work nearly as well as Nvidia’s hardware level frame metering.
It still doesn’t work as well as SLI even after all the patches, because it’s SOFTWARE FRAME METERING, which was only added to Crossfire after YEARS of people telling AMD that Crossfire gives WORSE PERFORMANCE than a single card.
So you can say whatever you want about how horrible Nvidia is, but it doesn’t change the fact that AMD lied about their multi-GPU being functional at all for years, and even now they still haven’t patched DX9 games.
That means you have to disable Crossfire and only use one card just to play a DX9 game, not because it won’t give you 2x the performance in DX9, but because you’ll be getting dropped and runt frames constantly, and get WORSE performance than a single card.
Yeah, AMD is wonderful with their two core per module CPUs that use 3x the power and are easily outperformed by Intel’s low end i5’s and their 600W water cooled GPU that gets worse performance than one of their 290x’s.
Although it was kind of hard to tell how well the 290x performed initially since AMD sent cards with “special” BIOS to review sites, and then got caught lying again when review sites purchased retail versions of their cards and it was proven that the press samples performed better.
Yeah, keep telling us how wonderful AMD is please.
>DAT damage control.
Just stop. No, really. Don’t embarrass yourself any more than you already did.
LOL what?
You’re the one here telling us how dishonest Nvidia is and how wonderful AMD’s products are, when their shit doesn’t even work and hasn’t for years.
Why don’t you acknowledge the company that you shill for doesn’t make good products anymore?
Am I wrong, or is there ANOTHER eight-pin connector, soldered onto the PCB, that hasn't been used… Maybe this card can draw much more power but has been cut down by some measure… Maybe they are returning to older style power-hungry GPUs…
Any news on SLI results from any sources?
Also, the 390X hype train needs to take a breather. It's getting a bit deep in here with no actual, verifiable results to be found anywhere, don't you think?
The Titan X is more of a marketing tactic than a real product. The fan boys and marketing people will be out in force, but very few people will actually buy this product.
That didn’t answer my question you toolbag.
Actually, there is a leak:
http://wccftech.com/amd-r9-390x-nvidia-gtx-980ti-titanx-benchmarks/
In short, it looks to be equal to a Titan X.
Of course, considering the history of such AMD leaks ….
I did notice today, when updating my Nvidia driver, that they call the Titan X the world's fastest gaming GPU.
Great write-up, Ryan. This GPU has some serious chops and all sorts of other goodness included. This should be whetting the appetites of enthusiasts and gamers everywhere. Great set of connection options on the card. The 12GB of memory does seem like overkill though.
From a GPGPU computing standpoint, would this be better than, say, the 295X or a dual 980 configuration? I'm looking into doing some high-end CUDA processing and not sure how things scale from that end with dual-GPU, and how the memory usage would compare. Obviously this would depend on the processing needs, but I'm asking in a 'general' sense.
This will largely depend on how "parallel" your problem is.
First, if your problem domain requires double precision floating point, then none of these options are for you.
Second, which compute language do you want to use? CUDA is Nvidia-only, so that limits your choices. OpenCL is supported by both, but not the latest versions; Nvidia's OpenCL support has been lacking in updates. I know their drivers support OpenCL 1.1, but I can't find reliable information about any of the later versions.
So, in terms of compute, if single precision is fine, then currently the Titan X will be the way to go. Unlike in games, that 12GB of memory can and should be very useful in keeping most of your data set on the GPU, which addresses one of the big slowdowns in GPGPU computing. It also has the largest single-GPU numbers, and it can support multiple cards in one system.
Keep in mind, GPGPU does not use SLI, but it can use multiple GPUs to perform calculations. It is up to you to use them effectively, but it can scale much better than SLI.
If you are performing the same calculations on multiple datasets, then it can be as simple as giving each GPU its own data set and letting it go. If the data is one set, you can break the calculations apart, but this is generally going to be slower than independent data sets (a rough sketch of the per-device approach follows after this comment).
The 290X2 should just function like two 290Xs, but may not, since they are preconfigured in a Crossfire solution. I can't tell you without trying it.
You might wait until the R9 390X comes out with 8GB of memory. My expectation is this will either cause Nvidia to push a new Titan dual GPU solution and/or lower prices across the board.
Exciting times.
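The per-device split described in the comment above can be illustrated with a minimal, hypothetical CUDA sketch. This is not code from the review: the scale kernel and array sizes are placeholders, and a real workload would add error checking and likely streams for overlapping transfers.

```cuda
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

#define MAX_GPUS 8
#define PER_GPU  (1 << 20)   /* elements handed to each GPU (arbitrary) */

/* Placeholder kernel: scales its slice of the data. Any per-element,
   single precision computation would be split across GPUs the same way. */
__global__ void scale(float *data, int n, float factor)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main(void)
{
    int count = 0;
    cudaGetDeviceCount(&count);
    if (count > MAX_GPUS) count = MAX_GPUS;

    float *host = (float *)malloc((size_t)count * PER_GPU * sizeof(float));
    float *dev[MAX_GPUS] = {0};
    for (int i = 0; i < count * PER_GPU; ++i) host[i] = 1.0f;

    /* Hand each GPU its own independent slice; no SLI involved. */
    for (int d = 0; d < count; ++d) {
        cudaSetDevice(d);
        cudaMalloc((void **)&dev[d], PER_GPU * sizeof(float));
        cudaMemcpy(dev[d], host + (size_t)d * PER_GPU,
                   PER_GPU * sizeof(float), cudaMemcpyHostToDevice);
        scale<<<(PER_GPU + 255) / 256, 256>>>(dev[d], PER_GPU, 2.0f);
    }

    /* Wait for every device, then copy each slice of results back. */
    for (int d = 0; d < count; ++d) {
        cudaSetDevice(d);
        cudaDeviceSynchronize();
        cudaMemcpy(host + (size_t)d * PER_GPU, dev[d],
                   PER_GPU * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(dev[d]);
    }

    printf("Processed %d elements across %d GPU(s)\n", count * PER_GPU, count);
    free(host);
    return 0;
}
```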
Nice card for those who don’t mind spending that amount of cash!
I have a 3GB GTX 780. The 980 and related cards aren't fast enough to warrant an upgrade. This Titan X is probably fast enough, but I'd never pay $1000 for a graphics card. My GTX 780 was $500 USD back in October 2013. Maybe we need some new AMD offerings and a price war? Perhaps a 6GB cut-down GM200 card?
It’s about the only way they’d get me to purchase a new card. I had thought of finding another 780 on eBay, but even in SLI I’m still limited to 3GB per card, which can be a bottleneck in future games at 2560×1440.
Might just be holding on to this card until the 2016 cards launch. It'll be winter here before long and I can pull out the 'max OC' without worrying about the card getting too warm!
The 12GB might be needed more than you think.
First, this card directly supports 5K displays, which will require more RAM for games.
Secondly, the UHD standard has not been ratified yet, and one of the big pushes relates to higher bit depth displays. These so-called "High Dynamic Range" displays may be capable of showing as much as 16-bit per pixel color.
For games to take advantage of that, a large portion of the current 4-bit compressed images will need to move to higher bit rates. (What, you thought the compressed textures were 8-bit?)
This is actually good news for gamers: as TVs are pushed out with these higher bit depths, the tech monitors should become available at cheap prices.