Although there were quite a few rumors leading up to AMD’s Radeon 7000 series launch, the Internet has been very quiet on the greener side of the graphics market. Finally, however, we have some rumors to share with you on the Nvidia front. As always, take these numbers with more than your average grain of salt.
Specifically, EXP Review managed to uncover two charts that supposedly detail everything about a range of GeForce 600 series Kepler cards, from the number of stream processors to the release date. Needless to say, it’s a lot of rumored information to take in all at once.
Anyway, without further ado, let’s dive into the two leaked charts.
| Model | Code Name | Die Size | Core Clock (MHz, TBD) | Shader Clock (GHz, TBD) | Stream Processors | SM Count | ROPs | Memory Clock (effective, GDDR5) | Bus Width | Memory Bandwidth |
|---|---|---|---|---|---|---|---|---|---|---|
| GTX690 | GK110×2 | 550mm² | ~750 | ~1.5 | 2×1024 | 2×32 | 2×56 | 4.5 GHz | 2×448-bit | 2×252 GB/s |
| GTX680 | GK110 | 550mm² | ~850 | ~1.7 | 1024 | 32 | 64 | 5.5 GHz | 512-bit | 352 GB/s |
| GTX670 | GK110 | 550mm² | ~850 | ~1.7 | 896 | 28 | 56 | 5 GHz | 448-bit | 280 GB/s |
| GTX660Ti | GK110 | 550mm² | ~850 | ~1.7 | 768 | 24 | 48 | 5 GHz | 384-bit | 240 GB/s |
| GTX660 | GK104 | 290mm² | ~900 | ~1.8 | 512 | 16 | 32 | 5.8 GHz | 256-bit | 186 GB/s |
| GTX650Ti | GK104 | 290mm² | ~850 | ~1.7 | 448 | 14 | 28 | 5.5 GHz | 224-bit | 154 GB/s |
| GTX650 | GK106 | 155mm² | ~900 | ~1.8 | 256 | 8 | 24 | 5.5 GHz | 192-bit | 132 GB/s |
| GTX640 | GK106 | 155mm² | ~850 | ~1.7 | 192 | 6 | 16 | 5.5 GHz | 128-bit | 88 GB/s |
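One thing the leak gets right is internal consistency: the bandwidth column follows from the usual GDDR5 arithmetic, where bandwidth in GB/s is simply the bus width in bytes multiplied by the effective memory clock. Here is a minimal Python sanity check using the leaked (and entirely unconfirmed) figures from the chart:

```python
# Sanity-check the leaked bandwidth figures:
# GDDR5 bandwidth (GB/s) = bus width (bits) / 8 * effective clock (GHz).
# All numbers below are the rumored specs from the chart, not confirmed data.
cards = {
    # model: (bus width in bits, effective memory clock in GHz)
    "GTX680":   (512, 5.5),
    "GTX670":   (448, 5.0),
    "GTX660Ti": (384, 5.0),
    "GTX660":   (256, 5.8),
    "GTX650Ti": (224, 5.5),
    "GTX650":   (192, 5.5),
    "GTX640":   (128, 5.5),
}

for model, (bus_bits, clock_ghz) in cards.items():
    bandwidth = bus_bits / 8 * clock_ghz  # bytes per transfer * GT/s = GB/s
    print(f"{model}: {bandwidth:.0f} GB/s")
# Matches the chart: 352, 280, 240, 186, 154, 132, and 88 GB/s respectively.
# The GTX690 is just two 448-bit interfaces at 4.5 GHz: 448/8 * 4.5 = 252 GB/s per GPU.
```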
From the chart above, we can see the entire lineup of Kepler cards, from the NVIDIA GTX 640 to the dual-GPU GTX 690. The die size of the higher-end GeForce cards is approximately 50% larger than that of the AMD Radeon HD 7970, but not much bigger than that of the GTX 580. If only we knew the TDP of these cards! In the next chart, we see an alleged performance comparison against the AMD competition.
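For anyone who wants to verify that comparison, the commonly cited die sizes are roughly 365mm² for the HD 7970 (Tahiti) and roughly 520mm² for the GTX 580 (GF110); a quick check against the leaked 550mm² figure:

```python
# Leaked GK110 die size vs. commonly cited competitor die sizes (mm²).
gk110, hd7970, gtx580 = 550, 365, 520
print(f"vs HD 7970: {gk110 / hd7970 - 1:.0%} larger")  # ~51% larger
print(f"vs GTX 580: {gk110 / gtx580 - 1:.0%} larger")  # ~6% larger
```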
| Model | Bus Interface | Frame Buffer | Transistors (Billion) | Price Point | Release Date | Performance Scale |
|---|---|---|---|---|---|---|
| GTX690 | PCI-E 3.0 x16 | 2×1.75 GB | 2×6.4 | $999 | Q3 2012 | |
| GTX680 | PCI-E 3.0 x16 | 2 GB | 6.4 | $649 | April 2012 | ~45% > HD7970 |
| GTX670 | PCI-E 3.0 x16 | 1.75 GB | 6.4 | $499 | April 2012 | ~20% > HD7970 |
| GTX660Ti | PCI-E 3.0 x16 | 1.5 GB | 6.4 | $399 | Q2/Q3 2012 | ~10% > HD7950 |
| GTX660 | PCI-E 3.0 x16 | 2 GB | 3.4 | $319 | April 2012 | ~GTX580 |
| GTX650Ti | PCI-E 3.0 x16 | 1.75 GB | 3.4 | $249 | Q2/Q3 2012 | ~GTX570 |
| GTX650 | PCI-E 3.0 x16 | 1.5 GB | 1.8 | $179 | May 2012 | ~GTX560 |
| GTX640 | PCI-E 3.0 x16 | 2 GB | 1.8 | $139 | May 2012 | ~GTX550Ti |
If these numbers hold true, NVIDIA will handily beat the current AMD offerings; however, I would wait for reviews to come out before making any purchasing decisions. One interesting aspect is the amount of GDDR5 memory. It seems that NVIDIA is sticking with 2GB frame buffers (or less) per GPU while AMD has really started upping the RAM. It will be interesting to see how this affects gaming in NVIDIA Surround and/or at high resolutions.
What do you guys think about these numbers? Do you think Kepler will live up to the alleged performance scale figures?
Already got my ATI.
Which card did you get? j/c
I have the ATI Radeon 9700. Like, the OLD one.
Oh man, I bet that thing plays Battlefield 3 in Eyefinity resolutions like butter 😀
The GK110 looks totally fake; no way they can produce a 550mm² die without huge problems.
And for $650? Give me a break!
The 690 seems like a rip-off on paper. $999 for a card that probably can’t drive 3 monitors by itself; they must really want you to buy 2 cards, I guess… I am very disappointed to see any card with less than 2GB of frame buffer coming out in 2012, regardless of how great the architecture is. 3 displays for gaming needs to become the norm already, in my opinion.
I thought the dual GPU cards were the only Nvidia cards that could do Nvidia Surround with just a single physical card (except those special MDT cards of course)?
My 590 supports three monitors just fine…not sure what that comment was about…love my 590, but I am salivating over the thought of two EVGA GeForce GTX 680 Classified Ultra Hydro Coppers…mmmmmmmmmmmmmmmmmmmmmmmm. Everything is better with EVGA! 🙂
Yeah, ever since XFX stopped doing double lifetime warranty, I’m in the market for a new GPU vendor whenever I eventually upgrade. They are going the way of BFG most likely 🙁
I would think NVIDIA would be crazy to release even a single GPU card that couldn’t do at least three monitors…
Agree.
I would be really shocked if the 690 doesn’t do 3 displays; I am sure it will be part of the feature set for this card. NV isn’t stupid enough to expect people to spend 2000 dollars to run three monitors. Otherwise, I am not at all surprised that Kepler GPUs rather handily destroy AMD’s current lineup. The Maxwell generation of GPUs will widen the gap. I will happily step up my 3GB 580 Classy to that Kepler 680 2GB’er. FWIW, this chart was out a few days ago; kudos to PCPer for vetting it a bit before they re-ran it.
Just agreeing with the two other responses, the GTX 590 DOES do 3 monitors as a standard feature, so it would be very fair to assume the “GTX 690” would also.
I would rather see 3GB of VRAM per card because Battlefield 3 in 3D Vision uses too much memory. My GTX 580s are maxed out at 1.5GB and I only get 40fps. Nvidia and DICE need to fix this.
That is where EVGA will come in…they’ve got your back! 🙂
You won’t get any more than around 40 fps due to the rendering setup in BF3. The 3D Vision driver in its current state can’t handle all the effects in modern games. 3D Vision tries to limit the amount of work that needs to be done for 3D, and in the process ends up taking a few shortcuts. This screws up the rendering of a lot of effects in many games while saving power.
In order for BF3 to work in 3D, the whole 3D Vision driver has to be bypassed. And as it stands, unless you settle for Crysis 2’s 3D (mega crap IMHO), that means rendering whole frames for both eyes. In effect, this means exactly half the fps in 3D compared to 2D. Get 80 fps in 2D using two 580s? Then you get 40 in 3D.
I run 2 580s in (3D Vision) Surround and this is exactly what happens when I switch to 3D.
I’ll probably get two GTX670s. 1.75GB should be enough VRAM for 2560×1440, with fewer heat issues than the GTX690.
It’s going to be a tad faster than my Diamond Monster Voodoo II 12MB SLI combo. 🙂
True, you could prolly overclock the two individual cards further than you could the dual-GPU card as well 🙂
Will buy a 660TI in a custom edition w/ 2 or more gigs of RAM at the $300 mark. Else my $140 6870 is enough to keep me happy for another year.
For $140 I’ll just put a second 6870 in my system. One of those was a major jump from a 9800GT.
Wow, that’s quite a price difference between the GTX 650 and 650Ti; it had better be worth it. Still out of my price range, though. Guess I’ll be getting a 650 when they start to hit $150, unless the leftover 560s get there first.
Then again, I would rather SLI two 650s someday…
Why can’t they just make all of this easier?
Nvidia has been working on Kepler for far too long. I remember attending a CUDA seminar a few months before the 500 series of video cards launched, and the guy giving the seminar showed us slides of Kepler by mistake.
This seems more factual than all the BS floating around earlier, like the claim that the mid-range card would beat the 7970 and cost $300.
I agree 100%. There really is no point in having the top AMD or Nvidia cards without 2560-wide displays or multi-monitor setups, so it will be interesting to see what difference the extra 1GB makes.
AFAIK, Nvidia will have their version of Eyefinity on single-card setups with Kepler.
The final point of interest will be overclock comparisons and results. Good times!
heh, yeah I just don’t see that happening 😛
Yep, hopefully they release them soon so that Ryan can get to work testing them at high resolutions! 😛
Really? That would be a positive move for them as AMD really has them beat on that front. No one wants to buy 2 Nvidia cards to do 3 displays 😛
What’s with the stingy frame buffer sizes? Didn’t Nvidia learn anything from the insufficient VRAM of the 400/500 series?
So the GTX660Ti would come out 6 months later, cost the same, have less memory, and offer only 10% better performance than the 7950? I’m /impressed/! I also would have thought the GTX650 would cost less than that. I can already buy a card with that kind of performance for that kind of price.
Indeed. I’m hoping once they are released they will knock the prices way, way down!
Hope they hit the market earlier than expected. My GPU budget is burning a hole in my pocket. Seriously considering going ATI. The waiting game sucks; let’s play Hungry Hungry Hippos!
I see a big price war on the horizon…
$1000+ for a GTX690 with under 4GB of RAM? They must be on crack… AMD’s 7990 (their “flagship” card) costs around $150 less and gives you more RAM: 6GB total, and almost double per GPU (1.75GB vs 3GB).
Plus the fact that you can mine bitcoins and make your money back… yeah, I’m picking the HD 7990, just for the fact that I can make my money back and then some. Plus, the mobo it will be going into isn’t PCIe 3.0, so no loss of PCIe there.
The only thing Nvidia has going for it is PCIe 3.0, and if I’m not mistaken, only Intel boards have PCIe 3.0 on them right now; I have yet to see any AMD-based boards using it. I’m just not a fan of Intel’s upgrade path with mobos, along with the fact that they cost a LOT more.