During a keynote presentation at GeForce LAN 2012, held in Shanghai, NVIDIA’s CEO Jen-Hsun Huang unveiled what many of us have been theorizing would be coming soon: the dual-GPU variant of the Kepler architecture, the GeForce GTX 690 graphics card.
Though reviews won’t be published just yet, Huang revealed pretty much all of the information we need to size it up. With the full specifications listed, as well as details about the stunning new design of the card and cooler, the GTX 690 is without a doubt going to be the fastest graphics card on the market when it goes on sale next month.
The GeForce GTX 690 4GB card is based on a pair of GK104 chips, each sporting 1536 CUDA cores, basically identical to the ones used in the GeForce GTX 680 2GB cards released in March. The base clock speed of these parts is slightly lower at 915 MHz but the "typical" Boost clock is set as high as 1019 MHz, pushing it pretty close to the performance of the single GPU solutions. With a total of 3072 processing cores, the GTX 690 will have insane amounts of compute horsepower.
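Those core counts and clocks translate into a quick back-of-the-envelope peak compute figure. A minimal sketch, assuming the standard 2 FLOPs per core per clock (one fused multiply-add) that peak numbers are usually quoted with:

```python
# Rough FP32 peak for the GTX 690's two GK104 GPUs, using the clocks
# NVIDIA quoted. Assumes 2 FLOPs per core per clock (one FMA), the
# conventional basis for peak throughput figures.

CORES_PER_GPU = 1536
GPUS = 2
BASE_CLOCK_GHZ = 0.915
BOOST_CLOCK_GHZ = 1.019

def peak_gflops(cores, clock_ghz, flops_per_clock=2):
    """Peak single-precision GFLOPS: cores x clock (GHz) x FLOPs/clock."""
    return cores * clock_ghz * flops_per_clock

base = peak_gflops(CORES_PER_GPU * GPUS, BASE_CLOCK_GHZ)
boost = peak_gflops(CORES_PER_GPU * GPUS, BOOST_CLOCK_GHZ)
print(f"Base:  {base:.0f} GFLOPS")   # ~5622 GFLOPS
print(f"Boost: {boost:.0f} GFLOPS")  # ~6261 GFLOPS
```

That works out to somewhere north of 5.6 TFLOPS across both chips, which is why "insane amounts of compute horsepower" isn’t much of an exaggeration.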
Each GPU will have access to 2GB of independent frame buffer still running at 6 Gbps, for a grand total of 4GB on the card.
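From the 6 Gbps effective data rate you can estimate each GPU’s memory bandwidth. A quick sketch, assuming each GK104 keeps the GTX 680’s 256-bit memory interface (not stated in the announcement, but likely since the chips are identical):

```python
# Per-GPU memory bandwidth from the quoted 6 Gbps effective data rate.
# BUS_WIDTH_BITS is an assumption carried over from the GTX 680's
# 256-bit interface; it is not confirmed in the announcement.

DATA_RATE_GBPS = 6      # effective Gbps per memory pin
BUS_WIDTH_BITS = 256    # assumed, matching GTX 680

bandwidth_gbs = DATA_RATE_GBPS * BUS_WIDTH_BITS / 8  # bits -> bytes
print(f"{bandwidth_gbs:.0f} GB/s per GPU")  # 192 GB/s
```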
Sitting between the two GPUs will be a PCI Express 3.0 capable bridge chip from PLX supporting full x16 lanes to each GPU and a full x16 back to the host system.
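The jump to a PCIe 3.0 bridge matters because of how much usable bandwidth each x16 link carries. A minimal sketch of the math, using PCIe 3.0’s 8 GT/s per-lane rate and 128b/130b line coding:

```python
# Usable bandwidth of one PCI Express 3.0 x16 link, per direction.
# PCIe 3.0 runs each lane at 8 GT/s with 128b/130b encoding overhead.

GT_PER_S = 8           # transfers per second per lane, in billions
LANES = 16
ENCODING = 128 / 130   # 128b/130b line-coding efficiency

usable_gbps = GT_PER_S * LANES * ENCODING  # usable Gb/s per direction
gb_per_s = usable_gbps / 8                 # convert to GB/s
print(f"{gb_per_s:.2f} GB/s per direction")  # ~15.75 GB/s
```

So each GPU gets roughly 15.75 GB/s to the bridge in each direction, double what a PCIe 2.0 bridge would have offered.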
In terms of power requirements, the GTX 690 will use a pair of 8-pin connectors and will have a TDP of 300 watts – actually not that high considering the TDP of the GTX 680 is 195 watts on its own. It is obvious that NVIDIA is going to be pulling the very best chips for this card, those that can run at clock speeds over 1 GHz with minimal leakage.
Continue reading for more details and photos of the NVIDIA GeForce GTX 690 4GB Graphics Card!!
A card with this much processing power is built for multi-display and 3D gaming so it will include a set of three dual-link DVI outputs and a single mini-DP output. Needless to say, this should be able to appease just about any gamer looking to connect up multiple displays.
The new cooler design on the GTX 690 is pretty impressive as well – NVIDIA is calling it the Ferrari of graphics card coolers. It is definitely unique, and I can tell you the card is just as impressive as it looks in the images we were provided with. The spec sheet reads like a futuristic space shuttle parts list:
- An exterior frame made from trivalent chromium-plated aluminum, providing excellent strength and durability
- A fan housing made from a thixomolded magnesium alloy, which offers excellent heat dissipation and vibration dampening
- High-efficiency power delivery with less resistance, lower power and less heat generated using a 10-phase, heavy-duty power supply with a 10-layer, two-ounce copper printed circuit board
- Efficient cooling using dual vapor chambers, a nickel-plated finstack and center-mounted axial fan with optimized fin pitch and air entry angles
- Low-profile components and ducted baseplate channels for unobstructed airflow, minimizing turbulence and improving acoustic quality
What can you expect in terms of performance? NVIDIA is still staying quiet on that until reviews are released sometime down the line, but based on the specifications that NVIDIA provided, it should be pretty close to the numbers you might see from a pair of standard GTX 680 cards running in SLI. There are only slight clock speed differences and the shader counts remain the same – this is NOT a pair of cut-down GK104 chips at all.
Here are a couple of quick slides of the GTX 680 SLI results we are working on here:
Surprisingly, NVIDIA does mention both price and availability of the GTX 690 – it will be available in "limited quantities" on May 3rd with "wider availability" on May 7th for a cost of $999. With the GTX 680 selling out left and right at the $499 price tag, selling two of them for anything less would probably be a dumb decision on NVIDIA’s part, I hate to say… But I have questions about the "wider availability" statement considering you STILL cannot find the GTX 680 in stock consistently anywhere.
Two NVIDIA GTX 690 cards in Quad-SLI
I am very eager to get my hands on this new GPU giant, as it looks like it will easily be the fastest graphics card to ever grace our testing labs. Will AMD be able to do any damage now with the Radeon HD 7990 card that was promised to us last year? With the HD 7970 losing in single-GPU results to the GTX 680, and if the performance of the GTX 690 is where we are guessing it to be, the HD 7990 will have to be something way out of left field to best it – or AMD will simply have to cut its price down a bit.
any announcement as to when we can buy this
Early May 2012
“Surprisingly, NVIDIA does mention both price and availability of the GTX 690 – it will be available in “limited quantities” on March 3rd with “wider availability” on March 7th for a cost of $999. With the GTX 680 selling out left and right at the $499 price tag, selling two of them for anything less would probably be a dumb decision on NVIDIA’s part, I hate to say… But I have questions about the “wider availability” statement considering you STILL cannot find the GTX 680 in stock consistently anywhere.”
a typo in 2nd last paragraph how can video card announced today be available on 3rd March. LOL
fixed, thanks for the catch.
Available March 3rd and 7th? Don’t you mean May? Or do we have to wait until 2013…
I’ll just buy another 680 when the prices come down later on. Its easier to sell two 680s than a 690 when MY next upgrade cycle begins.
If a new game even justifies buying another 680 at all
Even a 680 can’t run every game at max candy at 2560×1600. Much less 5120×1600 or even 7380×1600.
Now let’s see what AMD comes up with to compete with this beast.
If you buy one of these for $1,000 (and I plan to — actually, I’m going to buy two of them so I can run quad SLI), you do not want to screw around with overclocking for the maybe 10% or so gain, when you’re risking warranty coverage.
Even though the card has 4gb does it mean that it is still like a 2gb card since its split between 2 gpus?
Yes, I believe that is the case – 2GB per GPU, since SLI rendering duplicates the frame buffer on each chip.
fucken yaaaaaaaaaaaargh
-from your friendly neighbourhood oscar