The ASUS GTX 950 Strix
For our review of the GeForce GTX 950 we used the ASUS GTX 950 Strix that NVIDIA initially sent us. We also have units from Zotac and EVGA that will warrant a quick round-up, so stay tuned for that.
The ASUS Strix derivative of the GTX 950 will retail at an MSRP of $169, adding higher clocks and a fantastic custom cooling solution to the mix for that $10 premium over the reference MSRP. These are noticeable overclocks, too: the base clock is 1165 MHz while the Boost clock reaches 1355 MHz!
The DirectCU II cooler on the GTX 950 uses a pair of fans that, in typical Strix fashion, don’t spin until the GPU reaches a certain temperature, keeping the entire card silent at idle, on the Windows desktop, and even in some of your more basic gaming scenarios. Two 8mm heatpipes draw heat from the GPU to the outer fins of the cooler. ASUS claims that the GPU runs a full 7C cooler than the reference design with this upgraded heatsink, and our experience with the design has been quite good as well.
ASUS has integrated a 5-phase Super Alloy Power system for stability and reliability when overclocking the GPU further (which is definitely possible on this chip).
Our view of the backside of the card shows us a shorter PCB design than the front cooler would indicate. A single 6-pin power connector rests at the top of the card for the connection to your power supply.
Oh, and what’s this? The GeForce GTX 950 supports 2-Way SLI! While the GTX 650 and the GTX 750 Ti did not include support for NVIDIA’s multi-GPU technology, the company decided to include it here with the GTX 950. (Keep in mind that the GTX 650 Ti DID in fact support SLI, though.)
The display output selection from ASUS on the GTX 950 strays from the typical NVIDIA pattern seen since the introduction of the GTX 980 and GTX 970. Instead it includes two DL-DVI ports, one DisplayPort output, and a full-size HDMI port. This should give users of the GTX 950 plenty of options for connecting an array of displays.
Wonder how my 2012-era GTX 660 Ti compares in performance to these new cards…
Yo! eh Yo yo! Know what i’m sayin! I got one for free from a Nvidia BOSS I party with yo yo! Yeah! So I did not have to pay 4 shiz! And you do! But check it yo! I does it big with my 980 Ti I gots 4 free at the same party yo! Yeah! YO! YO YO YO YO YO YO YO YO!
The generosity must have been prompted by your contribution to the English language.
Small correction.
“The GTX 950 drops the GM206 from 1024 CUDA cores to 768, a decrease of 33%”
768 is 3/4 of 1024. So going from 4 to 3 is a decrease of 25%. From 3 to 4 (768 to 1024) would be an increase of 33%.
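Spelled out, the percentage depends on which figure you treat as the base:

(1024 - 768) / 1024 = 256 / 1024 = 25% (a decrease, measured against 1024)
(1024 - 768) / 768 = 256 / 768 ≈ 33% (an increase, measured against 768)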
Why was the 4GB version of the 370 tested as opposed to the 2GB version? The price point you are showing for the 4GB version is probably where the 2GB 370X will be sitting, and it will probably perform at a level above the 950 to make it worthy of that price point. Just saying.
http://cdn.wccftech.com/wp-content/uploads/2015/08/Discrete-GPU-Market.png
Another nail in the coffin for AMD it seems. They’re finished.
Ah, no. Not once DX12 picks up some steam. AMD CPUs and GPUs are dramatically boosted by that API. People are in for a rude awakening once they see how much faster Radeons are in DX12. They’re probably gonna be pretty annoyed that they forked out for nVidia cards just in time to see them performing equally or even slower than AMD cards that were supposedly inferior.
Where are the DX12 benchmarks? Anyone buying a new graphics card at this point deserves to know how much faster Radeon cards are in DX12 compared to nVidia. Buying on the basis of DX11 alone is extremely misleading, and will lead to tremendous resentment.
I agree, DX11 is on its way out. Anyone who buys a card now will probably keep it for the next 1-3 years. DX12 benchmarks are very relevant with this in mind.
There are only synthetic benchmarks available, though. That’s not going to give you any sort of idea of what DX12 will translate to in FPS for any given game.
Planning for future growth isn’t a bad idea, but I’d imagine anyone buying one of these plans to use it today, and showcasing what it can do with today’s software makes a heck of a lot more sense. DX12 seems to show minor improvements all around. At most you can expect games to perform marginally better down the road.
“A slightly cut down GM206 allows NVIDIA to take better advantage of its supply pipeline”
My translation: allows Nvidia to utilize broken 960 dies.
AMD has me really confused at the moment. We had:
270X full die / 270 cut-down (broken) die
280X full die / 280 cut-down (broken) die
Where are the 370X and 380X? Are we only getting broken/cut-down dies?
Jeez… for $20 more I’ll get the 960.