Manufacturer announcements of 4GB versions of the GeForce GTX 960 have become a regular occurrence of late, and today MSI has announced their own 4GB GTX 960, adding a model with this higher memory capacity to their popular MSI Gaming graphics card lineup.
The GTX 960 Gaming 4GB features an overclocked core in addition to the doubled frame buffer, with a 1241MHz base and 1304MHz boost clock (compared to the stock GTX 960's 1127MHz base and 1178MHz boost). The card also features MSI's proprietary Twin Frozr V (5 for non-Romans) cooler, which they claim surpasses previous generations of their Twin Frozr coolers "by a large margin", with a new design featuring their SuperSU heat pipes and a pair of 100mm Torx fans with alternating standard/dispersion fan blades.
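For context, those factory clocks work out to roughly a 10% uplift over reference; a quick sketch of the arithmetic, using only the figures quoted above:

```python
# Factory overclock of the MSI GTX 960 Gaming 4GB vs. the reference GTX 960,
# using the clock figures (MHz) quoted in the article above.
reference = {"base": 1127, "boost": 1178}
msi_gaming = {"base": 1241, "boost": 1304}

for clock in ("base", "boost"):
    uplift = (msi_gaming[clock] - reference[clock]) / reference[clock] * 100
    print(f"{clock}: {reference[clock]} -> {msi_gaming[clock]} MHz (+{uplift:.1f}%)")
```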
The card is set to be shown at the Intel Extreme Masters gaming event in Poland later this week; pricing and availability have not yet been announced.
The good news here is this will put pressure on AMD to also lift the average memory per card. The bad news is it's going to be a while before there are enough 4GB cards for developers to target. And worse yet, Nvidia and its partners can use this opportunity to raise the average price of midrange cards without adding much benefit for consumers.
Who would seriously consider this after the whole 970 fiasco? There’s no telling how crippled this card could be.
Are you on crack? A slight deception about the nature of the RAM is not a fiasco. It is a case of miscommunication at best and false advertising at worst. A fiasco is when a faulty airbag causes more damage than the crash, or when rat poison accidentally gets into baby food, or when budget paint sickens a whole classroom.
Get some perspective, dude.
Now, now. Let's not argue semantics. Besides, your examples are merely based on what you think qualifies as a fiasco (aka opinion). And even though you could say the same about me, that wasn't the point I was trying to make or the discussion I was looking for.
Want to start over?
The fact is the 960 is based on a different GPU, the GM206, which does not have any disabled units and is basically equal to half of the full GM204 that powers the 980.
The disabled units on the 970 are the cause of the memory partitioning approach Nvidia decided on. There is simply no evidence that the 960 will be unable to access all of its memory at a uniform rate.
Fair enough, well said. I suppose the thing that's always missing in digital communication is the human touch: the tone of voice, the facial and body signals.
Let's start over. 🙂
The 970 was a fiasco: falsely advertised features (ROPs, cache, and VRAM layout) that directly affect performance, albeit in limited real-world scenarios.
That said, a 4GB 960 doesn't have these issues. You'll get full 960 performance without the VRAM limitation, which makes 960 SLI much more interesting.
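To put numbers on why the 970's layout only bites in limited scenarios: Nvidia's disclosure described a fast 3.5GB segment and a slow 0.5GB segment. Here's a back-of-the-envelope sketch, using the widely reported approximate segment bandwidths (treat them as illustrative, not measured):

```python
# Toy model of the GTX 970's segmented memory: a fast 3.5 GB segment and a
# slow 0.5 GB segment. Bandwidth figures are the widely reported approximate
# numbers, used here for illustration only.
FAST_GB, FAST_BW = 3.5, 196.0   # segment size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0

def avg_bandwidth(used_gb: float) -> float:
    """Naive average bandwidth if used_gb of VRAM is touched uniformly."""
    fast = min(used_gb, FAST_GB)
    slow = max(0.0, used_gb - FAST_GB)
    return (fast * FAST_BW + slow * SLOW_BW) / used_gb

for gb in (3.0, 3.5, 3.75, 4.0):
    print(f"{gb:.2f} GB in use -> ~{avg_bandwidth(gb):.0f} GB/s average")
```

Below 3.5GB of usage nothing changes, which is why most games at the time showed no difference.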
The 970 was false/misleading advertising; wait until the lawsuits begin to be adjudicated. The Apple "USB 3.1 Gen 1" fiasco is about to begin, and there is no reason to trust Apple, Nvidia, or any other maker of laptop/PC/mobile devices. The entire industry needs to be investigated by the FTC and the Justice Department. People who blindly trust any corporation's or company's products are only fooling themselves. Apple's marketing little white lies will get them in just as much hot water! There is no "USB 3.1 Gen 1" connector/plug; there is only the USB Type-C plug form factor/electrical standard, and the USB Type-C Gen 1(*) connector is the one connected to a USB 3.0 controller chip. The USB Type-C Gen 2(*) connector is the one connected to a USB 3.1 controller. There is no "USB 3.1 Gen 1" in the USB standards body's specification. Marketing folks can never be trusted; they practice to deceive and obfuscate, and only fools blindly trust any company. Let the buyer beware is always the mantra!
The proper labeling of the USB Type-C plug specification should be USB Type-C Gen 1 or USB Type-C Gen 2, with Gen 1 denoting use with a USB 3.0 controller (5Gb/s) and Gen 2 denoting use with a USB 3.1 controller (10Gb/s)!
* The USB Type-C plug standard is nearly identical across Gen 1/Gen 2; the only real difference is the type of USB controller (3.0, 3.1) used, plus some extra pins/etc. to accommodate the newer USB 3.1 controller. The entire standard Type-C plug form factor/electrical specification and its controllers are backwards compatible with the plugs' electrical/pinout, etc. of the earlier USB specifications. Adaptors will be needed for legacy USB plug form factors, but that is always the case when the shape/size standard of the plug is different (micro USB, etc.).
The adaptor business is a big money grab from Apple with the new 12-inch MacThingy. I guess Apple really wants to make that trillion dollar valuation, and there sure are enough fools waiting to be separated from their money!
That big yellow prophylactic that you are wearing is not allowing your single cell of gray matter to receive the proper ventilation to get sufficient oxygen and dispel the carbon dioxide. The entire tech industry is full of such lies, damn lies, statistics, and marketing. And oh, do the fanboi slackjaws fall for the marketing monkeys' drivel and develop an unnatural love and affection for the many brands of devices these tech companies produce. Your continuous attempts at apologizing for your brand, your falsely conceived ideas of brand infallibility for which you have misguided loyalties, and your high propensity towards ad hoc fallacies mixed with copious amounts of straw man logic have forever self-labeled you as the dictionary definition of such a fanboi.
Fiasco? Okay.
Nvidia made a mistake. The intention behind that mistake, no one really knows but Nvidia. So you can hop on the "trash Nvidia" bandwagon, or take the info in this article with a grain of salt.
In all the years I was an AMD/ATI guy before switching to a much better platform from Nvidia, this is the only severe "uh-oh" they have really made. Things happen, and they are taking accountability without fighting back.
If you want to abandon them, even if you like their stuff, and head elsewhere, then that's on you. If you're an AMD/ATI fanboy and you're here to troll their (Nvidia) products, the only people you'll fool or get riled up here will be children.
After seeing how an increase in VRAM on a 580 from 1.5GB to 3GB has no effect at playable frame rates, I see little point in cards like this.
I’d love to see PCper grab a few different cards and run some tests at different resolutions and settings to see what you guys find. I have a hunch that the extra one hundred dollars these cards cost won’t ever be worth it.
Actually, it does make a difference when you are playing games at much higher resolutions with much higher details enabled.
But the issue here is that a single 960 4GB doesn't really make sense, because the card just isn't powerful enough to run games at resolutions and detail levels that would make use of the 4GB. These 4GB versions will really make use of the extra VRAM when put in SLI for higher resolutions with higher details. It's kind of like someone who can't afford the 980 buying half of it now (aka the 960 4GB version) and then a matching one later on.
4GB of VRAM may only prove useful when running games at higher resolutions/detail levels that do NOT saturate the 960's 128-bit memory bus. This is especially true for 960 SLI.
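For reference, peak theoretical memory bandwidth falls straight out of the bus width and the effective memory data rate; a minimal sketch, assuming the reference 7 Gbps effective GDDR5 spec:

```python
# Peak theoretical memory bandwidth = bus width (in bytes) * effective data rate.
# The 7.0 Gbps effective GDDR5 rate is the reference GTX 960 spec.
def peak_bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(128, 7.0))  # GTX 960 (128-bit): 112 GB/s
print(peak_bandwidth_gbs(256, 7.0))  # 970/980-class (256-bit): 224 GB/s
```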
In this case, why wouldn't you just go for a 970 or 980, since you've blown WAY past budget pricing? I can't imagine there are legions of people with more expensive higher resolution monitors trying to save money by gaming on a budget card.
The difference isn't as great as you'd think, as the GPU ends up getting taxed to the point that even with a doubling of the VRAM you're still at unplayable FPS.
I'd say that might be an issue, but with the "compression" algorithms that both AMD and Nvidia use, they are able to do a whole lot more than you'd think with a lot less bandwidth.
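For anyone curious what that compression buys, here's a toy sketch of the basic delta-encoding idea; the vendors' actual hardware schemes (e.g. Maxwell's delta color compression) are far more sophisticated, so treat this as the principle only:

```python
# Toy illustration of delta encoding: store the first value of a block, then
# per-value differences. Smooth gradients (common in rendered frames)
# compress well because the deltas fit in very few bits. This shows only the
# principle; real GPU color compression is far more elaborate.
def delta_encode(block):
    base = block[0]
    deltas = [b - a for a, b in zip(block, block[1:])]
    return base, deltas

gradient = list(range(100, 116))   # a smooth 16-pixel run of one channel
base, deltas = delta_encode(gradient)

raw_bits = len(gradient) * 8                    # 8 bits per raw value
largest = max(d.bit_length() for d in deltas)   # bits needed per delta
delta_bits = 8 + len(deltas) * largest          # base value + packed deltas
print(f"raw: {raw_bits} bits, delta-encoded: {delta_bits} bits")
```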
No one tested 4K performance when the 580 was released and games didn’t use nearly as much VRAM as they do today. The only game that showed improvements with 3GB back then was Metro 2033 – a game that gobbles up VRAM. Going from 2GB to 4GB today is a bigger deal than going from 1.5GB to 3GB four years ago.
It would be neat to see some updated benchmarks with the latest games and drivers and features like DSR, 4K, etc. to see how well (or badly) Fermi, Kepler, and Maxwell compare with different memory sizes. (I don't think Fermi can output 4K natively except 24/30Hz over HDMI, but it can internally render using DSR.)
No, they didn't. However, they were tested at 2K resolutions, which would scale comparably to modern cards at modern resolutions.
Another game that showed a big gain was Total War 2. Even then the game was unplayable because the GPUs were tapped out. Oddly enough, on more modern settings the card with more memory showed slightly worse performance.
I don't think testing the older cards at 4K resolutions is going to show much besides how far we've come 😀
Even that would be worth it.
I think you're forgetting an important difference: texture size. Most games that came out when the 580 was released were still being optimized to fit on DVD, and many were ports of console games, so textures were very small by today's standards (256^2 and 512^2 were most common; most games maxed out at 1024^2). With digital distribution, unified memory, and the use of Blu-ray storage for the PS4/Xbone, developers aren't being held back nearly as much. They can provide lossless texture packs, higher detail models, etc. Modders continue to push the limits of memory usage as well. Crysis 3 utilizes 4096^2 textures and can support 8192^2. The move from 512^2 to 4096^2 requires 64X the memory, before taking compression schemes into consideration.
While I understand that more VRAM = higher cost and more VRAM doesn't mean better performance in many scenarios, it certainly doesn't hurt to offer the option to customers who want it. Also, asking for more VRAM will only help drive down costs as production increases. There's no downside in the long run.
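The 64X figure checks out, since uncompressed texture memory scales with the square of the resolution. A quick sketch, assuming uncompressed RGBA8 (4 bytes per texel) and ignoring mipmaps:

```python
# Memory footprint of square RGBA8 textures (4 bytes per texel),
# uncompressed and ignoring mipmap chains.
def texture_mib(size: int, bytes_per_texel: int = 4) -> float:
    return size * size * bytes_per_texel / 2**20

for size in (512, 1024, 2048, 4096, 8192):
    print(f"{size}^2 -> {texture_mib(size):.0f} MiB")
# 4096^2 is (4096/512)^2 = 64X the memory of a 512^2 texture, as noted above.
```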
If the card is fifty to sixty dollars more, doubling the VRAM is going to need to show about a 25% performance improvement across the board for performance-per-dollar to scale.
I also question how well this is going to work out on a 128-bit bus. Here's hoping PCper gets some in for testing.
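That 25% figure follows directly from a hypothetical $200 card carrying a $50 premium; a minimal sketch (the prices are illustrative, going off the comment's fifty-to-sixty-dollar estimate):

```python
# FPS-per-dollar break-even: how much faster the pricier card must be for
# performance per dollar to stay level. Prices here are illustrative only.
def required_uplift_pct(base_price: float, premium: float) -> float:
    return premium / base_price * 100

print(f"+{required_uplift_pct(200, 50):.0f}% needed at a $50 premium")  # +25%
print(f"+{required_uplift_pct(200, 60):.0f}% needed at a $60 premium")  # +30%
```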
Just wish Nvidia would hurry up and offer a price cut on the 970 so we can all move on. By delaying like this, all they are doing is adding more fuel to the fire. If they haven't done it by the time the new Radeon cards are released, I'll be jumping ship.
The way I think about it is: imagine a water tank (as 2GB VRAM),
the max GPU clock speed as your tap (faucet) fully open,
the bandwidth as the number of taps you have connected,
and the water throughput as graphics processing.
Imagine the mains water filling your tank (data to the card).
Even with your first tap fully open (max clock speed) you'll never be able to drain the tank; it's filling up faster than one tap (128-bit) can drain it. So there's no point doubling the size of the tank (4GB VRAM). You just get a build-up of water sitting in your massive tank.
The only way to get more water through (process more graphics) is to add a second tap (256-bit) or to make your tap open wider (overclock).
i.e. putting 4GB of VRAM on a 960 is like doubling the size of your water tank with just one single tap attached. Pointless?
Feel free to add to / modify / correct my analogy.
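Taking up that invitation: a tiny step-wise simulation of the analogy, with made-up numbers, showing that enlarging the tank does nothing while adding a tap doubles throughput:

```python
# Toy simulation of the water-tank analogy above: the tank is VRAM, the taps
# are memory bus throughput, the mains is data arriving at the card.
# All numbers are made up purely for illustration.
def simulate(tank_size: float, tap_rate: float, inflow: float, steps: int = 10) -> float:
    """Average water drained per step (i.e. work processed)."""
    level, drained = 0.0, 0.0
    for _ in range(steps):
        level = min(tank_size, level + inflow)   # mains fills the tank
        out = min(tap_rate, level)               # taps drain what they can
        level -= out
        drained += out
    return drained / steps

print(simulate(tank_size=2, tap_rate=1, inflow=3))  # small tank, one tap   -> 1.0
print(simulate(tank_size=4, tap_rate=1, inflow=3))  # bigger tank, one tap  -> 1.0
print(simulate(tank_size=4, tap_rate=2, inflow=3))  # bigger tank, two taps -> 2.0
```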
I know this is a '4GB GTX 960' thread; however, since many here seem more intent on slamming Nvidia for the '970 affair', I feel a need to chime in. I am a consumer. I own and use a GTX 970. It was a great graphics card when I bought it last November. It is a great card today. Yes, I am pissed that Nvidia mis-stated the card's specs at launch, but that does not change the fact that for the money there still is not a better performing card. The class action litigation will go forward, and many lawyers will make millions. We, the consumers, might get a 10 buck settlement or a 20 buck credit to be used on a future purchase, if that. The reality is that we got what we paid for, and nothing less. If there are any damages, the payment of such should be joined by all of the idiots who 'tested' the card and failed to find the so-called flaw. If you own a 970, then be happy, because you have a great card!
I want to buy a GPU.
Which one is better, the GTX 960 4GB or the GTX 970?