So apparently NVIDIA and a single AIB partner, Gigabyte, are facing a class action lawsuit because of the GeForce GTX 970 4GB controversy. I am not sure why they singled out Gigabyte, but I guess that is the way things go in the legal world. Unlucky for them, and seemingly lucky for the rest.
For those who are unaware, the controversy stems from NVIDIA claiming that the GeForce GTX 970 has 4GB of RAM, 64 ROPs, and 2048 KB of L2 cache. In actuality, it has 56 ROPs and 1792 KB of L2 cache. The main talking point is that the RAM is segmented into two partitions, one of 3.5GB and another of 0.5GB. All 4GB are present on the card, though, and accessible (unlike the disabled L2 cache and ROPs). Then again, I cannot see an instance in that class action lawsuit's exhibits that claims an incorrect number of ROPs or amount of L2 cache.
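As a back-of-the-envelope illustration of why the segmentation matters, peak GDDR5 bandwidth is just bus width times per-pin data rate. The sketch below uses the GTX 970's published 7 Gbps effective data rate; the per-segment bus widths (224-bit for the 3.5GB partition, 32-bit for the 0.5GB remainder) reflect NVIDIA's later clarification of the design.

```python
# Peak GDDR5 bandwidth: bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

RATE = 7.0  # GTX 970's effective GDDR5 data rate, 7 Gbps per pin

print(peak_bandwidth_gb_s(256, RATE))  # advertised full 256-bit bus: 224.0 GB/s
print(peak_bandwidth_gb_s(224, RATE))  # 3.5GB partition (7 of 8 controllers): 196.0 GB/s
print(peak_bandwidth_gb_s(32, RATE))   # 0.5GB partition (1 controller): 28.0 GB/s
```

In other words, workloads that stay inside the 3.5GB partition see most of the advertised bandwidth, while anything spilling into the last 0.5GB is served at a small fraction of it.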
Again, the benchmarks that you saw when the GeForce GTX 970 launched are still valid. Since the issue came up, Ryan has also tried various configurations of games in single- and multi-GPU systems to find conditions that would make the issue appear.
The 970 is still a great card
The 970 is still a great card day in and day out for me. As a current G1 970 owner, the only thing I am mad about is that it was labeled as a 64 ROP card with the same L2 cache as the 980.
Other than that, I currently play at 1080p and games run great at that resolution. Even garbage games like ACU at patch 1.5 play amazingly, even when VRAM usage crosses into the "RamGate" range.
I just feel Nvidia should give out game codes for games like The Witcher 3, Batman: Arkham Knight, or other future TWIMTBP games for this mix-up.
In the end, all people will
In the end, all people will see is maybe $20 at most, because class action lawsuits are nothing but a scam perpetrated by lawyers.
But it’s still a huge loss
But it’s still a huge cash loss for Nvidia.
Huge? That assumes that
Huge? That assumes that NVIDIA sold a lot of these. Most of NVIDIA’s cash flow comes from integrated, low-end, and mainstream parts. The margin drops as you go higher. At most this is a slap on the wrist, but likely will just be appealed into obscurity.
What integrated gpu part does
What integrated GPU part does Nvidia sell?
I think (s)he is talking
I think (s)he is talking about ION, which is integrated on the mobo.
That’s OK, take a hurt in the
That’s OK, take a hurt in the wallet, Nvidia, and maybe even use the money from the lawsuit to keep a permanent eye on Nvidia! Hell, force Nvidia to buy back all the affected cards and give customers a full refund. Even better, force Nvidia to take some of their higher-binned stock, make an SKU that matches the original advertised specifications, and swap the gimped parts out. Fix it to fit your marketing claims. No gimped memory channels and such, GimpVidia. The government should fine Nvidia the exact cost of the Volta GPU accelerators in the upcoming government supercomputer contract, with Nvidia earning not one red cent, in addition to paying the lawsuit claims!!! That will make the liars think twice about any false advertising.
When laying out what happened
When laying out what happened, stop leaving out the fact that the card has a 224-bit bus width instead of 256-bit. This obfuscation song and dance is getting old, guys.
The “miscommunication” between engineering and marketing excuse was beyond pathetic IMO.
Do engineers not look at these websites? Do they not check the specs and material (GPU architecture pamphlets) that they send to reviewers?
About NVIDIA facing a CAL, I
About NVIDIA facing a class action lawsuit, I have no comment… I’m torn between "they deserve it" and "they don’t." Pretty much don’t care. But now Gigabyte… why, may I ask? What the hell do they have to do with the memory issue? Why not EVGA, for example, since they are the closest partner to NVIDIA? Just to clarify, no fanboyism in favor of Gigabyte (I even prefer EVGA to Gigabyte most of the time). Just playing stupid, but I guess this is how the system works.
The person who started it
The person who started it bought Gigabyte products, and those products said 4GB on the box. If others also join this lawsuit with cards from other manufacturers, those manufacturers will also see their names in the lawsuit.
Its nvidia who tells their
It’s Nvidia who tells their partners to print those specs on the boxes, so again, why the hell does Gigabyte get mixed into this? It’s purely Nvidia’s fault due to lack of communication with their marketing team.
Lack of communication with
Lack of communication with their marketing team?
Do you really believe that pathetic excuse?
AMD Radeon 290X
AMD Radeon 290X
Forget the performance. Look at those 94 degrees Celsius. Look at that GPU core speed. Benchmarks are not valid. Shame on you, AMD.
Geforce GTX 970
Oh come on. OK they lied about the ROPs, they lied about the cache, the memory is segmented and a part of it is running at a much lower speed, the bus also is not a true 256bit. SO WHAT? Look at the performance. That’s what counts. All benchmarks are valid.
The temps hitting 94c was
The temps hitting 94C were only on the reference boards… My XFX R9 290 DD only gets to around 72C tops under full load. So it seems to me that people like yourself don’t keep up with new info on AMD graphics cards.
Hell, even the Sapphire R9 290X Vapor-X is beating out some of the special edition GTX 980 cards, but a lot of reviewers are not updating benchmark scores with the AMD Omega drivers.
So please, before you bash AMD performance/watts/temps/etc., do some research.
We are saying the same thing.
We are saying the same thing. You didn’t understand what I wrote. If you have trouble understanding, at least look at my avatar picture. Is it green?
So please, before you bash my posts, use your head.
He was on AMD side when he
He was on AMD’s side when he wrote that comment. It is quite funny/informative when you "really" read it.
Meanwhile, somewhere at AMD
Meanwhile, somewhere at AMD user’s house: http://www.youtube.com/watch?v=EIPggCgYK38
A key fact that hasn’t been
A key fact that hasn’t been picked up by anyone, as far as I know, is what happens to those people who are using higher-end DDR4 RAM etc., both now and in the future? Some system RAM is already faster than that final 0.5GB of VRAM on the 970, so why would they want to use it? There’s no way to stop using it, so those people are stuck with a throttle that isn’t even necessary.
Nvidia deserves the lawsuit IMO, this is gross negligence or an outright con.
Yes, the GTX 970 is a
Yes, the GTX 970 is a great card for the price, the performance is still nothing short of excellent for 1080p gaming and will likely remain that way at least into the immediate future. I should know. I own one.
That, however, doesn’t change the fact that Nvidia knowingly misled customers and reviewers for a long period of time. Come on, the engineering and marketing folks had a little oopsie that resulted in false specs being published and sent to reviewers? Please. If you believe that, I’ve got some lovely oceanfront property in Montana I’d like to sell you.
I don’t know why Gigabyte is being dragged into this and I hope everything works out for them but Nvidia deserves any and everything that happens to them as a result of this lawsuit.
The problem here is deceit
The problem here is deceit and lies, along with potential performance degradation down the road when games start needing the full 4GB of memory. Nvidia basically billed this as an underclocked 980 with fewer CUDA cores. Turns out the differences are a bit bigger than that.
People made purchasing decisions based on incorrect hardware specs. Some people don’t care and will buy another video card in a year. Some people do care because they wanted future proof hardware to last for a few years. There are enough complaints about stuttering and such playing in 4K that people who bought this card with the intent to migrate to 4K down the road should be concerned about it.
Whether or not it’s a good card is completely irrelevant. Nvidia lied about the hardware specs in a way that meaningfully impacts performance. They can argue all day about what percentage of performance loss exists, but the fact that there IS performance loss is what really matters.
Future proof and gpus is
Future-proofing GPUs is generally not possible. It is only in the last few years that things have slowed down enough to even consider it, due to GPUs being stuck at 28 nm for several years. There should be a big performance jump once we actually get 20 nm or smaller products. Also, the HBM and/or HMC cards should leave current cards far behind. My general advice is to get a good display, because you will probably keep it for many years and it is what you actually look at. For the actual computer, going high end is rarely worth it.
Trolling Lawyers looking for
Trolling Lawyers looking for clients
Here’s the part I still don’t
Here’s the part I still don’t understand. There are smart people at nvidia. They know what geeks want; they understand their customer base pretty fucking well. Why would they decide to defraud their customers? Why would they lie? They MUST have known that there are thousands of geeks in the world who want to take EVERYTHING apart, break everything down to its most basic elements, and learn every available fact about everything, some of whom make a living from being that curious.
No one with two brain cells is going to lie to ENTHUSIASTS and expect them not to find out. So the only possibilities seem to be: 1) an internal communication error, 2) gross incompetence leading to choosing deception, or 3) this is part of some bigger plan, perhaps a good plan, perhaps not.
I feel like this time next year everything will be back to normal, AND every article about Nvidia will have 1/6th of the posters bringing this up.