Multiple monitor and 4K testing of the ASUS STRIX GTX 780 OC cards in SLI is not about the 52MHz out-of-box overclock but about the 12GB of VRAM your system will have. Apart from an issue with BF4, [H]ard|OCP tested the STRIX pair against a pair of reference GTX 780s and R9 290X cards at resolutions of 5760×1200 and 3840×2160. The extra RAM made the STRIX shine in comparison to the reference cards: not only was performance better, but [H] could also raise many of the graphical settings. It was not enough, however, to push performance past the R9 290X cards in CrossFire. One other takeaway from this review is that even 6GB of VRAM is not enough to run Watch_Dogs with Ultra textures at these resolutions.
"You’ve seen the new ASUS STRIX GTX 780 OC Edition 6GB DirectCU II video card, now let’s look at two of these in an SLI configuration! We will explore 4K and NV Surround performance with two ASUS STRIX video cards for the ultimate high-resolution experience and see if the extra memory helps this GPU make better strides at high resolutions."
Here are some more Graphics Card articles from around the web:
- ASUS GTX 780 Strix 6 GB @ techPowerUp
- MSI GTX 780 Gaming 6 GB @ techPowerUp
- HIS R7 260X iCooler 2GB GDDR5 Video Card Review @ Madshrimps
- XFX Radeon R9 290X Double Dissipation 4GB @ eTeknix
- PowerColor Devil 13 Dual Core R9 290X 8GB Review @ OCC
- PowerColor Devil 13 R9 290X Dual Core Review @ Hardware Canucks
- XFX R9 280 Black OC Edition @ Kitguru
- HIS Radeon R9 280 IceQ X² OC 3GB @ Benchmark Reviews
- ASUS R7 260X DirectCU II OC @ [H]ard|OCP
Two of these cards in SLI doesn’t mean you will have 12GB of VRAM. SLI does not double your memory. You’ll still only have 6GB of VRAM.
Thanks for telling us something everyone already knows
Obviously not everyone knows, because the article says 12GB of VRAM. Congrats on not reading the article.
So? It’s technically correct. 12GB of physical VRAM.
There is physically 12GB of RAM in that setup, but only 6GB usable, which is why it is pointed out that 6GB is not enough for Watch_Dogs.
If I'd said the cards only had 6GB both times I mentioned it someone would have complained that there was actually 12GB but it wasn't all usable. I've seen it in past comments for reviews of laptops with 32-bit OSes and 4GB+ of RAM.
I chose to try to annoy both types of complainers at the same time, as usual.
Thanks for reading it!
they should make a 780 ti 6GB
OR MAKE THE FREAKING 790 ALREADY!
Out of all the games to test, Far Cry 3? It's one of the worst games for SLI; it's not even playable with the amount of stuttering and frame drops, even at 1080p.
Two 780s in SLI is not enough for 4K if you like eye candy and a decent framerate.
So sick of 4K. Nothing justifies this investment; the amount you spend on GPUs and the monitor is absurd. The only way to get 4K playable in most games is with SLI/CF, and scaling has been a joke in most games as of late. I can barely keep a consistent 120fps with two 780s at 1080p.
People focus too much on the newest, most demanding games. Sure, 4K is expensive if all you do with your computer is play those at maximum detail settings, but for absolutely everything else you could do with a computer (slightly older games, new but slightly less demanding games, and anything other than gaming), 4K is great and a single 290 or so is sufficient.
i have high standards i guess
Look at the review: they are not maxing out those games and can't even get 60 fps.
And why not focus on the most demanding games? It's called a future-proof rig, so you don't have to upgrade a year later.
Thing is, though, with 4K at the moment you can't really future-proof unless you want to throw money out the window and invest in three Ti Blacks, and like I said before, SLI/CF are hit and miss, even more so with scaling; too many unoptimized games running on older engines bottleneck the hardware's potential.
Maybe next time around with AMD/Nvidia things will be better, but make sure you wait a year after release just to get the proper full-potential GPUs… mainly with Nvidia.