A pair of R9 380X's will cost you around $500, a bit more than $100 less than a single GTX 980 Ti and on par with or a little less expensive than a straight GTX 980. You have likely seen these cards compared, but how often have you seen them pitted against a pair of GTX 960's, which cost a little less than two 380X cards? [H]ard|OCP decided it was worth investigating, perhaps for those who currently have a single one of these cards and are considering a second if the price is right. The results are very tight: overall the two setups performed very similarly, with some games favouring AMD and others NVIDIA. Check out the full review here.
"We are evaluating two Radeon R9 380X video cards in CrossFire against two GeForce GTX 960 video cards in a SLI arrangement. We will overclock each setup to its highest, to experience the full gaming benefit each configuration has to offer. Additionally we will compare a Radeon R9 380 CrossFire setup to help determine the best value."
Here are some more Graphics Card articles from around the web:
- With Pascal Ahead, A 16-Way Recap From NVIDIA's 9800 GTX To Maxwell @ Phoronix
- Maxwell Quadro For All: NVIDIA Quadro M2000 Workstation Graphics Card Review @ Techgage
- Desktop Graphics Card Comparison Guide @ TechARP
- PCI Express 3.0 vs. 2.0: Is There a Gaming Performance Gain? @ Hardware Secrets
Looks like they too suffer from PCPerspective GameWorks testing suite syndrome.
Yes there will be more middleware wars for gaming, but AMD appears to be open-sourcing more of their middleware! Games' middleware dependencies, and the benchmarking suites, need to be monitored lest things become very Antutu-ed!
That is not a surprise or unexpected! HardOCP has been promoting GameWorks-related games for the last couple of years. Just look at any video card review they’ve done and the subsequent comments.
“Friday was a big day for AMD’s open-source team as beyond publishing experimental Southern Islands / GCN 1.0 support for AMDGPU they also published for the first time open-source OverDrive overclocking support for the AMDGPU DRM kernel driver.”(1)
(1) “An Ubuntu/Debian Kernel To Play With AMDGPU’s OverDrive Overclocking Support”
https://www.phoronix.com/scan.php?page=news_item&px=AMDGPU-OverDrive-Kernel
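For anyone wanting to play with it once that kernel is available, here is a rough sketch of what poking the interface could look like. This assumes (my assumptions, not anything confirmed in the article) that the amdgpu driver exposes its percentage-based OverDrive knobs as the sysfs files pp_sclk_od / pp_mclk_od and that the AMD GPU shows up as card0:

```python
#!/usr/bin/env python3
# Rough sketch only: assumes a kernel with amdgpu OverDrive support, that the
# AMD GPU is card0, and that pp_sclk_od / pp_mclk_od take a percentage
# overclock (0 = stock). Writing needs root. Not a tuning guide.
from pathlib import Path

DEVICE = Path("/sys/class/drm/card0/device")  # assumption: AMD GPU is card0


def read_od(knob: str) -> int:
    """Read the current OverDrive percentage from a sysfs knob."""
    return int((DEVICE / knob).read_text().strip())


def set_od(knob: str, percent: int) -> None:
    """Write a new OverDrive percentage (run as root, keep it modest)."""
    (DEVICE / knob).write_text(f"{percent}\n")


if __name__ == "__main__":
    print("core clock OD %:", read_od("pp_sclk_od"))
    print("mem clock OD %:", read_od("pp_mclk_od"))
    # set_od("pp_sclk_od", 5)  # example: +5% core clock, at your own risk
```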
Didn't get that email, damn you Phoronix, as I would have linked to that!
“First US p e n i s transplant successfully carried out on Massachusetts man” (1)
Now all those flagship FanBoy GITs (Red and Green) with really small ones will not need those big cards to make up for their deflated ego/little wanker issues!
(1) 5/16/2015 [filter does not like P word]
arstechnica.com
edit: (1) 5/16/2015 [filter does not like P word]
to: (1) 5/16/2016 [filter does not like P word]
Damn! what year is it!
Seems about even in performance, but it’s not a great idea to bet on good SLI and Crossfire support in many current and upcoming games. Things may get better with DX12 explicit multi-adapter, but mainstream games supporting that are a long way off. A single card is the way to go with any budget under $700.
HardOCP, showing NV-favored games since 2000
Hahahaha FUCK SHITTY JANKY NIGGER RIGGED GHETTO POOR PEOPLE PARTS from AMD
You are one of those flagship FanBoy (Green Team) GITs with a really small one that needs those big cards to make up for their deflated ego/little wanker issues!
Looks like someone’s crabby because their preferred brand lost again. Poor little racist crybaby.
Time to block this guy’s IP. He comments on every AMD-related article. He makes inflammatory comments about the authors and anyone who would consider AMD products. And he isn’t even original; he just repeats poor/peasant. Janky? AMD GPUs are designed in Canada, btw.
I’ve had PCPer on my AdBlock whitelist for a couple of years now. I think the blocking might get turned back on until they do something about that guy.
I don’t understand why these people change settings on certain games. I can understand turning off async compute for Nvidia or GameWorks features for AMD, but comparing High settings on one card to Ultra on another card?
Man, that is fucking stupid.
Even so, the R9 380X CF still does much better than the 960 SLI – which is not surprising, as even the R9 380 is better than a GTX 960.
You have to approach that site’s benchmarks with a few understandings in mind:
1. You pretty much have to take their results in concert with other sites’ results. Once you understand [H]’s methodology and purpose, their results can stand on their own, but when viewed through the same lens as one views another site’s benchmarks, they don’t make sense.
2. For what [H] is looking for, they don’t necessarily want to use the same settings – unless it just turns out that way. In this, [H]’s benchmarks (not including the “Apples-to-Apples” sections) are a bit more objective than others. [H] is looking for the “highest playable settings”, which in many instances would do a lot to highlight particular advantages the subjects might respectively have. For example, take a look at the Fallout 4 benchmarks, which give a prime example of what you’re questioning, as well as a prime example of why they did it. The 380X-CF is running about 17-19 FPS faster than the 960-SLI, but they had to turn down Godrays and AO to do it. That’s what they, [H], determined was the “highest playable settings”, which means that turning anything up higher would mean dropping the framerate to an unplayable rate. The 960-SLI had no trouble running at a playable framerate with Godrays and AO maxed out, even if its average was lower. This shows two things: one, that the 960-SLI has a big advantage with Godrays and AO, and two, that those technologies put a beating down on the 380X-CF. With a normal equal-settings benchmark, using the settings that were the highest playable settings on the 960-SLI, all you would see is the 380X-CF getting thrashed. Compare it to [H]’s results and you can see WHY the 380X-CF got thrashed.
3. The “Apples-to-Apples” sections are there to be that correlation, as well. They don’t care about playability, there. They care about working the cards as hard as they can with the highest settings possible, playable or not. This is the equal-settings comparison. If the “highest playable settings” highlights advantages, the “Apples-to-Apples” highlights weaknesses.
Don’t take [H]’s results all on their own, but don’t write them off either. Keeping their intent in mind, compare their results to other sites and the resolutions and settings they used, and take it all as a whole.
All cards should be stress tested like this, to point out any weaknesses, with a note to the readers that this is stress testing of the hardware only, and that most games/game settings can be adjusted for playability. Also, any card that can work at a playable frame rate with any group of settings maxed out, while the other card cannot, deserves to be praised!
Now, over the next year, testers should go back and test the new Nvidia and AMD cards, and also their cards from the previous two years, with DX12- and Vulkan-based games (for both Windows and Linux). This is just to see which new cards perform better under DX12/Vulkan, and which older cards, if they can even be made to use the newer DX12/Vulkan APIs through software, driver, or firmware tweaks, perform better under them.
So let’s see who had the older cards with the best async-compute ability from the past two years, and who has the new cards with the best async-compute ability over the year following the release of both the first Pascal cards (already here) and the first Polaris SKUs! The first Polaris cards should be here in a few weeks.
Over the next year there should be both Vega and Volta SKUs to compare as well.
Does anyone edit these articles? The first two sentences make no sense whatsoever.
I get testing a card (or cards) to its maximum capability, but can anyone really tell the difference between High and Ultra settings while playing?
LOL…
Not usually…
In blind tests, without looking at the graphics options, I’d be curious if even one person would be able to tell the difference beyond guessing.