The Witcher 3: Wild Hunt – Competitive
The Witcher 3 (DirectX 11)
Played in a third-person perspective, players control protagonist Geralt of Rivia, a monster hunter known as a witcher, who sets out on a long journey through the Northern Kingdoms. In the game, players battle against the world's many dangers using swords and magic, while interacting with non-player characters and completing side quests and main missions to progress through the story. The game was met with critical acclaim and was a financial success, selling over 6 million copies in six weeks. The game won multiple Game of the Year awards from various gaming publications, critics, and game award shows, including the Golden Joystick Awards, The Game Awards, Game Developers Choice Awards, and SXSW Gaming Awards. –Wikipedia
Settings used for The Witcher 3
At both 1920×1080 and 2560×1440, The Witcher 3 stresses our GPUs, but the GeForce GTX 1070 does a good job of separating from the pack. The new Pascal mid-range card is 37% faster than the GTX 980 and as much as 71% faster than the GTX 970. The AMD Radeon R9 Nano keeps up reasonably well, coming in only 22% slower at 2560×1440 while still averaging just over 50 FPS.
GeForce GTX 1070 8GB, Average FPS Comparisons, The Witcher 3

| | GTX 980 | GTX 970 | R9 Nano | R9 390X |
|---|---|---|---|---|
| 1920×1080 | +37% | +71% | +35% | +38% |
| 2560×1440 | +37% | +58% | +22% | +37% |
This table presents the above data in a more basic way, focusing only on the average FPS, so keep that in mind.
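As a quick sketch of how such a percentage comparison is derived from average FPS (the FPS figures below are hypothetical, for illustration only — not the review's measured data):

```python
def percent_faster(fps_a, fps_b):
    """How much faster card A is than card B, in percent,
    based on average FPS."""
    return (fps_a / fps_b - 1) * 100

# Hypothetical averages: if the GTX 1070 ran 63 FPS and the
# GTX 980 ran 46 FPS, the table entry would read +37%.
print(round(percent_faster(63.0, 46.0)))  # 37
```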
Based on my observation of the GTX 970 and 980 releases, I have a feeling that the GTX 1070 will be the best value, and anyone who buys a GTX 1080 will regret it once the 1080 Ti releases. Personally, I may end up getting just one 1080 to try it out for gaming and folding@home, but I'm really eager to see what Nvidia brings to the table with the Ti release.
The link on the “Testing Suite and Methodology Update” page in this paragraph:
“For those of you that have never read about our Frame Rating capture-based performance analysis system, the following section is for you. If you have, feel free to jump straight into the benchmark action!!”
jumps to the 1080 review.
I was properly confused for a few seconds when I didn’t see any 1070 data on the page.
@Allyn: What would you think about frame-time-weighted percentile graphs? Like in the SSD reviews?
Just a joke, I don’t think it matters that much in this data since the variance is not multiple orders of magnitude here.
Ryan and I actually had this conversation the other day. It could come into play with the percentile plots, but things would need to be presented a bit differently. It would help spread cards with greater variation out of the pack a bit more, but as it stands now, cards that misbehave tend to misbehave badly enough that we don't need to weigh it any differently to make it obvious.
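The difference being discussed can be sketched with a toy example (hypothetical helper names, not the site's actual Frame Rating tooling): a plain percentile counts each frame equally, while a time-weighted percentile weights each frame by how long it stayed on screen, so rare long stutter frames surface sooner.

```python
def plain_percentile(frame_times_ms, pct):
    """Percentile by frame count: the pct-th slowest frame."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(round(pct / 100 * (len(ordered) - 1))))
    return ordered[idx]

def time_weighted_percentile(frame_times_ms, pct):
    """Percentile by elapsed time: each frame's contribution is
    proportional to its duration, so long frames count for more."""
    ordered = sorted(frame_times_ms)
    target = pct / 100 * sum(ordered)
    elapsed = 0.0
    for t in ordered:
        elapsed += t
        if elapsed >= target:
            return t
    return ordered[-1]

# A mostly smooth run with a few 100 ms stutter frames: the
# frame-count percentile still reports a smooth 16.7 ms, while
# the time-weighted one flags the stutter, because those five
# frames occupy a disproportionate share of wall-clock time.
frames = [16.7] * 95 + [100.0] * 5
print(plain_percentile(frames, 95))          # 16.7
print(time_weighted_percentile(frames, 95))  # 100.0
```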
These new power measurements are amazing. Thanks, PCPer, for keeping on it, pushing measurement methods, and supplying us with sensible data.
(however, I think the particular page mixes Hawaii, Fiji and Tahiti as others have also commented on)
“Testing suite” page:
>> As a result, you’ll two sets of data in our benchmark pages
Word missing?
>> As a result, you’ll word missing two sets of data in our benchmark pages.
I know it's already a lot of work, but can we have some OpenCL or Blender benchmarks? Or even just some Premiere Pro testing?
not all people game
and, well, also for the 1080 please! >.<
Using Chrome atm. When I click on a picture, the pictures tend to look a bit weird. With the power graph, when I click on it, the picture isn't centered on the page. When I click on the bar graph, the picture is super large.
Would anyone be able to say if one could pair this GPU with a 980ti since they are comparable in performance and are pretty much the same architecture?
Unlikely nVidia would let you do it. It might work in something like Ashes of the Singularity, but betting that other developers will implement a similar version of multi-card rendering doesn't seem like a sound plan.
Why single out the power used by the graphics card alone?
As long as GPUs need a driver executed by the CPU, it does not make sense to me.
Great review yet again, Ryan. Just a heads up: the link to the benchmarks on page 3 sends one to the 1080 page.
Are the other cards used in the comparison overclocked?
Why does this site still use the stupid tiny lines? Why can't you just put the damn FPS numbers down and be done with it! I hate looking at very tiny lines just to get an idea of performance! This is a huge reason why I stopped coming to this site for reviews!
Ryan, would you agree that nVidia probably made the 970 too good of a deal for what you got? It seems there are more differences between the 1070 and 1080 this time around.
If nVidia could change history, they probably would have made the 970 either not as fast or more expensive.
@Ryan Shrout, can u do another review covering Micron vs. Samsung VRAM on the GTX 1070?
There's some fiasco brewing like the previous GTX 970 3.5GB VRAM issue, and guess what, this time it's about the memory brand.
Obviously, every reviewer got a cherry-picked card with the Samsung chips, so how come there's no Micron-chip card in any review? Thanks.