The Witcher 3 (DirectX 11)
Played in a third-person perspective, players control protagonist Geralt of Rivia, a monster hunter known as a witcher, who sets out on a long journey through the Northern Kingdoms. In the game, players battle against the world's many dangers using swords and magic, while interacting with non-player characters and completing side quests and main missions to progress through the story. The game was met with critical acclaim and was a financial success, selling over 6 million copies in six weeks. The game won multiple Game of the Year awards from various gaming publications, critics, and game award shows, including the Golden Joystick Awards, The Game Awards, Game Developers Choice Awards, and SXSW Gaming Awards. –Wikipedia
Settings used for The Witcher 3
In The Witcher 3, the Radeon Vega FE performs nearly identically to the GeForce GTX 1070 and is 17% slower than the GTX 1080 at 2560×1440. At 4K that gap widens slightly to 20%, again in favor of the GTX 1080.
Radeon Vega Frontier Edition 16GB (300W), Average FPS Comparisons, The Witcher 3

| Resolution | GTX 1080 Ti | GTX 1080 | GTX 1070 | R9 Fury X |
|---|---|---|---|---|
| 2560×1440 | -38% | -17% | +1% | +16% |
| 3840×2160 | -40% | -17% | | |
This table presents the above data in a more basic way, focusing only on the average FPS, so keep that in mind.
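For readers wondering how the deltas in the table are derived: each cell is simply the relative difference between the Vega FE's average FPS and the comparison card's. A minimal sketch, using hypothetical FPS values (not the review's measured data):

```python
# Each table cell is (vega_fps / other_fps - 1) * 100:
# positive means the Vega FE is faster, negative means slower.
# All FPS values below are hypothetical placeholders, not measured data.

def fps_delta_percent(vega_fps: float, other_fps: float) -> float:
    """Relative performance delta of the Vega FE vs. another card, in percent."""
    return (vega_fps / other_fps - 1.0) * 100.0

# Hypothetical average FPS figures at one resolution.
avg_fps = {"GTX 1080 Ti": 100.0, "GTX 1080": 74.0, "GTX 1070": 61.0}
vega_fe = 61.5  # hypothetical Vega FE average FPS

for card, fps in avg_fps.items():
    print(f"vs {card}: {fps_delta_percent(vega_fe, fps):+.0f}%")
```

Note that because the comparison is relative, a card half as fast shows as -50%, while a card twice as fast shows as +100%.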
Adding this stupid, manually switchable gaming mode was a terrible, TERRIBLE marketing idea. This will take a big fat chunk out of the RX sales.
…oh yes, and great review as usual…ahem. 🙁
Terrible? It costs much more than the RX, so it wouldn’t be able to affect RX sales.
Well, good thing you didn’t make this your final review, though…
Vega’s performance is fishy more than anything.
It barely performs better than a Fury X despite drastically higher clocks; a Fury X at those clocks would outright beat it, and it’s an older generation with older compute, even if Vega were just a die shrink. Those NCUs aren’t even showing their prowess. We’ve heard many times that these are probably Fiji drivers and that TBR is disabled, so it’s very premature to judge.
I would take all of this with a grain of salt. Polaris showed its improvement over Hawaii with fewer SUs and half the ROPs at only slightly higher clocks: much less rendering hardware, yet better performance.
The power figures are actually pretty good considering how much higher it’s clocked than the old Fury X while consuming nearly the same, or only slightly more.
It’s pretty much like they’re sandbagging at this point, or at least it looks that way, because people are reviewing an incomplete piece of technology and ruining its marketing. There’s a reason AMD didn’t send samples to anyone, and with reviews done like this it might have killed some people’s interest in buying it later, because the majority are pretty ignorant in the first place.
Final thoughts on the matter: I’m sure they shipped Vega in this state because optimizing productivity software is much less of a hassle than optimizing games, so they wanted to start making profits earlier. From the results in certain software we can see that Vega performs extremely well in some workloads and loses in others; to be quite fair against its competitor, it looks like a very compelling card, and we’ve simply got benchmarks at a very early stage that we shouldn’t have.
Please, just don’t think Vega is a flop yet with all those factors in place.
Good write up Ryan
well done on putting out some clarification on the obvious questions that always arise with these new products. This is a beast of a GPU with all sorts of new hardware goodness in there; from the materials used right down to the finished product, it oozes quality and quantity. I can see this card retailing for around $1500 in Australia.
Nice! Any chance you can run the DeepBench test that AMD showed off before?
Why does the chart wrongly show the 1080 Ti as not having the same G5X memory interface as the Titan Xp and the vanilla 1080? That’s a mistake.
To me, the most important question is: will this card support 10-bit output for OpenGL, for work in Photoshop? If I have a nice 10-bit monitor (like the BenQ SW320), I would like to use 10-bit color in Photoshop. But Photoshop usually requires a Quadro or FirePro to use 10-bit color. What about this card?
Thanks!
Very disappointing.
Expensive and slow?
I have just registered to tell you how much I liked your review.
I simply can’t believe Vega is just a die shrink of Fiji, which would be completely odd, because they could at least have used Polaris as a basis. My bet is also on a lot of features not being turned on yet. I have a suggestion to test whether that assumption is true: could you benchmark Deus Ex: Mankind Divided on both Vega and a Fiji card, possibly at the same clocks? We have seen a demonstration of the HBCC in that game, and if the performance level were the same on both cards at similar clocks, that would point to a driver issue.
Would you be able to test Prey at 4K? Raja said two RX Vegas were running above 60 FPS. Curious to see how well the FE runs compared to RX Vega.
Jul 2016: AMD puts massive SSDs on GPUs and calls it SSG
https://semiaccurate.com/2016/07/25/amd-puts-massive-ssds-gpus-calls-ssg/
Ryan – can you ask whether these will work for VDI on Server 2016 (or, for that matter, any of AMD’s S-series cards), and where to get drivers for Server 2016?
AMD doesn’t seem to offer drivers for Server 2016, but they advertise their VDI cards as working with Windows Server. I would like to know if it works with RemoteFX or Discrete Device Assignment for GPUs, and what kind of performance one can expect from that configuration.
Thanks.
Many thanks PC Perspective for putting this awesome bench marking piece together!
One thing I’m really wondering about is the card’s performance in Folding@home. As a heavy Folding@home donor, I need to upgrade my build and am seriously considering Vega as my upgrade path. If you do a follow-up piece, any chance you could throw in some Folding@home benchmarking?
If anyone can comment on Folding@home performance that would be appreciated.
Depending on the workload, as an eGPU (which shouldn’t make much difference, since F@H needs so little bandwidth), I am getting 65k to 71k PPD. Am I disappointed? Yes. Is it way better than the W7000 I was using before? For sure. Luckily I don’t use this card for an F@H server farm. If I were pulling 100k PPD, or close to it, I would be more incentivized to leave my computer on churning out folds 24/7.
Can’t believe you’re defending AMD for calling it “not a gaming card”. That’s a load of BS. And the Titans are not gaming cards! They are for deep learning. How much is AMD paying you to write this propaganda?
Run a test under DX12, not DX11, newbies,
and you will see how Nvidia sucks against Vega.