Answering Questions Before You Ask
I know there are going to be (and were, during our live stream) questions about this card and our testing methods and process. During the live stream, I saw many of these questions posed to us with a critical eye. Rather than just waiting for them in the comments on today's story, I felt I would try to answer some of these directly FIRST, and then dive into the review itself.
This isn’t a gaming card.
The Radeon Vega Frontier Edition is billed as a card for creators that game, and gamers that create. Or rather, the Radeon Pro Duo was, but that audience segment matches up with the language AMD has been using for FE. There is a fine line between the high end of the gaming market and the professional market made up of Quadro and Radeon Pro (previously FirePro). NVIDIA has skirted that line with the Titan product family for some time, but the Frontier Edition of Vega is the first time we are seeing it from AMD.
Calling this “not a gaming card” is a fair statement, as long as you also agree that the GTX Titan, Titan X, and Titan Xp are also “not gaming cards.” But they are, despite NVIDIA segmenting them off as well. Plenty of professionals will buy this hardware, but discerning gamers that want the best of the best will also be purchasing Titans and FEs well into the future. “Professional graphics cards” have certified drivers and specific code paths in place for applications like 3ds Max, Maya, etc. Neither Titan nor Vega FE has that; instead they depend on the driver stacks we are used to seeing on GeForce and Radeon systems.
Even if it weren’t a gaming card, we have every right to test it with gaming applications. Is a professional developer, gamer or not, going to buy a $4,000 Quadro and THEN a GTX 1080 Ti to game on? Nope, they are going to get double duty out of that card.
Once the Radeon RX Vega graphics card hits the market, the Vega Frontier Edition can exist in its own vacuum where it only addresses the professional market. Until then, the Vega FE will be the pseudo-representative of how the rest of us expect the consumer gaming card variants to perform.
The drivers are old.
There were concerns over the last couple of days that the driver for Vega FE on AMD's site was from January of this year. As it turns out, reading version numbers from the Radeon driver package is difficult, and the driver we used in our testing, the ONLY driver that supports Vega FE today, is not old. To be clear though, the driver IS from a different branch than the currently available Radeon RX 500-series driver, but the exact timing of that branch, and how it affects performance in games or other workloads, isn't information AMD is interested in sharing at this time.
The driver isn’t optimized for gaming.
I saw this pop up a lot during our stream yesterday: that the driver isn't meant for gaming, so it hasn't been optimized for gaming and is instead only targeting “professional” level applications. First, that's not the case, and AMD has confirmed it. The driver has all the gaming optimizations that the other Radeon drivers would include, up until at least the driver branching mentioned above. After that point, optimizations may or may not have made it in, as AMD tells it.
The games we are using for this review were not released in the last 30 days or anything like that. GTA V, Rise of the Tomb Raider, The Witcher 3: these are all games that have been out for some time and were around for AMD to address in both Radeon RX 500- and Vega-series drivers for many, many months.
The one caveat to this is that the Vega architecture itself is still unoptimized in the driver. But that would affect gaming and non-gaming workloads alike most of the time. So if the driver isn't ready for ANYTHING, that's a valid concern, but it applies to ALL workloads, not just games.
You didn’t test it overclocked.
We ran all of the games and professional software testing with the out-of-box settings and performance characteristics. I did some light overclocking as well, but not for the benchmarking as a whole. This is standard practice for us at PC Perspective and I think it’s the right decision for ANY product launch. Despite what those of us willing to watch a live stream of a graphics card may think, the vast majority of users are not tweaking cards and overclocking hardware. They just want it to work.
It didn’t hit the 1600 MHz it should have.
The rated clock speed of the Radeon Vega Frontier Edition is 1600 MHz, but in our testing the GPU never really got there, settling in the 1440 MHz range the majority of the time. Again, that is part of the “out of box” experience that we want to test and know about. If the cooler or fan curves on this card weren't tuned to get the full clock speed out of the hardware on day one, then that's the experience someone shelling out $1,000 of their own money would get.
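If you want to watch where a card settles yourself: on Windows, logging tools like GPU-Z report the live clock, and on Linux the amdgpu driver exposes the active shader-clock state through sysfs. A minimal polling sketch, assuming Linux, the amdgpu driver, and a single GPU at card0:

```python
# Minimal sketch: poll the amdgpu sysfs interface for the active shader clock.
# Assumes Linux with the amdgpu driver and a single GPU at card0.
import time

SCLK_PATH = "/sys/class/drm/card0/device/pp_dpm_sclk"

def current_sclk():
    """Return the pp_dpm_sclk line marked '*' (the active DPM state)."""
    with open(SCLK_PATH) as f:
        for line in f:
            if "*" in line:  # active state, e.g. "6: 1440Mhz *"
                return line.strip()
    return "unknown"

if __name__ == "__main__":
    # Log once per second while a game or benchmark runs alongside,
    # to see where the card actually settles under sustained load.
    while True:
        print(time.strftime("%H:%M:%S"), current_sclk())
        time.sleep(1)
```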
You didn’t turn up the fan speed.
Not for basic testing, I did not. But when we saw some thermal throttling we did jump that fan speed up to 100% (holy noise levels Batman!) as a test to see how the behavior changed. Based on that testing, I believe AMD should be willing to sacrifice some noise for a higher fan speed, a cooler GPU, and less thermal throttling on the power draw.
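For anyone who wants to repeat the 100% fan experiment: on Windows the control lives in Radeon WattMan, and on Linux amdgpu exposes the same knob through hwmon. A minimal Linux-side sketch, assuming a single GPU at card0 and root privileges:

```python
# Minimal sketch: force the amdgpu fan to 100% to test for thermal throttling.
# Assumes Linux, the amdgpu driver, a single GPU at card0, and root privileges.
import glob

# The hwmon index (hwmon0, hwmon1, ...) varies per system, so glob for it.
HWMON = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

def set_fan_percent(pct):
    """Switch the fan to manual control and set it to pct% (PWM range is 0-255)."""
    with open(f"{HWMON}/pwm1_enable", "w") as f:
        f.write("1")  # 1 = manual control, 2 = automatic firmware curve
    with open(f"{HWMON}/pwm1", "w") as f:
        f.write(str(round(255 * pct / 100)))

def restore_auto_fan():
    """Hand fan control back to the firmware's automatic curve."""
    with open(f"{HWMON}/pwm1_enable", "w") as f:
        f.write("2")

if __name__ == "__main__":
    set_fan_percent(100)  # holy noise levels, Batman
```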
You probably didn’t have it in game mode!
Actually, I did. And, to make matters worse for that point of view, AMD has confirmed that switching between Game Mode and Professional Mode will have no performance impact, only visual and UI elements (of the ReLive driver settings GUI *only* – not your games) will change.
The RX Vega is going to be clocked higher.
Perhaps it will, and if so, we’ll do more testing on that hardware to see how the changes affect gaming performance. As I mentioned above, the Vega FE will only “represent” the expectations for the Vega gaming hardware until the gaming hardware hits the scene. Then it has the ability to surprise us (or not) with added performance and capabilities we did not see today with the Frontier Edition.
The RX Vega is going to use a better cooler.
Again, perhaps it will. And if my Vega FE with the integrated water cooler arrives in the near future, we'll know how much thermal throttling can be avoided, how that added 75 watts of power draw improves performance, and more. For now, we are working with the hardware we have available to us.
Adding this stupid, manually switchable gaming mode was a terrible, TERRIBLE marketing idea. This will take a big fat chunk out of the RX sales.
…oh yes, and great review as usual…ahem. 🙁
Terrible? It costs much more than the RX, so it wouldn't be able to affect RX sales.
Well, good thing that you didn’t make it your final review, however…
Vega performance is fishy, more than anything.
It barely performs better than a Fury X despite drastic increases in clocks; a Fury X at those clocks would outright beat it, and that is an older generation with older compute. Even if Vega were just a die shrink, those NCUs are not showing their prowess. We have heard many times that these are probably Fiji drivers and that TBR is disabled, so it is VERY bad to judge from as of yet.
I would take all of this with a grain of salt. Polaris showed its improvement over Hawaii by having fewer SUs and half the ROPs at slightly higher clock speeds; we still had much less rendering hardware, yet better performance.
The power figures are actually pretty good considering how highly clocked it is compared to the old Fury X, while still consuming nearly the same power, if not slightly more.
It's pretty much like they are sandbagging at this point, or it looks that way, because people are reviewing an incomplete piece of technology and ruining its marketing. There's a reason why AMD didn't send samples to anyone, and despite that, reviews like this might have killed some people's interest in buying it later, because the majority are pretty ignorant in the first place.
Final thoughts on the matter: I'm sure they shipped Vega like this because optimizing for productivity software is much less of a hassle than optimizing for games, so they wanted to start making profits earlier. From the results of certain software we can see that Vega performs extremely well in some tests and loses in others. To be quite fair, against its competitor it looks like a very compelling card, and we have just gotten benchmarks at a very early stage that we shouldn't have.
Please, just don’t think Vega is a flop yet with all those factors in place.
Good write up Ryan
well done on putting out some clarification points to the obvious questions that always arise with these new products. This is a beast of a GPU with all sorts of new hardware goodness in there; from the materials used right down to the finished product, it oozes quality and quantity. I can see this card retailing for around $1500 in Australia.
Nice! Any chance you can run the DeepBench test that AMD showed off before?
Why does the chart wrongly show the 1080 Ti as not having the same G5X interface as the Titan Xp and the vanilla 1080? Mistake.
To me, the most important question is: is this card going to support 10-bit output over OpenGL for work in Photoshop? Because if I have a nice 10-bit monitor (like the BenQ SW320) I would like to use 10-bit color in Photoshop. But Photoshop usually requires you to have a Quadro or FirePro to use 10-bit color. What about this card?
Thanks!
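For anyone who wants to probe this themselves: whether a driver will hand out a 10-bits-per-channel OpenGL framebuffer is something you can test directly. A minimal sketch, assuming the glfw and PyOpenGL Python packages; getting 10 bpc here doesn't guarantee Photoshop will use it, but a refusal is a strong hint the driver reserves 10-bit OpenGL for the pro stack:

```python
# Minimal sketch: ask the driver for a 10-bits-per-channel framebuffer and
# report what it actually granted. Assumes the glfw and PyOpenGL packages.
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

def probe_10bit():
    if not glfw.init():
        raise RuntimeError("GLFW failed to initialize")
    # Request a 10/10/10 color buffer; the driver may silently fall back to 8 bpc.
    glfw.window_hint(glfw.RED_BITS, 10)
    glfw.window_hint(glfw.GREEN_BITS, 10)
    glfw.window_hint(glfw.BLUE_BITS, 10)
    glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # offscreen probe window
    win = glfw.create_window(64, 64, "10-bit probe", None, None)
    if win is None:
        glfw.terminate()
        raise RuntimeError("Could not create a GL window")
    glfw.make_context_current(win)
    bits = tuple(int(glGetIntegerv(b)) for b in (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS))
    glfw.destroy_window(win)
    glfw.terminate()
    return bits

if __name__ == "__main__":
    print("R/G/B bits per channel:", probe_10bit())
```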
Very disappointing.
Expensive and slow?
I have just registered to tell you how much I liked your review.
I simply can't believe Vega is just a die shrink of Fiji. That would be completely odd, because they could have at least used Polaris as a basis. My bet is also on a lot of features not being turned on yet. I have a suggestion to test that assumption: could you run Deus Ex: Mankind Divided on both Vega and a Fiji card, possibly even at the same clocks? We have seen a demonstration of the HBCC in that game, and if the performance level were the same on both cards at similar clocks, that would look like a driver issue to me.
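That same-clock comparison boils down to simple arithmetic once the numbers are in hand. A hypothetical sketch of the normalization being proposed, with placeholder figures standing in for real measurements:

```python
# Hypothetical clock-for-clock sketch: if Vega FE and Fury X deliver roughly
# the same frames per MHz in a title, the uplift is coming from clocks alone,
# which would point at drivers (e.g. TBR disabled) rather than the architecture.
def fps_per_mhz(fps, mhz):
    return fps / mhz

# Placeholder numbers for illustration only; substitute real measurements.
vega_fe = fps_per_mhz(fps=70.0, mhz=1440)
fury_x = fps_per_mhz(fps=52.0, mhz=1050)

print(f"Vega FE: {vega_fe:.4f} fps/MHz, Fury X: {fury_x:.4f} fps/MHz")
print(f"Per-clock difference: {100 * (vega_fe / fury_x - 1):+.1f}%")
```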
Would you be able to test Prey @ 4K? Raja said two RX Vegas were running above 60 FPS. Curious to see how well the FE runs compared to RX Vega.
Jul 2016: AMD puts massive SSDs on GPUs and calls it SSG
https://semiaccurate.com/2016/07/25/amd-puts-massive-ssds-gpus-calls-ssg/
Ryan – can you ask if these will work for VDI on Server 2016 (or, for that matter, any of AMD's S-series cards), and where to get drivers for Server 2016?
AMD doesn't seem to offer drivers for Server 2016, but they advertise their VDI cards as working with Windows Server. I would like to know if it works with RemoteFX or Discrete Device Assignment for GPUs, and what kind of performance one can expect from that configuration.
Thanks.
Many thanks PC Perspective for putting this awesome benchmarking piece together!
One thing I'm really wondering about is the card's performance in Folding@home. As a heavy Folding@home donor I need to upgrade my build and am seriously considering Vega as my upgrade path. If you do a follow-up piece, any chance you could throw in some Folding@home benchmarking?
If anyone can comment on Folding@home performance that would be appreciated.
Depending on the workload, as an eGPU (which shouldn't make much difference, since F@H is such low bandwidth), I am getting 65k to 71k PPD. Am I disappointed? Yes. Is it way better than the W7000 I was using before? For sure. Luckily I don't use this card for an F@H server farm. If I were pulling 100k PPD or close to it, I would be more incentivized to leave my computer on churning out folds 24/7.
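For anyone sanity-checking PPD figures like these, the base number is simple arithmetic from a work unit's credit and completion time. A minimal sketch that ignores Folding@home's quick-return bonus (which scales credit up for fast completions), so real PPD on bonus-eligible work units will be higher:

```python
# Minimal PPD sketch: points per day from a work unit's credit and completion
# time. Ignores the quick-return bonus, so real PPD on bonus WUs will be higher.
SECONDS_PER_DAY = 86_400

def ppd(wu_credit, wu_seconds):
    return wu_credit * SECONDS_PER_DAY / wu_seconds

# Hypothetical example: a 9,000-point work unit finished in three hours.
print(f"{ppd(9_000, 3 * 3600):,.0f} PPD")  # -> 72,000 PPD
```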
Can’t believe you’re defending AMD for calling it “not a gaming card”. That’s a load of BS. And the Titans are not gaming cards! They are for deep learning. How much is AMD paying you to write this propaganda?
Make a test under DX12… and not DX11, newbies.
And you will see how NVIDIA sucks against VEGA.