A New Frontier
Taking a look at the performance of the current console generations
Console game performance has always been an area of interest for us here at PC Perspective, but it has mostly been out of our reach to evaluate with any kind of scientific rigor. Our Frame Rating methodology for PC-based game analysis relies on running an overlay application during screen capture, with the resulting video later analyzed by a series of scripts. Obviously, we cannot take this approach with consoles, as we cannot install our own code on them to run that overlay.
A few other publications such as Eurogamer with their Digital Foundry subsite have done fantastic work developing their internal toolsets for evaluating console games, but this type of technology has mostly remained out of reach of the everyman.
Recently, we came across an open source project that aims to address this. Trdrop is open source software built upon OpenCV, a stalwart library in the world of computer vision. Using OpenCV, trdrop can analyze the frames of ordinary gameplay footage (no overlay required), detecting differences between consecutive frames and looking for dropped frames and tears to arrive at a real-time frame rate.
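To illustrate the general idea, here is a minimal sketch of a difference-based analysis pass written against OpenCV's Python bindings. This is our own simplified example, not trdrop's actual code; the capture filename and difference threshold are placeholders you would tune for your own footage, and tear detection (which trdrop also handles) is omitted.

```python
import cv2

# Minimal sketch of difference-based frame rate analysis (not trdrop's code).
# A captured frame counts as a "new" game frame if it differs meaningfully
# from the previous one; runs of identical frames indicate duplicated frames.
DIFF_THRESHOLD = 2.0                            # placeholder: mean per-pixel change

cap = cv2.VideoCapture("capture_1080p60.mp4")   # placeholder capture file
capture_fps = int(cap.get(cv2.CAP_PROP_FPS))

prev_gray = None
new_frame_flags = []                            # True where a new game frame appears

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev_gray is not None:
        diff = cv2.absdiff(gray, prev_gray)
        new_frame_flags.append(diff.mean() > DIFF_THRESHOLD)
    prev_gray = gray

cap.release()

# Report the effective frame rate over each one-second window of the capture.
for second, start in enumerate(range(0, len(new_frame_flags), capture_fps)):
    window = new_frame_flags[start:start + capture_fps]
    if window:
        print(f"second {second}: ~{sum(window)} FPS")
```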
This means that trdrop can analyze gameplay footage from any source, be it console, PC, or anything in between from which you can get a direct video capture feed. Now that PC capture cards capable of 1080p60, and even 4K60, are coming down in price, software like this is allowing more gamers to peek at the performance of their games, which we think is always a good thing.
It's worth noting that trdrop is still listed as "alpha" software on its GitHub repo, but we have found it to be very stable and flexible in its current iteration.
| | Xbox One S | Xbox One X | PS4 | PS4 Pro |
|---|---|---|---|---|
| CPU | 8x Jaguar @ 1.75 GHz | 8x Jaguar @ 2.3 GHz | 8x Jaguar @ 1.6 GHz | 8x Jaguar @ 2.1 GHz |
| GPU CUs | 12x GCN @ 914 MHz | 40x Custom @ 1172 MHz | 18x GCN @ 800 MHz | 36x GCN @ 911 MHz |
| GPU Compute | 1.4 TFLOPS | 6.0 TFLOPS | 1.84 TFLOPS | 4.2 TFLOPS |
| Memory | 8 GB DDR3 + 32 MB ESRAM | 12 GB GDDR5 | 8 GB GDDR5 | 8 GB GDDR5 |
| Memory Bandwidth | 219 GB/s | 326 GB/s | 176 GB/s | 218 GB/s |
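As a quick sanity check on the GPU Compute row (our own arithmetic, based on the standard GCN layout of 64 shader ALUs per CU, each capable of two floating point operations per clock via fused multiply-add):

```python
# FP32 throughput for a GCN-style GPU: CUs x 64 ALUs x 2 ops/clock x clock.
def gcn_tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1e6

for name, cus, mhz in [("Xbox One S", 12, 914), ("Xbox One X", 40, 1172),
                       ("PS4", 18, 800), ("PS4 Pro", 36, 911)]:
    print(f"{name}: {gcn_tflops(cus, mhz):.2f} TFLOPS")
# Xbox One S: 1.40, Xbox One X: 6.00, PS4: 1.84, PS4 Pro: 4.20
```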
Now that the Xbox One X is out, we figured it would be a good time to take a look at the current generation of consoles and their performance in a few games, as a way to get our feet wet with this new software and method. We are only testing at 1080p here, but we now have our hands on a 4K HDMI capture card capable of 60 Hz for some future testing! (More on that soon.)
This initial article will mostly focus on the original PS4 vs. the Xbox One S, and the PS4 Pro versus the Xbox One X in three recent titles—Assassin's Creed Origins, Wolfenstein II, and Hitman.
In our time with Assassin's Creed Origins, we noticed that the cutscenes are some of the most difficult scenes for the consoles to render.
During the opening cinematic of the game, the base PS4 was able to achieve a mostly solid 30 FPS, while the Xbox One S struggled at times to hit the 30 FPS mark.
With the Xbox One X and PS4 Pro, both consoles were able to render at a full 30 FPS in this same section.
Since Assassin's Creed Origins implements dynamic resolution, it becomes more difficult to make a direct comparison of the GPU power of these four consoles from these results, but from a gameplay smoothness perspective, it's clear that the original Xbox One S falls behind the other consoles on a 1080p TV.
On the PC side, Wolfenstein II has been hailed as a highly optimized title (just as Doom was before it), building upon the work that id Software put into the id Tech 6 engine for Doom last year. On consoles, we see a similar story.
Even on the base-model consoles, we were able to hit a stable 60 FPS at 1080p.
The latest Hitman game is an excellent example of a title that lends itself to console performance testing. With the option for an "unlocked" frame rate, as well as optimized patches for both the PS4 Pro and Xbox One X, we can get a good look at console-to-console performance.
With the frame rate unlocked, the PS4 and Xbox One S both top out at around 45 FPS, but the PS4 is 10-15% faster than the Xbox on average in the scenes we tested.
When we move to the PS4 Pro vs. the Xbox One X, however, the story changes. The Xbox One X version of Hitman adds a setting allowing users to choose between High Quality and High Framerate modes.
The High Quality option renders the game at a native 2160p targeting 30 FPS, downsampled to our 1080p display, while the High Framerate option renders at 1440p (the same as the PS4 Pro) and targets a frame rate of 60 FPS.
In our testing, we found that the Xbox One X does indeed hit these frame rate targets. Comparing the High Framerate option to the PS4 Pro, we see the Xbox render at a solid 60 FPS, while the PS4 Pro hovers more in the 45 FPS region, a big difference between the two consoles.
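For a rough sense of why those two Xbox One X modes represent comparable workloads, some back-of-the-envelope arithmetic of our own (assuming the standard 3840x2160 and 2560x1440 pixel counts, and ignoring any per-pixel cost differences between the modes):

```python
# Approximate pixel throughput demanded by each Xbox One X Hitman mode.
high_quality   = 3840 * 2160 * 30   # 2160p at 30 FPS -> ~249 million pixels/s
high_framerate = 2560 * 1440 * 60   # 1440p at 60 FPS -> ~221 million pixels/s
print(high_quality, high_framerate)
```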
Overall, it's still far too early to make any definitive performance statements about the Xbox One X and PS4 Pro. While it's clear from the specs that the Xbox One X is more powerful on paper, it remains to be seen whether developers will genuinely take advantage of the available horsepower.
Xbox One X owners do get one added benefit: every game on the console can take advantage of the extra GPU horsepower, whereas PS4 Pro titles generally need a specific patch to do so. Sony has recently implemented "Boost Mode" on the PS4 Pro to attempt to remedy this, but it's riddled with compatibility issues and seems to be mostly a dud. (This could change if Sony puts resources toward that goal, of course.)
We are eager for feedback on this new console testing and would love to hear what our readers are looking for in potential future testing. Personally, I am eager to compare these new consoles to a similarly priced PC in the same titles, although there's a challenge there in trying to get the same image quality settings across all platforms.
This is one of the big reasons I don’t like console games. The frame rate is just so bad. I know this isn’t a popular opinion, but I thought GoldenEye on N64 was garbage because the game had such bad graphics that you couldn’t see anything, and such low frame rates that even if you did, doing something about it was really hard.
Having no mouse support doesn’t help either. FPS games on console are just so bad for me.
I feel exactly the same when it comes to early 3D games.
I strongly disagree about playing with a mouse. I am not a cat 😉
Microsoft is allowing keyboard and mouse support on the Xbox One, but I believe it is up to the developers to implement it. I didn’t know 60 FPS was so bad. Sounds more like an elitist pro gamer whose only concern is performance. Many people still play games for fun; a balance of visuals, performance, and gameplay matters to most casual gamers. This article was about the performance of consoles. Why even read the article or comment if you knew the outcome before reading it and it has nothing to do with PC gaming? This article was bad for you because it had nothing to do with you.
The background gets blurry at 30 FPS. At 60 FPS you can read things as you spin around… everything is clearer, with less motion blur.
Yes, this is a very interesting topic to discuss, and I’m highly interested in how your findings compare to DF’s.
DF is full of nonsense, comparing the Xbox One X to a GTX 1070… seriously.
I’d buy an XBONE-X right now if I could hook up a keyboard and a mouse and run Blender 3D on it. Currently I think that the XBONE-X can only run UWP apps, not Win32 applications. Intel’s NUCs with that EMIB/MCM and Radeon semi-custom discrete GPU die are going to eat the XBONE-X’s lunch if Microsoft does not turn the XBONE-X into a mini-PC-like platform.
Intel’s NUC with that EMIB/MCM Intel/Radeon GPU mash-up is going to be popular for a while, until the desktop Raven Ridge APU offerings come fully online in some mini-PC form factor, maybe with options for Radeon discrete GPUs for explicit integrated/discrete multi-adapter DX12/Vulkan-aware games.
Such a NUC is going to be very expensive for the amount of performance it will deliver. You will be paying a lot just for a small form factor. If you don’t care that much about size, a regular desktop GPU with HBM will be a lot better.
AMD needs to create its own “NUC”-style reference design and make use of its desktop Raven Ridge APUs inside a mini desktop design. That could compete with Intel’s NUC.
What I really want is a laptop with a desktop Raven Ridge APU inside, and if ASUS can get a Ryzen 7 1700 into a laptop form factor at 65 watts, then getting a desktop Raven Ridge APU into a laptop at 45-65 watts should be possible as well.
Anandtech also says that ASUS will be making a lower-cost laptop SKU with the desktop six-core Ryzen 5 1600, but I’m hoping that ASUS can offer a Vega 11 discrete GPU based laptop next year that comes with the desktop eight-core Ryzen 7 or six-core Ryzen 5 CPUs.
“software like this is allowing more gamers to peak at the performance of their games”
Do you mean the software is allowing gamers to reach a peak in their gaming performance or get peak performance out of their hardware?
What he means is that by using this software you can analyze (get a peak at) the frames, etc., your PC is outputting.
So, like get only the very tops of frames?
Or just a typo of ‘peek’.
Whoops… homonyms strike again! I did, in fact, mean "peek" instead of "peak" in this case. Thanks for pointing it out!
lol sub 30fps trash #KillYourConsole #PCMR
I wonder if a 2DFFT module could be added to dynamically monitor rendering resolution per-frame? The use of post-AA makes it a bit less obvious than the super-sharp aliasing cutoff you’d otherwise see, but it should still be pretty clear where the HF rolloff lies, which would tell you the rendering resolution.
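For anyone curious, a rough sketch of what such a check might look like in Python with NumPy and OpenCV. This is only an illustration of the idea, not a validated method: it uses a simple 1D horizontal frequency profile rather than a full radial average, and the frame path and rolloff threshold are placeholders.

```python
import cv2
import numpy as np

# Rough sketch: estimate the horizontal rendering resolution of an upscaled
# frame from where the high-frequency energy in its spectrum rolls off.
# The frame path and the rolloff threshold below are placeholders.
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

spectrum = np.abs(np.fft.fftshift(np.fft.fft2(frame.astype(np.float32))))

# Collapse the 2D spectrum into a 1D horizontal frequency profile, then keep
# only the positive frequencies (to the right of DC after fftshift).
profile = spectrum.mean(axis=0)
positive = profile[profile.size // 2 + 1:]

# Highest frequency bin that still carries meaningful energy.
threshold = positive.max() * 1e-3          # arbitrary rolloff criterion
above = np.nonzero(positive > threshold)[0]
cutoff_bin = above[-1] if above.size else 0

# Express the cutoff as a fraction of Nyquist and scale back to pixels.
est_width = int(round((cutoff_bin + 1) / positive.size * frame.shape[1]))
print(f"estimated horizontal render resolution: ~{est_width} px "
      f"(capture width {frame.shape[1]} px)")
```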
It’s great to see PCPer opening a new frontier.
However, with consoles, actual performance is not the only part of the equation. Considering that the PS4 has a roughly 50% more powerful GPU than the XB1, the XB1X has a roughly 50% more powerful GPU than the PS4 Pro, and games typically target a specific framerate (usually 30 or 60 FPS), which is frequently driven by CPU speed (where the differences between consoles are rather small) rather than GPU, the key differentiator is image quality.
Therefore, without an image quality comparison, your analysis cannot be complete.
Now, if I look at Digital Foundry, there is always an image quality comparison. But frankly, even on my 4K monitor, I frequently do not see any obvious difference between the competing consoles until they zoom in on some specific detail.
Therefore, it would be great if you could do some sort of unscientific, subjective, but still relevant testing in terms of: play the same game on two of the same screens (or swap inputs), XB1 vs. PS4 and XB1X vs. PS4 Pro, shortly after one another, maybe even interchanging – can you subjectively perceive the difference in image quality and performance?
The memory bandwidth listed for the Xbox One S is incorrect. It should be listed as 68 GB/s.
Is it true that the Xbox One X has Freesync support? Are there any issues with that support?
The hardware is supposed to support Freesync, but I don't believe it has been enabled in software yet. We tried a few Freesync displays on the Xbox One X but found no evidence of it actually using the Freesync functionality.
Thanks for the reply Ken.