A couple of weeks back, a developer on Ubisoft's Assassin's Creed Unity was quoted as saying that the team had decided to run both the Xbox One and the PlayStation 4 variants of the game at 1600×900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded in a collection of theories about why that would be the case: were they paid off by Microsoft?
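Just to put numbers on the resolution choice itself: 1600×900 works out to roughly 31% fewer pixels per frame than 1920×1080, which is real headroom on a fixed GPU budget. A quick back-of-the-envelope check (an illustrative snippet, not anything from Ubisoft's code):

```cpp
#include <cstdio>

int main() {
    const long px_1080p = 1920L * 1080; // 2,073,600 pixels per frame
    const long px_900p  = 1600L * 900;  // 1,440,000 pixels per frame
    printf("1080p shades %.0f%% more pixels per frame than 900p\n",
           100.0 * (px_1080p - px_900p) / px_900p); // prints ~44%
    return 0;
}
```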
For those of us who focus more on the world of PC gaming, however, the following week an email sent to the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In this email, while addressing other issues such as the value of pixel count and the stunning visuals of the game, the developer asserted that we may have already peaked on the graphical compute capability of these two new gaming consoles. Here is a portion of the information:
The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. …With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.
What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.
We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts.
So, if we take this anonymous developer's information as true (and this whole story is based on that assumption), then we have learned some interesting things.
- The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920×1080 resolution with Assassin's Creed Unity.
- The Xbox One (after giving developers access to more compute cycles previously reserved for Kinect) is within 1-2 FPS of the PS4.
- The Ubisoft team sees Unity as being "crazily optimized" for the architecture and consoles even as we just now approach the one-year anniversary of their release.
- Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is limited by the 50% that remains to power the AI and everything else.
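To make that last bullet concrete: at a 30 FPS target, each frame gets roughly 33 ms of CPU time, and if half of that is consumed unpacking pre-baked lighting data, everything else has to fit in what remains. A minimal sketch of that budget, with my own illustrative numbers rather than anything from Ubisoft's engine:

```cpp
#include <cstdio>

int main() {
    const double frame_ms       = 1000.0 / 30.0;   // ~33.3 ms of CPU time per frame at 30 FPS
    const double render_prep_ms = 0.50 * frame_ms; // ~16.7 ms: unpacking pre-baked GI data
    const double remainder_ms   = frame_ms - render_prep_ms; // what's left for AI, gameplay, audio...
    printf("30 FPS budget: %.1f ms total, %.1f ms render prep, %.1f ms for everything else\n",
           frame_ms, render_prep_ms, remainder_ms);
    return 0;
}
```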
It would appear that, just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the PlayStation 4 and Xbox One undershoots the needs of game developers who want to build truly "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have already reached its performance limits, that's a bad sign for game developers that really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom-built cores or a Cell architecture – we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that we have seen the more advanced development teams hit peak performance.
If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team is completely off its rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:
| | PlayStation 4 | Xbox One |
|---|---|---|
| Processor | 8-core Jaguar APU | 8-core Jaguar APU |
| Motherboard | Custom | Custom |
| Memory | 8GB GDDR5 | 8GB DDR3 |
| Graphics Card | 1152 Stream Unit APU | 768 Stream Unit APU |
| Peak Compute | 1,840 GFLOPS | 1,310 GFLOPS |
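For reference, those peak compute numbers fall straight out of the shader counts and the widely reported GPU clocks (800 MHz for the PS4, 853 MHz for the Xbox One), since each GCN stream processor can retire one fused multiply-add (two FLOPs) per clock:

```cpp
#include <cstdio>

int main() {
    // peak GFLOPS = stream processors x 2 FLOPs (one FMA) per clock x clock in GHz
    printf("PS4:      %.0f GFLOPS\n", 1152 * 2 * 0.800); // ~1843
    printf("Xbox One: %.0f GFLOPS\n",  768 * 2 * 0.853); // ~1310
    return 0;
}
```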
The custom-built parts from AMD both feature an 8-core Jaguar x86 architecture and either 1152 or 768 stream processors. The Jaguar CPU cores aren't high-performance parts: single-threaded performance of Jaguar trails Intel's Silvermont/Bay Trail designs by as much as 25%, and Bay Trail is powering lots of super-low-cost tablets today, including the $179 ECS LIVA palm-sized mini-PC we reviewed this week. The 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4, and the Radeon R7 250X is faster than what resides in the Xbox One.
If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).
Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to hold up its duties on AI, etc., we likely have hit performance walls on the x86 cores as well.
Even if this developer quote is 100% correct, that doesn't mean the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on the performance efficiency of current-generation hardware, will be coming to the Xbox One, and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next, which is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.
But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is an enormous discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose that route on purpose, shipping a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 now share the same architecture as the PC.
Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?
UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case: regardless of whose hardware is inside the consoles, had Microsoft and Sony still targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher-performance hardware, selling the consoles at a loss out of the gate and properly preparing each platform for the next 7-10 years. And again, the console manufacturers could have done that with higher-end AMD hardware, Intel hardware or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.
This underwhelming graphical performance would seriously undermine my ability to play an enjoyable game of Tetris on either of these consoles!
Now that I've cleared the snark out of my system, I'll leave my actual opinion: maybe this will finally spur the growth of the PC as a gaming platform.
Nice article. The big question is, and this may put everything in context: what does one expect from a gaming console priced at $399-$499? I think Sony and Microsoft got what they asked for… Another thing to consider, and I don't know what the answer is, is how would Ubisoft's new Assassin's Creed Unity fare on the previous generation of consoles? Is the new generation leaps and bounds better? If the answer is yes, then buyers are getting their money's worth. If they want 1080p at better than a consistent 28 FPS, then they need to invest in a PC with the appropriate CPU/GPU combination.
Optimization isn't a Ubisoft strong point. The AC franchise has never been optimized for multi-core. With 1 CPU core it utilizes that well; with 2 cores it's okay; with 4 it only utilizes 2 cores, and the other 2 handle minor tasks. Any more than 4 is a waste.

They would have to improve the core engine, and I don't think that is something Ubisoft is willing to do: spend money on game engine development and optimization when they can just blame it on something other than themselves, like they always do.

It's the same ol' same ol'. When their games don't sell, they run out to blame the pirates.
http://www.hardwarepal.com/assassins-creed-4-black-flag-benchmark-cpu-gpu/8/
If you played the earlier Assassin's Creed games, you will notice that CPU usage is more or less the same. The first core on all of the CPUs works at around 70%, and the last core sits at around 60% load most of the time, while the load on the other cores or threads varies from time to time. The game does use all of the cores and threads, but the engine is programmed in a way that nothing above a quad-core CPU will give you better performance.
Maybe they'll realize how many cores the Xbox One and PS4 have and utilize them, and we PC users won't get the same old crappy Assassin's Creed port we always get.
Who cares? The Assassin's Creed series has been the most anti-Western, Marxist-propaganda-soaked POS series of games ever produced. It's a disgrace to see all those Western kids play that game.
We’re already flooded with games promoting the demographic and cultural genocide of the West.
It’s about time someone turns the tide and destroys this despicable suicidal “modern” culture along with the political and global powers that sustain and promote it for good.
I hear you sir.
I can't stand these goose liver commies. These rich people that push Marxism down our throats.
Is it possible that version 2 of both the Xbone and the PS4 will have a better APU?
No, because that would cause fragmentation.
Why would it? Same architecture, just a better APU.
Maybe now game developers will actually start focusing on making games more creative and engaging rather than just prettier. Nintendo and Valve are prime examples of what you can achieve even with seemingly 'underpowered' hardware.
Ubisoft seems to be getting a lot of flak for this sub-1080p 30 FPS dilemma, with PS4 users saying their console can do it at 1080p and that it's other systems bringing them down. Well, The Order is a PS4 exclusive, a corridor third-person shooter, and from all reports it's 800p @ 30 FPS.

I've got no idea why all these performance issues are showing up, but if first-party devs are struggling, multiplatform devs must be in a world of hurt.
Console gaming, LOL. I did that for a while.

Actually, it was useful for sending my HDTV signal to another room via Cat6.

Can the XB1 do this???
Damn AMD card…every time it quits working I re-seat it in the MOBO and it is good for months…still haven’t figured that one out.
Peak Compute, Peak Google, Peak Twitter
I take it global lighting will not be coming to a console any time soon?
I never did understand why they put such crap CPUs in both consoles! It had to be none other than to save money and give people crap hardware wrapped in a fancy package.

I don't mind paying for decent hardware. I guess people are just cheap, judging by how they freaked out over the PS3's price when it first came out! (They were selling at a loss, for god's sake, you cheap console people!)

What I am getting at is I don't know if we can place 100% of the blame on the console makers or on a public that won't buy anything but pure cheap crap; if they're forced to pay for great hardware, they whine…

They wanted cheap, so they all got a tablet CPU powering their games (mwhahahaha).

Even a half-decent CPU would have lasted them many years.

Take my Haswell i5: this thing will play games just fine for 6+ years easy, plus any games of the (far) future will be DX12, and just like Mantle does on the CPU side, it will let my GPU handle the task with fewer demands on my CPU.

My new GTX 970 will be happy to handle the work.

Plus, on the CPU side of things, consoles have such a low FOV (the CPU is often limited by the number of things that need to be rendered). Console devs use many tricks to get their games running.
———–
Hell, even an Nvidia senior vice president was predicting these very hardware issues way before the consoles even hit the market…
http://www.techradar.com/us/news/gaming/consoles/nvidia-compares-ps4-specs-to-a-low-end-cpu-1138051
I love how he just came out and said it: "we came to the conclusion that we didn't want to do the business at the price those guys were willing to pay."

I count that alone as proof that MS/Sony did not want to pay for decent hardware.

There's no doubt Nvidia would have given them a great deal for millions of units of a great console GPU…

They just didn't want to pay anything more than bottom-of-the-barrel prices, and AMD was the cheap harlot willing to service them.
Going with Nvidia would have meant sourcing the CPU from elsewhere, and that would have brought its own headaches. The APU made sense and I can see more APUs in the console future; hell, even Intel's iGPU might be good enough come PS5/XB2 time. Do we know for a fact that Ubisoft is developing on the console and not porting from PC? From what I've seen of these titles, the visuals are not that compelling…
I think using a CPU from one company and a GPU from another would be a problem, and it has been done before. In the end it will be a custom chip with its very own software; it just so happens that AMD can supply both.
Lol, like they would have been able to put in an expensive i5 and the Nvidia Maxwell that was just released.

Even if that were possible, no one would buy a console with an i5 and Maxwell for what they would cost to build. MS and Sony would have had to sell them at a loss, which is bad business today when people play more on their phones and tablets, which sell for cheap.

The coders have to write better code, it is as simple as that. Few of them seem to know how to code well for even 4 cores. Time for them to stop whining and get the skills. Single-core performance hardly scales anymore with new chips. Coders are simply late to the party.
Maybe VISC will solve that problem.
http://wccftech.com/amd-invest-cpu-ipc-visc-soft-machines/
Most of those who use Steam use PCs with (much) lower specs than those in the consoles. This talk about the consoles hitting the wall is rubbish. Even the highest-end PC will hit the wall today if you go and use ultra settings at the highest resolution possible. What does this mean? Nothing.

Developers are simply doing what every one of us has done for the last 15+ years with a game on our PCs: go into the graphics options and try to find the best mixture of graphical quality and resolution to play the game while keeping a minimum frame rate. Games will always look like they have hit the wall on Xbox One and PS4, because in every game the settings will always be set to use the maximum potential of the consoles.

I don't really get it; why all this fuss over something that has been trivial and normal in PC gaming for over a decade? Everyone seems so surprised at having rediscovered the wheel. Maybe Intel and Nvidia need to convince the game console makers to buy their much more expensive hardware for the next Xbox Two and PS5.
Never forget whose paycheck Ubisoft is on. So you can expect anonymous developers to bash the consoles all day, every day.
Ubisoft signed a contract with Microsoft for AC Unity.

The Xbox One is the leading platform for AC Unity. This forces Ubisoft to make the best gaming experience on the Xbox One. Instead of a DLC exclusive for the Xbox One, they locked down the PS4 version to Xbox One levels. Ubisoft is not in a position to be honest and open about this matter. Ubisoft's PR department is trying to keep gamers calm on this matter, but instead they made it worse (Ubisoft has a habit of doing this).

Ubisoft also rushes games to be ready by the set deadline, not allowing games to be delayed in a late state. This compromises the performance that is possible on the hardware. With more time and dedication, greater performance was absolutely possible. For PC gamers this is more than ever the case, despite the fact that Ubisoft titles are "Nvidia Gameworks". Most Ubisoft games on PC are released in a beta state. Ubisoft also doesn't give AMD and Nvidia time before release to properly optimize the game. This is also very stressful for the Nvidia and AMD driver teams, which have to fix this worst-case scenario when the game is (almost) out.

Ubisoft makes great games but also does a lot of things horribly wrong. Their PR department needs a clean sweep. And they need to hire more technical people for their game engines, to meet Crytek and Unreal standards…

The graphics hardware is mature now (no revolutions, only evolutions), so to utilize its great power a top-tier game engine is more important than ever.
FUD. Typical Ubishit FUD. Nothing more or less than FUD.

Unity "hit peak" because it is, in actuality, absolutely horrendously optimized; there are massive memory leaks and tons of garbage code, akin to what could be seen in Hitman: Absolution and Metro: Last Light. Unity is coded just as poorly as Absolution and Last Light were. Ubishit is spreading this FUD for one reason only: because they know their game is fugly trash, optimization- and content-wise (the game weighs 50GB and requires a 2500K + GTX 680 combo for MINIMAL graphical settings at 1080p; that is ABSOLUTELY clear garbage coding, performed by people who have hands growing out of their ass).

If you seriously believe all of this FUD from Ubicrap, you must be a completely brainwashed zombie.
My first thought when the new consoles were introduced was "bad timing". The 20nm process was not ready for big chips, and 28nm is too old. The consoles were stagnant from the start. Had they been released two years earlier or a year and a half later, they might have enjoyed a nice honeymoon period.
Call me crazy, but something about this story smells funny. And I'm speaking about the Ubisoft side, not necessarily about Ryan's deductions.

I can't help but feel that this current 1080p/900p dust-up with AC Unity lies more with Ubisoft's engine than with the consoles' internals. Even on the best of PCs, Black Flag didn't run as well as it could have.
The consoles of this generation have become – with x86 architecture, a somewhat decent amount of memory and hard drive storage, and a "bloated" PC-ish OS – much more PC-like than any previous generation, but I think there is one more point where the consoles should follow the PC: power design and thermal design.

Many people complain about the weak hardware of the consoles, but at the same time their suggestions (an HD 7950- or HD 7970-based GPU) would be incredibly expensive.

But with better cooling and a more powerful PSU (although the PS4's PSU should be able to support my suggestions), the hardware manufacturers might have been able to unleash a considerable amount of "free" additional performance.

The graphics portion of the PS4's APU is based on the Pitcairn GPU with just two compute units disabled. Now think back about a year: what GPU was always recommended for a relatively cheap gaming build intended for 1080p gaming?

The R9 270X. Usually the R9 270X is clocked at 1,000 MHz or more. The PS4 and the Xbone use – like mobile GPUs normally do – much lower clock speeds, sitting at about 800 MHz.

A full R9 270X delivers up to 2.7 TFLOPS. At 1,000 MHz the PS4's APU would deliver exactly the performance the engine devs at Epic wanted for Unreal Engine 4 at 1080p. Furthermore, an R9 270X Toxic at 1,150 MHz in combination with a Core i7-3960X uses only 252 watts at full load. A PS4/Xbone with fewer compute units and more energy-efficient processor cores should be able to handle the power consumption with the stock PSU.
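For scale, here is the same peak-FLOPS formula from the article's spec table applied to that suggestion (the 1,000 MHz clock is my hypothetical, not a real PS4 configuration):

```cpp
#include <cstdio>

int main() {
    // peak TFLOPS = stream processors x 2 FLOPs per clock x clock in GHz / 1000
    printf("PS4 GPU @ 800 MHz: %.2f TFLOPS\n", 1152 * 2 * 0.800 / 1000.0); // ~1.84, stock
    printf("PS4 GPU @ 1.0 GHz: %.2f TFLOPS\n", 1152 * 2 * 1.000 / 1000.0); // ~2.30, hypothetical
    return 0;
}
```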
The processor part of the APU also could have been beefed up a bit. The desktop Athlon APUs with Jaguar cores can usually reach clock rates of about 2.5 GHz with air cooling and nearly no increase in voltage. Stock out of the box, the Athlon 5350 reaches 2.05 GHz.
I think Microsoft and Sony could have done a better job with only a minor increase in price.
Btw about the consoles being more PC-ish:
On the PS4 only the following hardware can be utilized by game developers:
– 6 CPU cores (1 is reserved for OS, 1 is disabled by default)
– 5 GB RAM
– 1 MiByte L2 cache per cluster
This isn't entirely correct. All eight cores are enabled and active; it's the GPU clusters that have disabled shaders to improve yields. The PS4 has between 4.5 and 5.5GB of GDDR5 memory to use for games, and that ceiling will increase with time as Sony works to optimise their OS more. The Xbox One has a definite 3-5GB split, and it will be more difficult to shift that around because they are using a hypervisor to separate the system OS from the game OS.
http://www.anandtech.com/show/7546/chipworks-confirms-xbox-one-soc-has-14-cus
http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-reveals-the-consoles-real-cpu-and-gpu-specs
http://www.geek.com/games/ps4-gives-5-5gb-of-ram-to-games-out-of-8gb-1563583/
There has been an official presentation from Naughty Dog – a Sony-owned studio – in which they told other game developers about the PlayStation 4's hardware and how to make the best use of it.

I am aware that the die photographs from your links show the full 8 cores (as 8 cores are physically present on the PS4's APU), but they clearly stated that only 6 of the 8 cores can be utilized by game developers.

One of the other two cores is exclusively used by the OS. I assumed that increasing the yield was the reason for the other core not being usable by game developers, as this was essentially done with the Cell processor of the PS3 (one SPU was always disabled).

The disabled GPU compute units can also be found on the die, so it is kind of hard to see what the last core is doing. But maybe it is also reserved for the OS…
OK, apparently two cores are used for the OS… http://ps4daily.com/2013/07/playstation-4-os-uses-3-5-gb-of-ram/
I thought so. Two cores reserved for the OS and peripherals. Perhaps they're waiting to see what the uptake on PS Eye is before they remove the requirement to dedicate a single core to it (if they are dedicating a single core, or most of a single core's CPU cycles, to operating the camera if you have one).
I'm sure there is a ceiling atm, but I would NOT put it past UBISOFT to be a bit overestimating their optimization. PC AC Black Flag was HORRIBLY optimized, and then there was Watch Dogs…
I don't think it was a person from Ubisoft. I also think the 8-10 year console cycle is wrong.

I think Sony does not want to invest too far into the future, as it takes them too long to get the profit they would like back from their investment.

I also expect 4K adoption in about 5 years, when things become cheaper and there are enough products on the market.