For an engine that was first released in late March 2014, Epic has been updating it frequently. Unreal Engine 4.9 is, as the number suggests, the tenth release (including 4.0) in just 17 months, which averages out to less than two months per release. Each release is fairly sizable, too. This one has about 232 pages of release notes, plus a page and a half of credits, and includes changes for basically every system that I can think of.
The two most interesting features, for me, are Area Shadows and Full Scene Particle Collision.
Area Shadows simulates lights that are physically large and relatively close. At the edges of a shadow, the object that casts the shadow blocks only part of the light. Wherever that partial shadow falls will still be lit by the fraction of the light that can see it. The further the shadow falls from the shadow caster, the larger that soft edge gets.
On paper, you can calculate this by drawing rays from each edge of the light, past each edge of each shadow-casting object, and on to the objects that receive the shadows. If every part of the light can see the receiver? No shadow. If no part of the light can see the receiver? That light is fully blocked, which is a hard shadow. If some percentage of a uniform light can see the receiver, then it will be shadowed by 100% minus that percentage. This is costly to do, unless neither the light nor any of the affected objects move. In that case, you can just store the result, which is how “static lighting” works.
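The ray-drawing recipe above can be sketched numerically. This is a toy 2D sampler (hypothetical code, not Epic's implementation): it samples points along an area light, casts a segment from each sample to the receiver, and counts how many are blocked by a single occluding segment.

```python
# Toy 2D area-shadow sampler (hypothetical, not Epic's implementation).
# Sample points along an area light, cast a segment from each sample to the
# receiver, and count how many are blocked by one occluding segment.

def segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 properly crosses segment p3-p4."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(p3, p4, p1), cross(p3, p4, p2)
    d3, d4 = cross(p1, p2, p3), cross(p1, p2, p4)
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)

def shadow_fraction(light_a, light_b, occ_a, occ_b, receiver, samples=64):
    """Fraction of the light blocked at the receiver: 0 = fully lit, 1 = umbra."""
    blocked = 0
    for i in range(samples):
        t = (i + 0.5) / samples
        # Point on the area light for this sample
        sample = (light_a[0] + t * (light_b[0] - light_a[0]),
                  light_a[1] + t * (light_b[1] - light_a[1]))
        if segments_intersect(sample, receiver, occ_a, occ_b):
            blocked += 1
    return blocked / samples
```

A receiver directly behind the occluder reports 1.0 (full shadow), one far off to the side reports 0.0, and one near the shadow's edge reports a fraction in between; that fraction is exactly the "100% minus that percentage" above.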
Another interesting feature is Full Scene Particle Collision with Distance Fields. GPU-simulated particles, which are required for extremely high particle counts, could already collide, but distance fields allow them to collide with objects that are off screen. Since the user will likely be able to move the camera, this allows for longer simulations, as the user cannot cause them to glitch out by, well, playing the game. It requires Shader Model 5.0, though, which limits it to higher-end GPUs.
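As a rough illustration of why distance fields help here (a toy sketch, not UE4's GPU implementation): a signed distance field can answer "how far is this point from the nearest surface?" anywhere in the scene, visible or not, so a particle can detect and resolve penetration without any screen-space information. Here the entire "scene" is a single unit sphere.

```python
# Toy signed-distance-field particle collision (a sketch, not UE4's GPU code).
# The SDF answers "how far am I from the nearest surface?" anywhere in the
# scene, on or off screen. Here the whole scene is one unit sphere at origin.
import math

def scene_sdf(p):
    # Negative inside the sphere, positive outside, zero on the surface.
    return math.dist(p, (0.0, 0.0, 0.0)) - 1.0

def sdf_normal(p, eps=1e-4):
    # Finite-difference gradient of the SDF: points away from the surface.
    return tuple(
        (scene_sdf(tuple(p[j] + (eps if j == i else 0.0) for j in range(3))) -
         scene_sdf(tuple(p[j] - (eps if j == i else 0.0) for j in range(3)))) / (2 * eps)
        for i in range(3))

def collide(p):
    """If the particle has penetrated the surface, push it back out."""
    d = scene_sdf(p)
    if d < 0.0:
        n = sdf_normal(p)
        # Move the particle outward by its penetration depth.
        return tuple(p[i] - d * n[i] for i in range(3))
    return p
```

A particle that wanders inside the sphere gets pushed back to its surface; one outside is untouched. Nothing here depends on what the camera can see.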
This is also the first release to support DirectX 12. That said, when I used a preview build, I noticed a net performance loss with my 9,000-draw-call map (which is a lot of draw calls) on my GeForce GTX 670. Epic calls it “experimental” for a reason, and I expect that a lot of work must be done to map the tasks of an existing engine onto the new, queue-based system. I will try it again just in case something changed from the preview builds. I mean, I know something did: it had a different command line parameter before.
UPDATE (Sept 1st, 10pm ET): An interesting question was raised in the comments that we feel could be a good aside for the news post.
Anonymous asked: I don't have any experience with game engines. I am curious as to how much of a change there is for the game developer with the switch from DX11 to DX12. It seems like the engine would hide the underlying graphics APIs. If you are using one of these engines, do you actually have to work directly with DX, OpenGL, or whatever the game engine is based on? With moving to DX12 or Vulkan, how much is this going to change the actual game engine API?
Modern, cross-platform game engines are basically an API and a set of tools atop it.
For instance, I could want the current time in seconds to a very high precision. As an engine developer, I would make a function — let's call it "GetTimeSeconds()". If the engine is running on Windows, this would likely be ((PerformanceCounter - Initial) / PerformanceFrequency), where PerformanceCounter is grabbed from QueryPerformanceCounter() and PerformanceFrequency is grabbed from QueryPerformanceFrequency(). If the engine is running on Web standards, this would be window.performance.now() / 1000, because that value is provided in milliseconds.
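A minimal sketch of that abstraction, using Python's standard library as a stand-in for the engine (the function and variable names here are my own invention, not engine API):

```python
# Minimal sketch of a GetTimeSeconds()-style abstraction. The platform
# details live in one place; callers only ever see seconds.
import time

_initial = time.perf_counter()

def get_time_seconds():
    # On Windows, time.perf_counter() is itself built on
    # QueryPerformanceCounter() / QueryPerformanceFrequency(); on POSIX it
    # typically wraps clock_gettime(CLOCK_MONOTONIC). Callers never know.
    return time.perf_counter() - _initial

def ms_clock_to_seconds(now_ms, initial_ms):
    # A backend whose clock reports milliseconds (like performance.now())
    # must divide by 1000 to hand back seconds.
    return (now_ms - initial_ms) / 1000.0
```

The point is that everything above this function, tools included, calls `get_time_seconds()` and never cares which clock fed it.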
Regardless of where GetTimeSeconds() pulls its data from, the engine's tools and the rest of its API would use GetTimeSeconds() — unless the developer is low on performance or development time and made a block of platform-dependent junk in the middle of everything else.
The same is true for rendering. The engines should abstract all the graphics API stuff unless you need to do something specific. There is usually even a translation layer for the shader code, be it an intermediate language (or a visual/flowchart representation) that is transpiled into HLSL and GLSL, or shaders written in HLSL and transpiled into GLSL (and eventually SPIR-V?).
One issue is that DX12 and Vulkan are very different from DX11 and OpenGL. Fundamentally. The latter says "here's the GPU, bind all the attributes you need and call draw" while the former says "make little command messages and put them in the appropriate pipe".
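A toy sketch of the difference between the two models (invented class names, nothing like the real APIs' surface area):

```python
# Toy contrast between the two models (invented class names; the real APIs
# are far more involved).

class ImmediateContext:
    """DX11/OpenGL style: the driver acts on each call right away."""
    def __init__(self):
        self.executed = []
    def bind_texture(self, tex):
        self.executed.append(("bind", tex))
    def draw(self, mesh):
        self.executed.append(("draw", mesh))

class CommandList:
    """DX12/Vulkan style: record commands now, possibly on many threads..."""
    def __init__(self):
        self.commands = []
    def bind_texture(self, tex):
        self.commands.append(("bind", tex))
    def draw(self, mesh):
        self.commands.append(("draw", mesh))

class Queue:
    """...then submit whole lists to a queue for the GPU to chew through."""
    def __init__(self):
        self.executed = []
    def submit(self, command_list):
        self.executed.extend(command_list.commands)
```

Nothing "runs" when a CommandList method is called; work only reaches the queue at submit(). That deferral is what lets many CPU threads build rendering work in parallel, and it is also why an engine designed around the immediate model needs real restructuring to benefit.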
Now, for people who license an engine like Unity or Unreal, they probably won't need to touch that stuff. They'll just make objects and place them in the level using the engine developer's tools, and occasionally call various parts of the engine API that they need.
Devs with a larger budget might want to dive in and tweak stuff themselves, though.
Unreal Engine 4.9 is now available. It is free to use until your revenue triggers its royalty clauses.
dat diagram tho
Yeah it wasn't my best work lol.
Looking forward to more Vulkan demos, and the SteamOS versions!
After a year and a half of demoing the DX12 support, they are barely at the experimental phase with it.
New: Experimental DirectX 12 Support
DirectX 12 is now supported as an experimental feature! If you are using Windows 10, try it out by running the engine with “-DX12” on the command line.
With respect, the Unreal Devs have let Microsoft develop the DX12 version, and it has existed as a separate branch from the main Engine. Essentially, they have merged the changes from that branch into the git master branch and fixed the bugs they have encountered. But Epic themselves have not been testing it and working with it on a continual basis for more than a year, and are not ready to declare it ready on a new graphics layer, for a new OS that is still having teething problems with said layer.
They will promote its use after the thousands of projects that use the Engine have banged on it for a bit.
So…does this mean that UE 5 is basically around the corner?
Considering that 90% of all recently announced UE4-based games are still one to three years away from release, this is INSANE if that is what’s taking place.
Nope. 4.10 is the next version, which will probably land in a month or two.
“4.10”? How do you count that? Or are they actually, in all seriousness, going to go by .99?
It's not a decimal number, even though it's period-delimited. It's also not the only place where this happens. As an example, "127.0.0.1". 4.10 would be read "four dot ten", though.
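In code terms, version strings compare component-wise after splitting on the dots, which is why 4.10 comes after 4.9. A quick sketch:

```python
# Version strings compare component-wise after splitting on the dots, which
# is why 4.10 comes after 4.9 even though 4.10 < 4.9 as a decimal number.

def parse_version(s):
    return tuple(int(part) for part in s.split("."))

assert parse_version("4.10") > parse_version("4.9")  # "four dot ten" is newer
assert float("4.10") < float("4.9")                  # ...but smaller as a decimal
assert parse_version("127.0.0.1") == (127, 0, 0, 1)  # same scheme, more parts
```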
I don’t have any experience with game engines. I am curious as to how much of a change there is for the game developer with the switch from DX11 to DX12. It seems like the engine would hide the underlying graphics APIs. If you are using one of these engines, do you actually have to work directly with DX, OpenGL, or whatever the game engine is based on? With moving to DX12 or Vulkan, how much is this going to change the actual game engine API?
Modern, cross-platform game engines are basically an API and a set of tools atop it.
For instance, I could want the current time in seconds to a very high precision. As an engine developer, I would make a function — let's call it "GetTimeSeconds()". If the engine is running on Windows, this would likely be ((PerformanceCounter - Initial) / PerformanceFrequency), where PerformanceCounter is grabbed from QueryPerformanceCounter() and PerformanceFrequency is grabbed from QueryPerformanceFrequency(). If the engine is running on Web standards, this would be window.performance.now() / 1000, because that value is provided in milliseconds.
Regardless of where GetTimeSeconds() pulls its data from, the engine's tools and the rest of its API would use GetTimeSeconds() — unless the developer is low on performance or development time and made a block of platform-dependent junk in the middle of everything else.
The same is true for rendering. The engines should abstract all the graphics API stuff unless you need to do something specific. There is usually even a translation layer for the shader code, be it an intermediate language (or a visual/flowchart representation) that is transpiled into HLSL and GLSL, or shaders written in HLSL and transpiled into GLSL (and eventually SPIR-V?).
One issue is that DX12 and Vulkan are very different from DX11 and OpenGL. Fundamentally. The latter says "here's the GPU, bind all the attributes you need and call draw" while the former says "make little command messages and put them in the appropriate pipe".
Now, for people who license an engine like Unity or Unreal, they probably won't need to touch that stuff. They'll just make objects and place them in the level using the engine developer's tools, and occasionally call various parts of the engine API that they need.
Devs with a larger budget might want to dive in and tweak stuff themselves, though.
In other words great for lazy developers who prefer just changing textures on franchises.
Or for Game Designers that don’t know how to code.
To some extent this is true; the game designer can take the stance that they don’t care about the underlying platform. In practice, this leads to headaches later on.
Understanding what the underlying platform needs allows you to take advantage of that layer, even in an Engine such as Unity, where you don’t have access to the source code.
In the case of UE4, one of the main draws is the access to the source of the Engine, so you can go in and tweak the engine to handle your use case better. And knowing the data and functions that the DX12 and Vulkan APIs handle better allows the developer to tailor the game data for that use case.
This could manifest itself in more particles, perhaps, or more varied 3D models, since the context switch to a new object is much less expensive, as examples off the top of my head.
The fact that both of those APIs leave more of the CPU free means you can add computationally intensive features that you could never consider before; that is how, in AoTS, every single unit in the game has its own AI, as an example from a different game. And with the source code of UE4, this type of game-specific optimization is possible.
Building a game that uses effects that take advantage of the Next Gen APIs is probably the simplest use case that could be thought of. As a case in point, the Distance Field stuff has been in UE4 for several versions, but is just now performant enough to allow it to be used in a shipping game, without hand optimizing it yourself. Just having that available changes the choices some games will make. DX12 and Vulkan will enable more things to be possible, so that new choices will be available for games to use.
If you make your game based on just the Engine, you will run into pitfalls and slowdowns where you have to figure out why something is happening, then either optimize your use case for the Engine, or optimize the Engine for your use case. UE4 allows you to do both.