Last Wednesday we reported on the announcement of the Crysis 2 DX11 patch and high-resolution texture pack, due out on the 27th of June. Looking at the calendar, it appears your graphics card just ran out of time to rule the roost. Clocking in at 546 megabytes for the DirectX 11 update and 1695 megabytes for the high-resolution texture pack, the new downloads are not small, especially since that does not include the size of the 1.9 patch itself. The big question is whether these updates will push the limits of your computer, and if so, is it worth it?
Can you run me now? … Hello?
VR-Zone benchmarked the new updates on an Intel Core i7-965 system paired with an NVIDIA GeForce GTX 580. We believe they accidentally swapped the labels on their Extreme Quality and Ultra Quality benchmarks, since Ultra is the more intensive of the two settings; Ultra should also show the biggest difference between DX9 and DX11, as DX11 effects are not enabled at the Extreme settings. ((Update: 6/28/2011 – That’s exactly what happened. VR-Zone fixed it; it is correct now.)) Under that assumption, you are looking at approximately 40 FPS for a 1080p experience with that test system and all the eye candy enabled. That is a drop of approximately 33% from its usual 60 FPS under Extreme settings.
But how does it look? Read on for all of that detail.
Okay, I believe I have teased you long enough (clicking one extra link, oh noes!) and it is time for Crytek to showcase the game. Be sure to go full screen and switch to high definition to get the best idea of what we are talking about.
I don’t think this will end up on the 360.
Okay so concentrate on my voice. Are you back with us? Good. We will now go over the features in detail.
The best-known addition to DirectX 11, and the first non-advertisement portion of the video, is tessellation and displacement mapping. Truly curved geometry is impossible to represent exactly on a computer, so we approximate it, typically with flat triangles. The logic is sound; the problem comes when a triangle is larger than a pixel (or subpixel, for antialiasing), often much larger, because then we can see the flatness of the geometry. We have been struggling to discover methods to represent more geometry with less memory and computation ever since. The main methods videogames have used thus far are:
- Simply leaving geometry as all flat
- Simple shading between “smooth” vertices
- Bump mapping, a texture of “high” and “low” points on a triangle
- Normal mapping, basically a bump map with directional information (not just “high” and “low”) for other tricks
- Parallax/occlusion mapping, normal mapping with a height texture so a material can occlude itself, like fins on a grill
- Tessellation/displacement mapping, breaking the triangle down into smaller triangles and shifting them based on a height texture
The latest addition, tessellation/displacement mapping, is the first method to smooth an object's silhouette without sacrificing in-surface detail. Every previous method could help a cylinder look smoother when viewed side-on, but a hexagonal cylinder would still look like a hexagon when viewed end-on; not anymore. A hexagon can be tessellated into an octagon, a decagon, a hectogon, and so forth. Another advantage of displacement mapping is that, on top of being more efficient than simply using higher-detail geometry, you can decrease the amount of tessellation as the object gets smaller on screen.
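If you are curious how that subdivide-and-displace idea works, here is a toy Python sketch (my own illustration, not Crytek's code): start with a hexagon and, at each tessellation level, split every edge in two and push the new midpoint out onto the true circle, which stands in for the height texture. Each level doubles the vertex count and shrinks the visible flatness of the silhouette.

```python
import math

def tessellate_circle(verts, levels):
    """Subdivide each edge and displace the new midpoint onto the
    unit circle -- a toy stand-in for displacement mapping, where
    the 'height texture' is the true curved surface."""
    for _ in range(levels):
        out = []
        n = len(verts)
        for i in range(n):
            x0, y0 = verts[i]
            x1, y1 = verts[(i + 1) % n]
            mx, my = (x0 + x1) / 2, (y0 + y1) / 2
            m = math.hypot(mx, my)
            out.extend([(x0, y0), (mx / m, my / m)])  # push midpoint to radius 1
        verts = out
    return verts

def max_flatness_error(verts):
    """Worst-case gap between a flat edge's midpoint and the true circle."""
    n = len(verts)
    err = 0.0
    for i in range(n):
        x0, y0 = verts[i]
        x1, y1 = verts[(i + 1) % n]
        err = max(err, 1.0 - math.hypot((x0 + x1) / 2, (y0 + y1) / 2))
    return err

hexagon = [(math.cos(2 * math.pi * k / 6), math.sin(2 * math.pi * k / 6))
           for k in range(6)]
dodecagon = tessellate_circle(hexagon, 1)  # 6 -> 12 vertices
finer = tessellate_circle(hexagon, 3)      # 6 -> 48 vertices
```

The payoff is exactly the scaling trick mentioned above: pick `levels` per frame based on how big the object is on screen, and distant objects stay cheap while close-ups stay round.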
Another major advantage, though not quite as lauded by Crytek, is the advancement in tone mapping. People have long complained about HDR's lack of perceived benefit. In the old days of videogames, developers used to tweak their engines for semi-accurate lighting either indoors or outdoors; for an example of this, look at Doom 3 outdoors or Battlefield 2 indoors. Halo 2 actually used two different lighting profiles and switched back and forth as you moved inside and outside. HDR was invented to allow the engine to understand the full range of lighting and change its scale as necessary. Now that we have all that extra lighting information, tone mapping allows us to see details in the very bright as well as the very dark, creating a much richer image.
Tone mapping enhancements off
Tone mapping enhancements on
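To make the idea concrete, here is the simplest global tone mapping curve, the classic Reinhard operator, in a short Python sketch. This is only an illustration; whatever curve CryEngine 3 actually uses is certainly more elaborate, but the principle is the same: compress an unbounded HDR luminance into displayable range without flattening the bright end the way a naive clamp does.

```python
def clamp(l):
    """Naive low-dynamic-range approach: anything brighter than
    'white' is simply cut off, so all bright detail flattens out."""
    return min(l, 1.0)

def reinhard(l):
    """Global Reinhard tone mapping operator: squeezes any
    non-negative HDR luminance into [0, 1) while leaving dark
    values nearly linear, so both extremes keep their detail."""
    return l / (1.0 + l)

# Two bright HDR luminances (arbitrary example values)
sky, sun_glint = 4.0, 8.0
```

With the clamp, the sky and the sun glint both come out as identical pure white; through the Reinhard curve they map to two distinct displayable values, which is exactly the "details in the very bright" benefit described above.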
Another advancement, although somewhat minor, is a more accurate and artist-driven depth of field blur. Real depth of field is not a Gaussian blur; it depends on the aperture and the lens. If the aperture is not circular but rather a polygon, due to the mechanics of the camera, it will tend to create distortions within the blur that resemble the shape of the aperture. As I said, it affects the blur itself too, but that is the most noticeable part. Now an artist can create a custom “bokeh” shape and have that drive the blur. In the following screenshot, the artist appears to have used an octagonal bokeh to mimic a camera with an octagonal aperture.
Bokeh depth of field, octagonal bokeh highlighted in red
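To picture what an octagonal bokeh means in practice, here is a small Python sketch (my own illustration; Crytek's version runs as a GPU shader and is far more sophisticated) that builds the footprint a defocused point of light would smear into: the set of pixel offsets inside a regular octagon, found by clipping a square of candidate taps against the octagon's eight half-planes.

```python
import math

def octagon_kernel(radius):
    """Pixel offsets inside a regular octagon of the given radius --
    the shape a defocused point light smears into when the camera's
    aperture is an eight-bladed polygon."""
    # Outward normals of the octagon's eight faces
    normals = [(math.cos(math.pi * k / 4 + math.pi / 8),
                math.sin(math.pi * k / 4 + math.pi / 8)) for k in range(8)]
    inradius = radius * math.cos(math.pi / 8)
    taps = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            # Keep the tap if it is on the inner side of all eight faces
            # (tiny epsilon so boundary taps are not lost to rounding)
            if all(dx * nx + dy * ny <= inradius + 1e-9 for nx, ny in normals):
                taps.append((dx, dy))
    return taps

taps = octagon_kernel(4)
```

Averaging the out-of-focus image over these taps, instead of a circular or Gaussian neighborhood, is what makes bright highlights bloom into little octagons.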
Water rendering, always a notable virtue of CryEngine, also took another leap forward with water physics effects as well as subsurface scattering. Subsurface scattering imitates what happens when light enters a translucent material and scatters inside it, potentially even scattering back out of the surface. The effect is most noticeable with skin and with large bodies of water. It has been faked many times in the past, including with Photoshop “render clouds” tricks for skin textures and colored fog just under the surface of water, but now it is much more real.
Above water, greenish subsurface scattering circled in red
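Crytek has not published their exact shader, but the cheapest classic stand-in for subsurface scattering, so-called "wrap" lighting, captures the core idea in a few lines of Python (again just an illustration, not CryEngine code): instead of light cutting off hard at the shadow terminator, it wraps a little past it, as if it had scattered through the material and leaked out the shaded side.

```python
def lambert(n_dot_l):
    """Standard Lambert diffuse: no light at all past 90 degrees,
    giving opaque materials their hard terminator."""
    return max(n_dot_l, 0.0)

def wrap_diffuse(n_dot_l, wrap=0.5):
    """'Wrap' lighting, a classic cheap approximation of subsurface
    scattering: the terminator is pushed past 90 degrees and the
    falloff softened, mimicking light leaking through the surface.
    n_dot_l is the cosine of the angle between normal and light."""
    return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)
```

A point just past the terminator (`n_dot_l` slightly negative) is pitch black under plain Lambert but still gently lit under wrap lighting, which is exactly the soft glow you see on skin and shallow water.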
Shadowing also took a major leap forward with the addition of variable penumbra, the blurring that occurs because a light source has physical size. If one side of a light source can light up an object but the other side is blocked by an in-between object, you get partial shadowing: some light still reaches the object. Originally, developers would blur shadows and cross-fade between the two extremes, but now it is computed a bit more directly.
Shadows with penumbra
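Crytek has not said exactly how their penumbras are computed, but the standard estimate (the one used by percentage-closer soft shadows, PCSS) falls straight out of similar triangles, as this little Python sketch shows: the bigger the light, and the farther the shadow receiver sits behind the blocker, the wider the soft edge.

```python
def penumbra_width(light_size, d_blocker, d_receiver):
    """Similar-triangles penumbra estimate (as in PCSS):
    width = light_size * (d_receiver - d_blocker) / d_blocker.
    Distances are measured from the light; a shadow at the point
    of contact (d_receiver == d_blocker) is perfectly sharp."""
    return light_size * (d_receiver - d_blocker) / d_blocker
```

This single formula is why contact shadows stay crisp while shadows cast far across the ground go soft, the behavior the screenshot above is showing off.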
Various blurs were also pretty difficult in DirectX 9 due to the lack of control when using HDR rendering. HDR rendering is known to cause serious problems (such as below, with antialiasing in Bioshock: errors occur between surfaces lit by very different intensities, circled in red, but the effect is properly performed otherwise, circled in blue) because the majority of the process is hidden from the programmer in the DirectX API. Older motion blurs tended to be very streaky and unconvincing, but with DirectX 11 artists can do better.
Bioshock Antialiasing (HDR limitations)
Crysis 2 HDR-friendly motion blur
Lastly, particles were also given a bit of a facelift with motion blur and shadows. Not much to say about them; they just look prettier.
So I guess this means that Crysis 2 will finally be the game, for now, that you use to show off your PC’s horsepower. Put that slacker to work!