DX12 Support, GameWorks VR, G-Sync Updates
Even though the GTX 980 Ti doesn't offer any new features compared to the GTX Titan X built around the same GM200 GPU, NVIDIA did touch base with us on a few topics of interest: the current news and hype surrounding DX12's new features, a potential improvement to VR rendering called multi-res shading, and updates to the G-Sync monitor ecosystem.
Some News on DirectX 12
First up, let's talk about DirectX 12. As we near the release of Windows 10 this summer you'll hear more about DX12 than you could ever imagine, with Intel, AMD and NVIDIA all banging the drum loud and clear. At this point, not everything can be divulged, but NVIDIA wanted to be sure we understood that there are two very different aspects of the DX12 story: better CPU utilization and efficiency, and new features that require new hardware. We have all heard the stories (ours included) about the backwards compatibility of DX12 for currently shipping GPUs, but that only accounts for the improved CPU utilization and efficiency portion of DX12. While that is critically important, there are indeed new features that require new GPU hardware to take advantage of, just as in all previous DirectX releases.
In terms of new features, there are currently two different feature levels: Feature Level 12.0 and Feature Level 12.1. Feature Level 12.0 supports new rendering technologies like tiled resources, bindless textures and typed UAV access. Feature Level 12.1 is more advanced; it includes the 12.0 features and adds conservative rasterization and rasterizer ordered views.
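For developers, none of this is guesswork at runtime. Here's a minimal sketch of querying the supported feature level through the public D3D12 API, assuming an already created device (the function and variable names are mine):

```cpp
#include <d3d12.h>

// Returns the highest Direct3D feature level the device supports,
// out of the levels we ask about. Error handling kept minimal.
D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    static const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_12_1,  // conservative raster, rasterizer ordered views
        D3D_FEATURE_LEVEL_12_0,  // tiled resources, typed UAV access, etc.
        D3D_FEATURE_LEVEL_11_0
    };

    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = _countof(requested);
    levels.pFeatureLevelsRequested = requested;

    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_FEATURE_LEVELS, &levels, sizeof(levels))))
        return D3D_FEATURE_LEVEL_11_0; // conservative fallback

    return levels.MaxSupportedFeatureLevel; // e.g. 12_1 on GM200
}
```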
NVIDIA says that GM200 supports another feature as well: volume tiled resources. This extends the tiled resource capability to 3D textures, using less memory by storing only the specific tiles required for rendering at any given time. The tiled resources feature required for Feature Level 12.0 only has to support 2D textures. With a 3D texture, though, a developer can store an additional dimension of data; NVIDIA gave the example of smoke, where the third dimension of the texture might encode the pressure of the fluid, changing the color and the physics response based on that third dimension of data.
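Because volume tiled resources sit above the Feature Level 12.0 baseline, a renderer has to probe for them separately. A sketch of that check, using the same CheckFeatureSupport path – Tier 3 of the tiled resources caps is what maps to 3D texture support:

```cpp
#include <d3d12.h>

// Returns true if the device exposes volume (3D) tiled resources.
// Tiers 1 and 2 cover 2D textures only; Tier 3 extends the feature to Texture3D.
bool SupportsVolumeTiledResources(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options))))
        return false;

    return options.TiledResourcesTier >= D3D12_TILED_RESOURCES_TIER_3;
}
```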
Conservative rasterization improves pixel coverage determination by moving away from specific sample points: a pixel registers as covered if any portion of it is touched by the geometry in question. This comes with a performance penalty, of course, but it can improve coverage recognition for better image quality with new rendering techniques. NVIDIA gave the example of ray traced shadows that are free of aliasing artifacts.
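Conservative rasterization is opted into per pipeline state rather than globally. A rough sketch of that opt-in in D3D12, with all the unrelated PSO setup (shaders, root signature, blend and depth state) omitted:

```cpp
#include <d3d12.h>

// Opt a pipeline state into conservative rasterization, if the hardware
// reports support. Other PSO fields are omitted for brevity.
void EnableConservativeRaster(ID3D12Device* device,
                              D3D12_GRAPHICS_PIPELINE_STATE_DESC& psoDesc)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                &options, sizeof(options));

    if (options.ConservativeRasterizationTier !=
        D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED)
    {
        // A pixel now counts as covered if the triangle touches any part
        // of it, not just the sample point(s).
        psoDesc.RasterizerState.ConservativeRaster =
            D3D12_CONSERVATIVE_RASTERIZATION_MODE_ON;
    }
}
```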
There is still a lot yet to be shown or discussed about DX12, but we can confirm now that Maxwell will support DirectX 12 Feature Level 12.1 as well as the volume tiled resources capability. I'm sure we'll hear AMD's side of this story very soon, and hopefully some news from Microsoft this summer will help us better understand the overall direction of the API.
GameWorks VR
First introduced alongside the launch of the GeForce GTX 980 and GTX 970, NVIDIA's collection of software and hardware technologies focused on VR is now being branded GameWorks VR. Many of these technologies have been discussed before, including VR SLI, Direct Mode and Front Buffer Rendering, but NVIDIA is introducing a new option to improve VR performance called Multi-res Shading.
To understand it, a quick-and-dirty back story: a VR headset includes a screen and a pair of lenses. The lenses make the screen appear further in the distance to help users properly focus on the image. Those lenses warp the image, magnifying the center and compressing it around the edges to produce a fish-eye style effect and a wide viewing angle. To account for this optical warping, the game renders a fish-eye looking image of its own – one you have probably seen if you've looked at any footage recorded from an Oculus Rift. The rendered image is "warped" by the optical lens so that it appears correct to the end user.
The problem is that warping the image in the rendering engine means resolution is effectively lost around the edges of the screen. Though the middle retains a near 1:1 pixel ratio, the edges do not, and you can tell from the diagrams below that much of the pixel data is lost when the image is "warped" and "compressed" to fit the required shape for the VR display. This makes rendering VR games less efficient than rendering for traditional 2D monitors, a disadvantage with real consequences on a platform that demands the fastest possible performance and lowest available latency.
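For the curious, that final warp is typically a radial distortion evaluated per display pixel. A toy version is below – the coefficients are invented, as real headsets ship calibrated values for each lens:

```cpp
// Illustrative only: the kind of radial distortion a VR compositor applies
// when warping the rendered image for the lens. The k1/k2 coefficients here
// are made up; shipping headsets calibrate them per lens.
struct Vec2 { float x, y; };

// "uv" is a display pixel in [-1, 1] around the lens center. The return
// value is where in the *rendered* image we sample for that pixel.
Vec2 DistortSampleCoord(Vec2 uv, float k1 = 0.22f, float k2 = 0.24f)
{
    const float r2    = uv.x * uv.x + uv.y * uv.y;
    const float scale = 1.0f + k1 * r2 + k2 * r2 * r2;
    // "scale" grows with radius: display pixels near the edge sample from
    // further out in the source image, so the periphery of the render is
    // squeezed into fewer display pixels -- the "lost" resolution described above.
    return { uv.x * scale, uv.y * scale };
}
```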
NVIDIA's solution is Multi-res Shading, which divides the image the game engine wants to display into nine different viewports. The center viewport – the one the user will be focused on 99% of the time and the one that doesn't lose pixels to the warping – remains at full resolution and maintains the detail required for a great gaming experience. The surrounding viewports, however, can be adjusted and sized to more closely match the final warped resolution they will be displayed at in the VR headset. The image still has to go through a final warp to the correct shape for the VR lens, but the amount of data "lost" along the edges is minimized, and thus performance can be improved for the gamer.
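NVIDIA's actual interface ships inside the GameWorks SDK, but the 3×3 split itself is easy to illustrate. In the sketch below every name is hypothetical; the point is simply that edge columns shrink horizontally and edge rows shrink vertically while the center cell keeps every pixel:

```cpp
// Hypothetical illustration only -- not NVIDIA's API. The 3x3 split keeps
// the center cell at a 1:1 pixel ratio while the periphery renders at a
// reduced scale, which is where the pixel shading savings come from.
struct MultiResCell {
    float srcWidth, srcHeight; // on-screen area this cell covers
    float rtWidth,  rtHeight;  // pixels actually rendered for it
};

void BuildMultiResGrid(float eyeWidth, float eyeHeight,
                       float centerFraction,  // e.g. 0.6f: center's share of each axis
                       float peripheralScale, // e.g. 0.5f: resolution scale at the edges
                       MultiResCell out[9])
{
    const float border  = (1.0f - centerFraction) * 0.5f;
    const float cols[3] = { border, centerFraction, border };
    const float rows[3] = { border, centerFraction, border };

    for (int r = 0; r < 3; ++r) {
        for (int c = 0; c < 3; ++c) {
            MultiResCell& cell = out[r * 3 + c];
            cell.srcWidth  = eyeWidth  * cols[c];
            cell.srcHeight = eyeHeight * rows[r];
            // Only peripheral columns/rows lose resolution; the center
            // viewport keeps every pixel.
            const float sx = (c == 1) ? 1.0f : peripheralScale;
            const float sy = (r == 1) ? 1.0f : peripheralScale;
            cell.rtWidth  = cell.srcWidth  * sx;
            cell.rtHeight = cell.srcHeight * sy;
        }
    }
}
```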
Maxwell GPUs have a multi-projection capability built into them that accelerates the distribution of geometry to the different viewports in Multi-res Shading, and NVIDIA claims it can offer a 1.3x – 2.0x improvement in pixel shader performance (not necessarily frame rates). The projection can be adjusted with different levels of detail so developers can decide how far the rendered resolution drops based on their performance goals and image quality requirements. I saw a demo of the technology at work during our GTX 980 Ti briefing, and even when specifically looking to the periphery for the lower resolution and detail level along the edges of the render, it was nearly impossible to spot the difference.
This is a technology that requires game engine implementation, so don't go thinking you'll be able to just tick a checkbox in the control panel. It will be packaged up in the GameWorks SDK, and we'll keep an eye out for game engines and developers that integrate it so we can measure the performance gains and image quality for ourselves.
G-Sync Updates: New Monitors and Overdrive Details
I think anyone reading this review or frequenting PC Perspective will already know the basics of G-Sync and what it adds to the smoothness of PC gaming. It's definitive, it's dramatic, and it's available now in several different variations.
One area that has seen a lot of debate recently between G-Sync and AMD's competing FreeSync is overdrive. When Allyn and I first tested the initial wave of FreeSync monitors to make their way to our office, we immediately noticed some distinctive ghosting on the screens when operating in the monitors' variable refresh modes. At the time, very little was known about overdrive with variable refresh displays, and though I knew that NVIDIA was doing something with G-Sync to help with ghosting, I didn't know exactly what or to what extent. During our briefing with Tom Petersen on the GTX 980 Ti, he was able to share some more details.
At its most basic, monitor overdrive is the process of driving a pixel to a higher or lower voltage than the target value actually requires in order to get that pixel to its desired state (color and brightness) faster. This moves pixels from their previous color/brightness to the new one more quickly, thus reducing ghosting. With traditional monitors and fixed refresh rates, panel vendors have perfected the timing of the twisting and untwisting of LCD crystals to account for overdrive.
But with variable refresh rates, that all gets turned on its head; the rate at which you apply power to a pixel at a 40 Hz refresh is very different than at 80 Hz, for example, if you want a picture free of both ghosting and inverse ghosting. NVIDIA claims that because the G-Sync module in desktop monitors is tuned specifically to each panel, it is able to dynamically apply overdrive based on the variable frame rate itself. Because the module can estimate when the next frame will appear (at the most basic level, by guessing it will match the previous frame time), it can more intelligently apply voltage to the panel to reduce ghosting and give users the best possible picture. Petersen said that it appeared none of the FreeSync monitors were doing this, and that is why we see more ghosting at variable refresh rates on those displays.
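NVIDIA hasn't published how the module's overdrive logic actually works, so treat the following as nothing more than a sketch of the concept Petersen described: pick a drive strength from a panel-specific calibration based on a prediction of when the next frame will arrive, using the previous frame time as the simplest possible predictor.

```cpp
// Hypothetical sketch of variable-refresh overdrive -- not how the G-Sync
// module is implemented. A real panel would carry a full per-transition
// lookup table tuned by the vendor; this toy version interpolates a single
// drive gain across the panel's VRR window.
#include <algorithm>

struct PanelCalibration {
    float minIntervalMs = 1000.0f / 144.0f; // fastest refresh (144 Hz)
    float maxIntervalMs = 1000.0f / 30.0f;  // slowest refresh (30 Hz)
    float gainAtMin     = 1.35f; // strong overshoot when frames come quickly
    float gainAtMax     = 1.05f; // gentle overshoot at slow refresh
};

// Returns the overdriven target for a gray-to-gray transition (0-255).
float OverdriveTarget(const PanelCalibration& cal,
                      float fromLevel, float toLevel,
                      float prevFrameTimeMs) // simplest predictor: last frame time
{
    // Clamp the prediction to the panel's valid VRR window.
    float t = std::clamp(prevFrameTimeMs, cal.minIntervalMs, cal.maxIntervalMs);
    // The less time the liquid crystal has to settle, the harder we push
    // past the target level; too much drive at slow refresh causes the
    // inverse ghosting mentioned above.
    float a      = (t - cal.minIntervalMs) / (cal.maxIntervalMs - cal.minIntervalMs);
    float gain   = cal.gainAtMin + a * (cal.gainAtMax - cal.gainAtMin);
    float driven = fromLevel + (toLevel - fromLevel) * gain;
    return std::clamp(driven, 0.0f, 255.0f);
}
```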
With the NVIDIA 352.90 driver the company will also be adding a couple of requested features to G-Sync: windowed mode and V-Sync options above the display's maximum refresh rate. For those gamers that like to use windowed and borderless windowed modes to play with other things going on on their display, or even on other monitors, NVIDIA has found a way to work with the DWM (Desktop Window Manager) to allow non-full-screen games to operate in a variable refresh mode. NVIDIA is likely doing some behind-the-scenes trickery to get this to work properly inside the Windows compositing engine, but we saw it in action, and VRR operation is controlled by the application in focus. If you move from the game to a browser, for example, you'll essentially return to the static refresh rate provided by the Windows display model.
Two new options found their way into the control panel as well: G-Sync will now let you set V-Sync on or off above the maximum refresh rate of the panel (hurrah for peer pressure!), and you can now enable ULMB directly. The V-Sync change applies only at the high end of the monitor's refresh range and lets a user disable V-Sync (and thus introduce horizontal tearing) in order to gain the lowest latency the system can muster. This basically matches what AMD has done with FreeSync, though NVIDIA's G-Sync still has a superior implementation of low frame rate handling, as we demonstrated here.
Oh, and NVIDIA G-Sync Mobile is now a thing! Just as we showed you back in January with a leaked driver and an ASUS notebook, module-less G-Sync is a reality and will be shipping this summer. Check out this news story for more details on the mobile variant of G-Sync.
With those educational tidbits in mind, perhaps more interesting is that new G-Sync monitors are incoming with new aspect ratios and different specifications than we have seen before.
Acer has four new displays on the horizon, with ASUS adding three more to the mix. In that group are a pair of 4K 60 Hz IPS G-Sync monitors, a 34-in 3440×1440 IPS screen with a 75 Hz maximum refresh, and an updated ROG Swift with a 2560×1440, 144 Hz IPS screen. I am really eager to get my hands on the Acer X34 with its curved 21:9 screen and 75 Hz refresh – that could be the pinnacle of gaming displays for the rest of 2015.