Impressions, Possible Bugs, and Conclusion

Impressions:

My initial impressions closely mirror what was reported by one of our G-Sync contest winners. Coming from a 60 Hz panel, there is a *huge* drop in latency and a corresponding increase in fluidity when gaming at 120+ Hz refresh rates, especially if the game's FPS can climb that high as well. The higher you go on refresh rate, the harder it is to see tearing, even with lower FPS coming from the game and with vsync off. In those cases, torn frames are only on-screen for a fraction of the time, quickly replaced with a non-torn version of the new frame on the very next scan (only ~7 msec later when refreshing at 144 Hz). At those speeds, tearing goes from smacking you in the face to something you have to actively look for.

Even though higher refresh rates attempt to solve the tearing issue, they do not correct the aggregate latency that comes from the GPU completing frame renders out of sync with display refreshes. If you've experienced the difference in feel going from a 60 Hz to a 120+ Hz panel, the best way to describe the transition to G-Sync is as a half-again gain over that 60-to-120 Hz change in feel, though those perceptions mainly apply to gaming above 120 Hz. If all games ran at 120+ FPS without issue, there would be much less of a need for G-Sync, but that's not reality, as game developers tend to target 60 FPS.

With G-Sync eliminating the tearing that is so painfully obvious at 60 Hz rates, while keeping latency lower than even vsync-off levels, the effect becomes increasingly apparent as you drop toward and even below 60 FPS. The difference is immediately noticeable in just about any game with settings adjusted to achieve a 40-60 FPS range. The latency improvement is half again what you would get from toggling vsync off on a 60 Hz display, yet every frame is drawn as if vsync were on. Games feel just as playable and responsive at roughly 75% of the FPS you would have to target on a non-G-Sync display.

When tuning game settings for improved image quality, I got a bit overzealous and dialed everything up to eleven in Crysis 3 and Metro Last Light, and everything was still surprisingly responsive even at ~20 FPS (I'm on a GTX 680). Panning around a scene went from a laggy mess to what I can best describe as a feeling that the display was 'leading the mouse'. I wouldn't recommend trying to game at frame rates this low, since render times add their own latency, but it remains a definite and demonstrable improvement over the old tech.

The 30 Hz concern:

After my tinkering, I read about a supposed issue where G-Sync 'reverted back to vsync-on' at frame rates below 30 FPS. This surprised me, especially after testing as low as 16 FPS with no noticeable lag issues (beyond that expected from the increased render time at such a low FPS). I re-tweaked settings to shoot for a value hovering around 30 FPS, and then I suddenly started to notice something odd. I'll start with our theory as to the cause and work toward the effect:

We know from our talks with NVIDIA that the LCD panel must be refreshed at no less than 30 Hz, i.e. at intervals no longer than 33 msec. G-Sync forces an additional redraw of the same frame if the new render does not complete within 33 msec. Assuming the panel has just forced a re-display of the previous frame, the next frame would have to wait until that redraw completes (7 msec, assuming 144 Hz). This creates a 'dead zone' window of time between 33 msec (30 Hz) and 33+7 msec (~25 Hz). Frames rendered at a rate within that 25-30 FPS window would, in theory, be forced to wait for that additional redraw to complete before they could be displayed. Depending on the game, video settings, and GPU speed, a scene may intermittently dip through this dead band and therefore introduce an intermittent latency, which would be worst (7 msec) at just below 30 FPS. Once you get all the way down to 25 FPS, the issue should subside completely, which would explain why my testing at 16-20 FPS seemed fine. Bouncing between various GPU-heavy games, I found a batch of settings for Metro Last Light that dipped through the theoretical 'dead band' several times within the same benchmark session. Here is the picture to go with the thousand words written above:

Note the distinct plateaus across the 25-30 FPS band. That center area of unhappiness feels like intermittent lag and introduces a judder-type effect on smooth pans within that range.
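
To make the arithmetic behind that plateau concrete, here is a minimal sketch (my own illustration, not NVIDIA's actual logic) of how a 33 msec forced-refresh timeout and a ~7 msec scan time on a 144 Hz panel produce the 25-30 FPS dead band:

```python
# A quick back-of-the-envelope model of the forced-refresh behavior described above.
# The 33 msec (30 Hz) timeout and ~7 msec (144 Hz) scan time come from the article;
# the function itself is my own simplification, not NVIDIA's implementation.

def effective_frame_time(render_ms, max_wait_ms=1000 / 30, scan_ms=1000 / 144):
    """Return the time at which a newly rendered frame can actually be scanned out."""
    if render_ms <= max_wait_ms:
        # The new frame arrived before the forced refresh kicked in: display it immediately.
        return render_ms
    # The panel already began a forced redraw of the previous frame at max_wait_ms;
    # the new frame must wait for that scan to finish before it can be shown.
    forced_redraw_done = max_wait_ms + scan_ms
    return max(render_ms, forced_redraw_done)

for fps in (35, 30, 28, 26, 25, 20):
    render = 1000 / fps
    shown = effective_frame_time(render)
    print(f"{fps:2d} FPS: rendered in {render:5.1f} msec, displayed at {shown:5.1f} msec "
          f"(+{shown - render:3.1f} msec added latency)")
```

Plugging in different render times shows added latency only inside the 25-30 FPS window, peaking at just under 7 msec right below 30 FPS, which lines up with the plateaus in the chart.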

I suspected this issue might be correctable by having the driver dynamically shorten the forced-refresh delay when rendering near the 30 FPS range. After further discussion with NVIDIA, it turns out the 30 FPS plateau was a design decision for this first generation of G-Sync as it applies to this specific ASUS panel. Future iterations may still exhibit this quirk, but with a longer forced-refresh wait time, which would push the dead band down to even lower FPS ranges. NVIDIA designed G-Sync around a 40-60 FPS target, and it's worth noting that FPS can dip into the 30's without issue. The reduced latency of G-Sync may tempt gamers to crank video settings up further than they normally would, and I believe that is why so many folks are reporting this effect at <30 FPS. For the moment I'd recommend shooting for the 40-60 FPS range, with the hope that future G-Sync panels will be able to force this band down below 20 FPS, which should reduce the possibility of entering it in even the most demanding scenarios.
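
As a rough illustration of that point, feeding a hypothetical 50 msec (20 Hz) forced-refresh floor into the sketch above shifts the added-latency window down to roughly 17.5-20 FPS:

```python
# Hypothetical future panel with a 20 Hz (50 msec) forced-refresh floor: the dead
# band moves from 25-30 FPS down to roughly 17.5-20 FPS.
print(effective_frame_time(1000 / 19, max_wait_ms=50))  # ~56.9 msec for a 52.6 msec render
```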

Conclusion:

I'm extremely impressed with the build quality of the G-Sync installation kit, as well as with the product and technology itself. The installation should be possible for any do-it-yourselfer, and the included documentation took away all of the uncertainty of venturing into unknown waters. My biggest gripe / hope is to see this technology applied to larger, more color-accurate panels. I'd pay good money for a 2560×1600 IPS G-Sync panel, even if the max refresh rate were only 100 or even 85 Hz.
