Inside and Outside the VRR Window

Discussing AMD FreeSync Experience Zones – Inside and Outside the VRR Window

Okay, so I just made that term up: Experience Zones. But it seems to make sense. There are three distinct zones in which we want to compare the gaming experience on FreeSync and G-Sync monitors to get a good gauge of how well each technology lives up to its claims. First, we need to look inside the VRR window, the zone in which the monitor vendor and AMD claim that variable refresh will work without tearing and without stutter. On the LG 34UM67, for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates higher than the 75 Hz maximum refresh rate. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target, which in this example is 48 FPS.

AMD FreeSync offers the gamer more flexibility around this VRR window than G-Sync does. Both above and below the variable refresh range, AMD lets gamers choose whether VSync is enabled or disabled, and that setting is honored just as it is on a standard display whenever the game's frame rate falls outside the VRR window. So, for our 34UM67 example, if your game can render at 85 FPS you will either see tearing on screen (with VSync disabled) or be locked to 75 FPS, matching the top refresh rate of the panel itself. If your game is rendering at 40 FPS, below the minimum of the VRR window, you will again see either tearing (with VSync off) or potential stutter and hitching (with VSync on).
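To make those three zones concrete, here is a minimal sketch of the behavior described above, using the 34UM67's 48-75 Hz window as the illustration. The function and its names are my own for demonstration purposes, not anything from AMD's driver:

```python
# Hypothetical sketch of FreeSync behavior in the three "Experience Zones"
# described above. The 48-75 Hz window matches the LG 34UM67; the names
# and structure are illustrative only, not AMD's actual driver logic.

VRR_MIN_HZ = 48
VRR_MAX_HZ = 75

def freesync_behavior(game_fps: float, vsync_on: bool) -> str:
    """Return the expected on-screen behavior for a given frame rate."""
    if VRR_MIN_HZ <= game_fps <= VRR_MAX_HZ:
        # Inside the window: refresh tracks the frame rate exactly.
        return f"variable refresh at {game_fps:.0f} Hz (no tearing, no stutter)"
    if game_fps > VRR_MAX_HZ:
        # Above the window: fall back to the user's VSync preference.
        if vsync_on:
            return f"frame rate capped at {VRR_MAX_HZ} FPS (VSync on)"
        return f"tearing at {game_fps:.0f} FPS on a {VRR_MAX_HZ} Hz refresh"
    # Below the window: the classic VSync trade-off applies again.
    if vsync_on:
        return f"stutter/judder at {VRR_MIN_HZ} Hz (VSync on)"
    return f"tearing at {game_fps:.0f} FPS on a {VRR_MIN_HZ} Hz refresh"

for fps, vsync in [(60, True), (85, False), (85, True), (40, False), (40, True)]:
    state = "on " if vsync else "off"
    print(f"{fps:>3} FPS, VSync {state}: {freesync_behavior(fps, vsync)}")
```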

So let’s talk about that. Though no such product exists, imagine, if you will, a matching 2560×1080 75 Hz monitor with G-Sync support. When your game renders above 75 Hz, say the same 85 FPS mentioned above, you would be forced into a VSync-enabled state; NVIDIA has decided that VSync on at the peak refresh rate of the monitor is the best experience for the gamer. On the AMD FreeSync monitor, though, you could instead choose to disable VSync, bringing back horizontal screen tearing but giving you the lowest possible input latency and the highest possible rendered frame rates. It’s likely that many gamers would choose to disable VSync in this case; it’s the typical PC gamer configuration on standard displays, and tearing is less likely to impact your experience when frame rates and refresh rates are both high.

But what happens with this FreeSync monitor and our theoretical G-Sync monitor below the window? AMD’s implementation again gives you the option of enabling or disabling VSync. On the 34UM67, as soon as your game’s frame rate drops under 48 FPS you will either see tearing on screen or begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns rear their heads again. At these low frame rates, below the window, those artifacts impact your gaming experience much more dramatically than they do at high frame rates above the window.

G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display automatically re-refreshes the screen when the frame rate dips below the panel’s minimum refresh, where the pixels would otherwise be affected by flicker. So, on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate reaches 29 FPS the display is actually refreshing at 58 Hz: each frame is “drawn” one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module drawing each frame four times, taking the refresh rate back up to 56 Hz. It’s a clever trick that preserves the goals of VRR and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work; hence the current implementation in a G-Sync module.

This graph represents a scenario in a game where the frame rate is dropping over some amount of time, indicated by the blue line moving from right to left. It could be the gamer entering a very intense action scene, for example, with hardware that is seeing frame rates slowly dropping from ~40-50 FPS down to 10 FPS.

I made this diagram to demonstrate what we think is happening with G-Sync based on our testing. I should be clear that NVIDIA did not give us this information, nor did it confirm whether we are accurate. If correct, though, as the game frame rate decreases the module will double, triple, quadruple, etc. its frame draws to maintain a draw rate above 30 Hz. There are interesting edge cases with this method too, including when to redraw an existing frame from local memory if another new frame from the GPU is "nearly" ready. It would appear that NVIDIA has been, and will continue, tweaking this algorithm in an attempt to keep the user experience optimal.
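To put numbers to that doubling/tripling behavior, here is a small sketch of one multiplier rule that reproduces our three measured data points (29 FPS → 58 Hz, 25 FPS → 50 Hz, 14 FPS → 56 Hz). This is strictly inference on our part: NVIDIA has not published the algorithm, and a naive floor at the panel's 30 Hz minimum would predict 14 × 3 = 42 Hz rather than the 56 Hz we measured, so the sketch uses a higher floor (48 Hz) back-solved from those measurements:

```python
# A sketch of frame-multiplication logic inferred from our measurements.
# NVIDIA has not confirmed any of this; the floor value is a guess tuned
# to reproduce the three data points above (29->58, 25->50, 14->56 Hz).
import math

REFRESH_FLOOR_HZ = 48  # inferred; the nominal panel minimum is 30 Hz

def gsync_redraws(game_fps: float) -> tuple[int, float]:
    """Return (draws per frame, effective refresh) below the VRR minimum."""
    if game_fps >= REFRESH_FLOOR_HZ:
        return 1, game_fps  # inside the window: one draw per rendered frame
    # Repeat each frame just enough times to lift the refresh over the floor.
    multiplier = math.ceil(REFRESH_FLOOR_HZ / game_fps)
    return multiplier, multiplier * game_fps

for fps in (29, 25, 14):
    draws, hz = gsync_redraws(fps)
    print(f"{fps} FPS -> each frame drawn {draws}x -> panel refresh {hz:.0f} Hz")
```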

** Edit by Allyn **

Ryan is asleep after a long night of writing this article, but after spending more time tinkering with these panels back at home base, I have generated the following chart for your viewing pleasure:

I'll start by explaining FreeSync. The red dashed line represents when FreeSync is in its variable window (Ryan's 'Experience Zone'). For the BenQ panel, that zone is 40-144 Hz. Outside of that zone we shift to the orange dashed line. From the observations and measurements I've taken, the BenQ panel 'sticks' at 40 Hz when game FPS drops below 40. In that situation, the BenQ behaves like a fixed 40 Hz refresh rate display and does what you would expect with V-Sync on or off (judder or tearing). I will say that since it is refreshing at a relatively low rate, the judder / tearing is more pronounced than it would be on a regular 60 Hz LCD. On the high end, the BenQ remains at 144 Hz when the game output exceeds 144 FPS, again following the V-Sync setting in the AMD driver, meaning it will either tear or judder, but at such high frame rates it is much harder to perceive either effect.

Now for G-Sync. I'll start at the high end (black dashed line). With game output above 144 FPS, an ROG Swift sticks at its rated 144 Hz refresh rate and the NVIDIA driver forces V-Sync on above that rate (not user selectable at present). This does produce judder, but it is hard to perceive at such a high frame rate (it is more of an issue for 4K / 60 Hz G-Sync panels). The low end is where the G-Sync module kicks in and works some magic, effectively extending the VRR range (green line) down to as low as a 1 FPS input while remaining in a variable refresh mode. Since LCD panels have a maximum time between refreshes that cannot be exceeded without risk of damage, the G-Sync module inserts additional refreshes in between the incoming frames. On current generation hardware this occurs adaptively, in such a way as to minimize the possibility of a rendered frame colliding with a panel redraw already in progress. It's a timing issue that must be handled carefully, as frame collisions with forced refreshes can lead to judder (as we saw with the original G-Sync Upgrade Kit; since corrected on current displays). Further, the first transition (passing through 30 FPS on the way down) results in an instantaneous change in refresh rate from 30 to 60 Hz, which on some panels produces a corresponding change in brightness that may be perceptible depending on the type of panel and the visual acuity of the user. It's not a perfect solution, but given current panel technology it is the best way to keep variable refreshes happening at rates below the panel's hardware limit.
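As a rough way to reproduce the lines in the chart, the sketch below models the effective panel refresh for both displays across a range of game frame rates. The panel ranges come from the text above; the G-Sync multiplier rule is my inference (using the Swift's 30 Hz panel minimum as the floor here for simplicity, though as Ryan noted the real target may be higher), not anything NVIDIA has confirmed:

```python
# A rough model of the chart above: effective panel refresh vs. game FPS
# for the BenQ (FreeSync, 40-144 Hz) and ROG Swift (G-Sync, 30-144 Hz).
# The below-window multiplier rule for G-Sync is inferred, not confirmed.
import math

def benq_refresh(fps: float) -> float:
    # FreeSync: refresh is clamped to the panel's fixed 40-144 Hz window,
    # so below 40 FPS it behaves like a fixed 40 Hz display.
    return min(max(fps, 40), 144)

def swift_refresh(fps: float) -> float:
    if fps > 144:
        return 144  # driver forces V-Sync on above the panel maximum
    if fps >= 30:
        return fps  # inside the native VRR window
    # Below the window the module repeats frames to stay above ~30 Hz,
    # remaining in variable refresh mode down to a 1 FPS input.
    return math.ceil(30 / fps) * fps

for fps in (160, 100, 35, 20, 5, 1):
    print(f"{fps:>3} FPS: BenQ {benq_refresh(fps):>5.0f} Hz, "
          f"Swift {swift_refresh(fps):>5.0f} Hz")
```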

Given that current games are more likely to dip below 40 FPS than to exceed 144 FPS, and having witnessed both technologies first-hand, I personally find it extremely important to stay in a variable refresh mode at the low end of the LCD panel's variable range. I'll gladly take the potential slight flicker of G-Sync over the 40 Hz judder/tearing of the BenQ. The take-home point from my observations is that when gaming below the variable range, FreeSync panels retain the disadvantages of the V-Sync on/off setting, but amplify those effects because the panel is refreshing at an even lower rate than a standard display (e.g. 60 Hz).

** End edit by Allyn **

That is the primary difference we see today between FreeSync and G-Sync. Could AMD implement a similar methodology for its systems without a module in the screen? Yes, I think it could, but the driver would need very detailed information about every Adaptive-Sync panel on the market to make sure the experience is ideal in all cases. And when a new monitor is released, previous drivers would likely be unable to recreate a quality VRR experience without an update. The benefit of having the module in a G-Sync display is that the graphics card and driver don’t need any intimate knowledge of the panel’s technology.

Are There Performance Implications?

One interesting claim that AMD makes in its FreeSync documentation is that NVIDIA’s G-Sync technology actually creates a performance deficit compared to a VSync-disabled gaming configuration. In five key games tested by AMD, the NVIDIA GTX 780 runs as much as 2.38% slower with a G-Sync enabled setup.

With AMD FreeSync, though, the claim is a very slight (0.50%) improvement in performance compared to VSync off.

This is actually a confusing result. NVIDIA has often stated that G-Sync introduces no additional frame latency and should cause no performance deltas compared to a standard monitor with VSync off, as long as frame rates don’t cross over the maximum refresh rate of the panel. Though the performance deltas that AMD claims exist on the NVIDIA solutions are basically within the margin of error for manual gameplay testing, the fact that they exist at all is interesting and should be investigated. When we talked with NVIDIA about this phenomenon at GTC 2015, they seemed to believe we are looking at a driver bug, and that the fundamental position that there should be no performance delta remains true.

Equally interesting to my mind is the very slight performance increase that AMD sees. The results are definitely within the margin of error for testing, but they are consistently increases rather than a mix of increases and decreases. I see no way that AMD FreeSync could produce a better frame rate than plain VSync off, as that setting is supposed to let the rendering system output results as quickly as possible. AMD is still looking into this as well, and thinks it is possible that a reduced memory bandwidth load in variable refresh states could be the cause.

I don’t have any firm conclusions on this performance question today, at least until I can do some more testing, but I will also say that none of these performance deltas are likely severe enough to change a user’s mind on platform selection.

Setting up FreeSync – As Simple as it Should Be

Once you purchase and connect your AMD FreeSync enabled monitor, setup is dead simple. As long as you have the latest AMD driver with FreeSync support, starting with the 15.13 beta coming out today, you will see a pop-up when connecting a FreeSync capable display.

The pop-up will prompt you to open Catalyst Control Center and enable the single check box required for FreeSync. That's it, you are up and running!

The LG 34UM67 does make you enable FreeSync in the panel's on-screen display as well, though the Acer and BenQ monitors we have tested don't have that same requirement.
