What is FreeSync?

We have our first set of AMD FreeSync monitors in the office and are ready to talk about the variable refresh experience they offer.

FreeSync: What began as merely a term for AMD’s plans to counter NVIDIA’s launch of G-Sync (and a mocking play on NVIDIA’s trade name) has finally come to fruition, keeping the name – and the attitude. As we have discussed, AMD’s Mantle API was crucial to pushing the industry in the necessary direction toward lower level APIs, and NVIDIA’s G-Sync deserves the same credit for recognizing, and demonstrating, the need for a move to variable refresh display technology. Variable refresh displays can fundamentally change the way PC gaming looks and feels when they are built correctly and implemented with care, and we have seen that time and time again with the many different G-Sync enabled monitors that have come through our offices. It might finally be time to make the same claims about FreeSync.

But what exactly is FreeSync? AMD has been discussing it since CES in early 2014, claiming that it would bypass the idea of a custom module that a monitor needs in order to support VRR, and instead go the route of open standards using a modification to DisplayPort 1.2a from VESA. FreeSync is based on AdaptiveSync, an optional portion of the DP standard that enables a variable refresh rate by expanding the vBlank timings of a display, and it also provides a way to update EDID (display ID information) to communicate these capabilities to the graphics card. FreeSync itself is simply the AMD brand for this implementation, combining those monitors with correctly implemented drivers and GPUs that support the variable refresh technology.

A set of three new FreeSync monitors from Acer, LG and BenQ.

Fundamentally, FreeSync works in a very similar fashion to G-Sync, using the vBlank timings of a monitor to change how and when it updates the screen. The vBlank signal is what tells the monitor to begin drawing the next frame, representing the end of the current data set and marking the beginning of a new one. By varying the length of this vBlank interval, you can force the monitor to wait any amount of time necessary, allowing the GPU to end the vBlank instance exactly when a new frame is done drawing. The result is a variable refresh rate monitor, one that is in tune with the GPU render rate rather than opposed to it. Why is that important? I wrote in great detail about this previously, and it still applies in this case:

The idea of G-Sync (and FreeSync) is pretty easy to understand, though the implementation method can get a bit more hairy. G-Sync (and FreeSync) introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating the refresh rate to the PC, the graphics card now tells the monitor when to refresh in a properly configured G-Sync (and FreeSync) setup. This allows a monitor to match its refresh rate to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.

Gamers today are likely to be very familiar with V-Sync, short for vertical sync, an option in your graphics card’s control panel and in your game options menu. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory this works well, and the image is presented to the gamer without artifacts. The problem is that games played and rendered in real time rarely hold to one specific frame rate. With only a couple of exceptions, a game’s frame rate will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or a falling building. Instantaneous frame rates can vary drastically, from 30 to 60 to 90 FPS, while the fixed refresh cycle forces each image to be displayed only at set fractions of the monitor’s refresh rate, and that causes problems.

If a frame takes more than the standard refresh time of a monitor to draw (if the frame rate drops low), then you will see a stutter, or hitch, in the gameplay caused by the display having to re-draw the same frame for a second consecutive interval. Any movement being tracked on the screen suddenly appears to stop – and then quickly “jumps” to the next location faster than your mind thinks it should. V-Sync also inherently introduces input latency when these lags and stutters take place.
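That hitch comes down to simple arithmetic: with V-Sync on, a frame can only appear on a refresh boundary, so its time on screen is rounded up to a whole number of refresh intervals. A minimal sketch of that rounding, assuming a fixed 60 Hz panel (my own illustration, not any driver’s actual code):

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval, ~16.7 ms

def vsync_display_ms(render_ms):
    """On-screen time of a frame, rounded up to whole refresh intervals."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# A frame that just misses a refresh boundary (17 ms) is held for two
# full intervals, dropping the effective rate from ~59 FPS to 30 FPS.
for render_ms in (10.0, 17.0, 25.0, 34.0):
    shown = vsync_display_ms(render_ms)
    print(f"{render_ms:5.1f} ms render -> shown for {shown:4.1f} ms "
          f"(~{1000 / shown:.0f} FPS)")
```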

The common alternative for gamers worried about latency and stutter is to disable V-Sync in the control panel or in game. This solves the stutter and latency issues (kind of) but causes another, much more noticeable issue called “tearing”. With V-Sync disabled, the graphics card sends a new frame to the monitor at any point in the monitor’s refresh cycle, even if the LCD is currently drawing. The result is that, at some point down the screen, the user sees the previous frame above and the current frame below, separated by a sharp line. You will literally be seeing an image where the geometry no longer lines up and, depending on the game and scene, it can be incredibly distracting.
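Where that tear line lands can be modeled with one more simplification: assume the panel scans rows out top to bottom at a constant rate across the refresh interval. This is a rough sketch (real blanking periods are ignored, and the 1440-row panel is just an assumed example):

```python
REFRESH_MS = 1000 / 60   # a 60 Hz scanout takes ~16.7 ms
ROWS = 1440              # vertical resolution of an assumed 2560x1440 panel

def tear_row(swap_time_ms):
    """Row at which the new frame replaces the old one mid-scanout."""
    progress = (swap_time_ms % REFRESH_MS) / REFRESH_MS
    return int(progress * ROWS)

# A buffer swap 8 ms into the scanout puts the tear near mid-screen.
for swap_ms in (2.0, 8.0, 14.0):
    print(f"swap at {swap_ms:4.1f} ms -> tear near row {tear_row(swap_ms)}")
```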

Monitors with refresh rates higher than 60 Hz reduce the impact of tearing: with more frequent screen refreshes, each torn frame spends less time on the screen, but the tearing itself is impossible to remove completely.

NVIDIA G-Sync (and FreeSync) switches things up by having the monitor refresh its screen only when a new frame is ready from the GPU. As soon as the next frame is drawn, it can be passed to the display and drawn on the screen without tearing. If the next frame is ready in 16 ms, it can be sent immediately. If it takes 25 ms or only 10 ms, it doesn’t matter; the monitor will wait for information from the GPU before drawing the new frame. The result is an incredibly smooth and fluid animation that doesn’t stutter and doesn’t tear.
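To put numbers on the difference, the sketch below (again my own simplified model, not AMD’s or NVIDIA’s actual logic) compares how long each rendered frame stays on screen under fixed 60 Hz V-Sync versus a variable refresh panel. The 40-144 Hz VRR window matches the TN FreeSync monitor discussed later in this article:

```python
import math

REFRESH_MS = 1000 / 60                           # fixed 60 Hz V-Sync interval
VRR_MIN_MS, VRR_MAX_MS = 1000 / 144, 1000 / 40   # assumed 40-144 Hz VRR window

def vsync_ms(render_ms):
    # Fixed refresh: the frame is held until the next refresh boundary.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def vrr_ms(render_ms):
    # Variable refresh: vBlank ends when the frame is ready, clamped to
    # the shortest and longest intervals the panel supports.
    return min(max(render_ms, VRR_MIN_MS), VRR_MAX_MS)

print(f"{'render':>8} {'V-Sync':>9} {'VRR':>9}")
for render_ms in (10.0, 16.0, 17.0, 25.0):
    print(f"{render_ms:6.1f}ms {vsync_ms(render_ms):7.1f}ms "
          f"{vrr_ms(render_ms):7.1f}ms")
```

The 17 ms row is the interesting one: V-Sync holds that frame for two full intervals (an effective 30 FPS), while the variable refresh display simply shows it 17 ms later.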

So, since we know what G-Sync provides, what does AMD FreeSync want to do differently? The company hangs its hat on three distinct keys: no proprietary hardware, no closed standards and no licensing fees. Let’s address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into their monitors. What NVIDIA does charge for is the G-Sync module, a piece of hardware that replaces the scaler that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged; instead, the vendor is paying somewhere in the range of $40-60 for the module that handles the G-Sync logic.

That leads us to AMD’s claim that FreeSync doesn’t require proprietary hardware, and clearly that is a fair and accurate benefit of FreeSync today. Displays that support FreeSync can use one of a number of certified scalers that support the DisplayPort AdaptiveSync standard. The VESA DP 1.2a+ AdaptiveSync feature is indeed an open standard, available to anyone that works with VESA, while G-Sync is offered only to those monitor vendors that choose to work with NVIDIA and purchase the G-Sync modules mentioned above.

Unfortunately, not all AMD Radeon graphics cards today will be able to run FreeSync monitors in their variable refresh state. For discrete GPUs, only the R9 290X, R9 290, R9 285, R7 260X and R7 260 have the ability to properly communicate with the FreeSync displays and offer users the advantages of VRR. If you own an HD 7000 series card or even an R9 280 or R9 270, you are going to need to upgrade your GPU in addition to your monitor to take advantage of the technology. For NVIDIA users though, G-Sync is supported on GTX 900-series, 700-series and 600-series cards nearly across the board, in part thanks to the implementation of the G-Sync module itself.

Let’s take a look at the AMD-created table above that compares FreeSync to G-Sync. We have already discussed the module requirement, open standards and licensing fees, but what about the other areas where AMD claims to hold the advantage? Because FreeSync monitors will use standard scalers from existing companies, they will have the full gamut of features that you expect in an LCD monitor. That includes audio output, internal scaling (for resolution changes), color processing, more input options and more. It’s been obvious when reviewing G-Sync monitors that some of these concerns stand out – getting only a single DisplayPort input is frustrating because there will likely be cases where you want to use your VRR capable monitor in a non-VRR configuration over an HDMI cable or DVI connection. You cannot do that with current generation G-Sync displays; it’s DisplayPort or nothing. FreeSync monitors will continue to offer a range of input options at the discretion of the display vendor.

The published refresh rate is more than a bit misleading. AMD claims that FreeSync is rated at 9-240 Hz while G-Sync is only quoted at 30-144 Hz. There are two issues with this. First, FreeSync doesn’t determine the range of variable refresh rates; AdaptiveSync does. Second, AMD is using the maximum and minimum refresh range published by the standards body, not an actual implementation or even a planned implementation from any monitor or display. The 30-144 Hz rating for G-Sync is actually seen in shipping displays today (like the ASUS PG278Q ROG Swift). The FreeSync monitors we have in our office are either 40-144 Hz (TN, 2560×1440) or 48-75 Hz (IPS, 2560×1080), neither of which is close to the 9-240 Hz seen in this table.
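Those Hz figures translate directly into the window of frame times a panel can absorb while staying in its variable refresh range, which puts the gap between AMD’s slide and the shipping panels in perspective. A quick conversion using the numbers above:

```python
def vrr_window_ms(min_hz, max_hz):
    """Convert a refresh-rate range to the frame times it can cover."""
    return 1000 / max_hz, 1000 / min_hz   # (shortest, longest) in ms

for label, lo_hz, hi_hz in (("AMD's slide (9-240 Hz)", 9, 240),
                            ("TN panel (40-144 Hz)", 40, 144),
                            ("IPS panel (48-75 Hz)", 48, 75)):
    fast, slow = vrr_window_ms(lo_hz, hi_hz)
    print(f"{label}: frames of {fast:.1f} to {slow:.1f} ms stay in VRR")
```

The IPS panel, for example, only keeps frames between roughly 13.3 ms and 20.8 ms inside its variable refresh window; anything outside that range falls back to fixed-refresh behavior.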

We’ll touch on the implications of a performance degradation on G-Sync later in the story.
