Your Mileage May Vary

It’s a workaround, but FreeSync displays can (for now) be used with NVIDIA GPUs

One of the most interesting stories going around the computer hardware communities this past weekend was the revelation from a Reddit user named bryf50, who had somehow gotten their FreeSync display working with their NVIDIA GeForce GPU.

For those of you who might not be familiar with the particular ins and outs of these variable refresh technologies, getting FreeSync displays to work on NVIDIA GPUs is potentially a very big deal.

While NVIDIA GPUs support NVIDIA's own G-SYNC variable refresh rate standard, they are not compatible with Adaptive Sync displays (Adaptive Sync being the technology on which FreeSync is based). Despite Adaptive Sync being an open standard, and an optional extension to the DisplayPort specification, NVIDIA has so far chosen not to support these displays.

This creates some major downsides for consumers looking to purchase displays and graphics cards, however. Due to the lack of interoperability, consumers can get locked into a GPU vendor if they want to continue to use the variable refresh functionality of their display. Plus, Adaptive Sync/FreeSync monitors in general tend to be significantly less expensive for similar specifications.

The key to getting FreeSync working on an NVIDIA GPU lies in some technologies that aren't often discussed in a desktop context.

The initial discovery of this workaround by bryf50 on Reddit revolved around a newly added feature in the latest World of Warcraft expansion, Battle for Azeroth. In this new build of the game, Blizzard added the option to choose which graphics card the game is rendered on.

Given that they had both an AMD Radeon Pro WX 4100 and a GTX 1080 Ti in their system, bryf50 decided to plug their FreeSync display into the Radeon Pro card with FreeSync enabled and switch the rendering GPU to the GTX 1080 Ti from within the game. Much to everyone's surprise, FreeSync was, in fact, working with the frames being rendered by the NVIDIA GPU.

At the end of their initial Reddit post, bryf50 theorized that the newly added graphics switching features in the latest Windows 10 release, version 1803, could be used to expand this sort of workaround to more applications. However, it appeared that with multiple discrete GPUs in a desktop, they could not customize which GPU was the "High-Performance" option and which was the "Power Saving" one.

A few days later, another Reddit user named survfate expanded on this idea, trying a similar setup with a Ryzen 3 2200G APU and a GTX 1060. The key component here was the switchable graphics feature of Windows 10 working as intended: the integrated Vega graphics (to which the FreeSync display was connected) was identified as the "Power Saving" GPU, and the GTX 1060 was identified as the "High-Performance" GPU. Using this setup, survfate was able to get FreeSync working through the NVIDIA GPU in a variety of titles.

Seeing the success of these users (and others) on Reddit intrigued us, and we decided to take a deeper look at how exactly this works, and any potential downsides.

Switchable graphics, the idea of having two GPUs of varying performance levels in one system, has been a growing trend in the PC market, driven both by external GPUs connected over Thunderbolt 3 (adding discrete graphics to thin-and-light notebooks) and by products like the Surface Book.

Given this popularity, and the fact that Microsoft itself ships a product (the Surface Book) that depends on graphics switching, it makes sense that the company implemented this concept natively at the OS level in Windows 10 version 1803.

However, the fact that this switchable graphics technology carries over to desktop systems running Windows 10 is a potential boon for gamers. Through the use of a low-power integrated GPU, such as the Vega 11 graphics in the Ryzen 5 2400G, Windows 10 is able to switch applications to render on a different GPU (in this case, a higher performance NVIDIA option) while still outputting through the display output of the integrated graphics.

To put this to the test, we assembled a testbed around the Ryzen 5 2400G as well as a GTX 1080. Our FreeSync display, the 2560×1440 144Hz Nixeus EDG-27, was connected through the DisplayPort output on the motherboard, allowing us to enable FreeSync in the AMD Radeon Settings application.

From there, you can right-click the executable of a graphics application (this appears to work with DirectX applications only) and select which graphics processor you want the application to render on. Since we were on a desktop system without a battery, the default GPU was the "High-Performance" GTX 1080.

From here, everything worked without any further configuration. Launching games, as well as the FreeSync and G-SYNC demo applications, we saw variable refresh working to its fullest extent.

Since this functionality relies on copying the rendered frame out of the frame buffer on the NVIDIA GPU and sending it to the AMD GPU for final output, one of our biggest potential concerns was whether this added a substantial amount of latency.

Using a microcontroller (an Arduino Leonardo), we decided to put this to the test. With the Arduino emulating a left mouse click, we were able to run Half-Life 2 (picked because of its built-in frame rate limiter) and measure input latencies.

When a push button connected to the Arduino is pressed, it simultaneously lights up a connected LED and sends the mouse command to the operating system. From there, we use a high-speed camera recording at 1000 FPS (Allyn's Sony RX10M4) to monitor both the screen and the LED in the same frame.

Looking back at the footage, we are then able to count how many frames elapsed between the LED connected to the Arduino lighting up and the muzzle flash appearing on screen (corresponding to the left mouse click signal the PC received). From here, we had a baseline to compare the game rendering on the AMD integrated graphics versus the NVIDIA GTX 1080 through the FreeSync display, as well as on a G-SYNC display (the ASUS ROG Swift PG279Q) with similar specifications hooked up natively to the NVIDIA GPU.
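The arithmetic behind this measurement is simple: at 1000 FPS, each captured frame represents roughly one millisecond. A minimal sketch of the frame-counting step might look like the following; the region-of-interest brightness values and the threshold are synthetic stand-ins for what you would extract from the actual footage.

```python
# Sketch of the input-to-photon latency arithmetic behind our measurement.
# Brightness traces are per-frame values for a region of interest (ROI);
# the values and threshold below are illustrative, not from real footage.

CAMERA_FPS = 1000  # high-speed capture rate, so one frame ~= 1 ms


def first_frame_above(brightness, threshold):
    """Return the index of the first frame whose ROI brightness crosses threshold."""
    for i, value in enumerate(brightness):
        if value >= threshold:
            return i
    raise ValueError("event never detected in footage")


def input_to_photon_ms(led_trace, muzzle_trace, threshold=200):
    led_frame = first_frame_above(led_trace, threshold)       # button press moment
    flash_frame = first_frame_above(muzzle_trace, threshold)  # muzzle flash on screen
    return (flash_frame - led_frame) * 1000 / CAMERA_FPS


# Synthetic example: LED lights at frame 5, muzzle flash appears at frame 37.
led = [0] * 5 + [255] * 40
muzzle = [0] * 37 + [255] * 8
print(input_to_photon_ms(led, muzzle))  # → 32.0
```

Averaging this per-trial number over repeated button presses is what produces the figures discussed below.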

These latency results were averaged across 14 trials for each test setup and compared across setups.

It's difficult to compare these numbers directly due to the differences in relative GPU power and the different displays, but the overall result of 32 ms of input-to-photon latency shows this workaround isn't a dealbreaker.

Despite the inevitable small increase in latency from copying the frame across GPUs, this solution, in its current state, is still quite playable. Most gamers will find it impossible to notice an additional 3 ms of input latency (compared to the APU standalone) unless they are playing very high frame rate competitive games like Counter-Strike: Global Offensive, in which case you probably don't need to worry about variable refresh anyway.

Considering the almost $300 price difference between the Nixeus EDG-27 and the ASUS PG279Q, despite their very similar 2560×1440 144Hz IPS panels, I could see some gamers trading a little additional input latency for the cost savings.

However, this setup currently hinges on the user having an APU, which obviously not everyone does. If someone could figure out how to configure which devices Windows 10 treats as switchable graphics (potentially through a registry edit), this workaround could be opened up to users who add an inexpensive, low-power AMD discrete GPU to their system and use that card solely for the FreeSync display output. In theory, this could even be reversed to make AMD GPUs output to a G-SYNC display, with the right pieces in place.
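For what it's worth, Windows 10 version 1803 does store the per-application GPU preferences set through the Settings app in a user registry key, so that half of the equation is already scriptable. A fragment like the one below (the game path is a placeholder) forces an application onto the "High-Performance" GPU; note, however, that this only selects between the GPUs Windows has already classified, and does not solve the underlying problem of controlling which discrete GPU counts as "Power Saving".

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
"C:\\Games\\Example\\game.exe"="GpuPreference=2;"
```

Here `GpuPreference=1;` would mean "Power Saving" and `GpuPreference=2;` "High Performance", matching the options in the Settings UI.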

Since this isn't supported functionality, we have no idea if this will break with future/different driver revisions on either the NVIDIA or AMD side, so your experience may vary.

It's also worth noting that some games, like the aforementioned World of Warcraft, allow the rendering GPU to be switched in-game, meaning the workaround can function with two discrete GPUs. While games enabling this functionality are currently few and far between (Far Cry 5 is a notable recent title that allows it), it seems to be a feature that is growing in popularity.

Given that this is a core feature of Windows 10, it will be interesting to see NVIDIA's reaction to this workaround. While they might just let it slide, given the current limitation of needing an APU, it seems likely they will patch this workaround out via their driver (if possible) or work with Microsoft on a Windows 10 patch. This would be analogous to NVIDIA's past practice of disallowing GeForce GPUs from being used as PhysX accelerators when an AMD GPU was detected.

Regardless, I wouldn't run out and buy an APU solely to use a FreeSync display on NVIDIA GPUs, but if you have the hardware it seems to work fine from our experience. This has been a fun glimpse into the future that could have been regarding NVIDIA supporting Adaptive-Sync, and I hope it's not the last.