Your Mileage May Vary
It’s a workaround, but FreeSync displays can (for now) be used with NVIDIA GPUs
One of the most interesting things going around in the computer hardware communities this past weekend was the revelation from Reddit user bryf50 that they had somehow gotten their FreeSync display working with their NVIDIA GeForce GPU.
For those of you who might not be familiar with the ins and outs of these variable refresh technologies, getting FreeSync displays to work on NVIDIA GPUs is potentially a very big deal.
While NVIDIA GPUs support NVIDIA's own G-SYNC variable refresh rate standard, they are not compatible with displays that use Adaptive Sync, the technology on which FreeSync is based. Despite Adaptive Sync being an open standard and an optional extension to the DisplayPort specification, NVIDIA has so far chosen not to support these displays.
This creates some major downsides for consumers looking to purchase displays and graphics cards. Due to the lack of interoperability, consumers can get locked into a GPU vendor if they want to continue to use the variable refresh functionality of their display. Plus, Adaptive Sync/FreeSync monitors generally tend to be significantly less expensive for similar specifications.
The key to getting FreeSync working on an NVIDIA GPU lies in some technologies that aren't often discussed in a desktop context.
The initial discovery of this workaround by bryf50 on Reddit revolved around a newly added feature in the latest World of Warcraft expansion, Battle for Azeroth. This new build of the game added an option to choose which graphics card the game is rendered on.
Given that they had both an AMD Radeon Pro WX 4100 and a GTX 1080 Ti in their system, bryf50 decided to plug their FreeSync display into the Radeon Pro card with FreeSync enabled and switch the rendering GPU to the GTX 1080 Ti from within the game. Much to everyone's surprise, FreeSync appeared to be working, even with the frames being rendered on the NVIDIA GPU.
At the end of their initial Reddit post, bryf50 theorized that the newly added graphics switching features in the latest Windows 10 release, version 1803, could be used to expand this sort of workaround to more applications, but it appeared that with multiple discrete GPUs in a desktop, they could not customize which GPU was the "High Performance" option and which was the "Power Saving" option.
A few days later, another Reddit user, survfate, expanded on this idea and tried a similar setup with a Ryzen 3 2200G APU and a GTX 1060. The key component here was the switchable graphics feature of Windows 10 working as intended. The integrated Vega graphics (to which the FreeSync display was connected) was identified as the "Power Saving" GPU, and the GTX 1060 was identified as the "High Performance" GPU. Using this setup, survfate was able to get FreeSync working through the NVIDIA GPU in a variety of titles.
Seeing the success of these users (and others) on Reddit intrigued us, and we decided to take a deeper look at how exactly this works, and any potential downsides.
Switchable graphics, the idea of having two GPUs of varying performance levels in one system, has been a growing trend in the PC market, driven both by external GPUs connected through Thunderbolt 3 for adding discrete graphics to thin-and-light notebooks and by products like the Surface Book.
Given this popularity, and the fact that Microsoft itself ships a product depending on graphics switching functionality (the Surface Book), it makes sense that the company implemented this concept natively at the OS level with Windows 10 version 1803.
However, the fact that this switchable graphics technology carries over to desktop systems running Windows 10 is a potential boon for gamers. Through the use of a low-power integrated GPU, such as the Vega 11 graphics in the Ryzen 5 2400G, Windows 10 is able to switch applications to render on a different GPU (in this case, a higher performance NVIDIA option) while still outputting through the display output on the integrated graphics.
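To get a sense of what the OS is juggling, the hybrid setup is visible to anything that enumerates display adapters. The following minimal DXGI sketch is our own illustration (not part of Windows' switching logic); on a system like our testbed it would list both the integrated Vega graphics and the discrete GeForce card:

    // Hedged illustration: enumerate every GPU Windows exposes via DXGI.
    // On a hybrid desktop, the adapter driving the display (the integrated
    // Vega GPU here) is typically enumerated first, with the GeForce after it.
    #include <windows.h>
    #include <dxgi.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    int main() {
        IDXGIFactory1* factory = nullptr;
        if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
            return 1;

        IDXGIAdapter1* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            wprintf(L"Adapter %u: %s (%llu MB dedicated VRAM)\n", i, desc.Description,
                    (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
            adapter->Release();
        }
        factory->Release();
        return 0;
    }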
To put this to the test, we assembled a testbed around the Ryzen 5 2400G as well as a GTX 1080. Our FreeSync display, the 2560×1440 144Hz Nixeus EDG-27, was connected through the DisplayPort output on the motherboard, allowing us to enable FreeSync in the AMD Radeon Settings application.
From there, you can right-click on the executable of a graphics application (this seems to be DirectX only) and select which graphics processor you want the application to render on. Since we were on a device without a battery, the default GPU was the "High Performance" GTX 1080.
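For anyone who would rather script this than click through the UI, the per-application preference appears to live in the registry under HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences. Treat the sketch below as an assumption on our part rather than a documented API, and note that the game path in it is purely a placeholder:

    // Hedged sketch: write a per-application GPU preference the way the Windows 10
    // (1803+) Graphics Settings page appears to, by adding a value whose name is the
    // executable path. The key path, value format, and example path are assumptions.
    #include <windows.h>
    #include <cwchar>

    int main() {
        // Placeholder path -- replace with the actual game executable.
        const wchar_t* gameExe = L"C:\\Games\\Example\\game.exe";
        // "GpuPreference=1;" = power saving GPU, "GpuPreference=2;" = high performance GPU.
        const wchar_t* pref = L"GpuPreference=2;";

        HKEY key = nullptr;
        if (RegCreateKeyExW(HKEY_CURRENT_USER,
                            L"Software\\Microsoft\\DirectX\\UserGpuPreferences",
                            0, nullptr, 0, KEY_SET_VALUE, nullptr, &key, nullptr) != ERROR_SUCCESS)
            return 1;

        LONG rc = RegSetValueExW(key, gameExe, 0, REG_SZ,
                                 reinterpret_cast<const BYTE*>(pref),
                                 static_cast<DWORD>((wcslen(pref) + 1) * sizeof(wchar_t)));
        RegCloseKey(key);
        return rc == ERROR_SUCCESS ? 0 : 1;
    }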
From here, everything worked without any further configuration. Launching both games and the FreeSync and G-SYNC demo applications, we saw variable refresh working to its fullest extent.
Since this functionality relies on copying the rendered frame out of the frame buffer on the NVIDIA GPU and sending it to the AMD GPU for final output, one of our biggest concerns was whether this added a substantial amount of latency.
Through the use of a microcontroller (an Arduino Leonardo), we decided to put this to the test. With the Arduino emulating a left mouse click, we were able to run Half-Life 2 (picked because of its built-in frame rate limiter) and measure input latencies.
When a push button connected to the Arduino is pressed, it simultaneously lights up a connected LED and sends the mouse click command to the operating system. From here, we use a high-speed camera recording at 1000 FPS (Allyn's Sony RX10M4) to monitor both the screen and the LED in the same frame.
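For reference, the trigger firmware for this kind of rig only needs a few lines. The sketch below is an approximation of what we described, assuming an Arduino Leonardo with a push button on pin 2 and an LED on pin 3; the pin assignments and delays are illustrative rather than our exact wiring:

    // Hedged sketch of the latency trigger: on a button press, light the LED for the
    // high-speed camera and send a USB left-click at (effectively) the same instant.
    #include <Mouse.h>

    const int BUTTON_PIN = 2;  // push button wired from pin 2 to ground
    const int LED_PIN = 3;     // LED (with resistor) visible to the camera

    void setup() {
      pinMode(BUTTON_PIN, INPUT_PULLUP);
      pinMode(LED_PIN, OUTPUT);
      Mouse.begin();  // the Leonardo enumerates as a USB mouse
    }

    void loop() {
      if (digitalRead(BUTTON_PIN) == LOW) {  // button pressed
        digitalWrite(LED_PIN, HIGH);
        Mouse.press(MOUSE_LEFT);
        delay(50);
        Mouse.release(MOUSE_LEFT);
        digitalWrite(LED_PIN, LOW);
        delay(500);  // crude debounce between trials
      }
    }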
Looking back at the footage, we are then able to count how many frames elapsed between the LED connected to the Arduino lighting up and the muzzle flash on the screen (corresponding to the left mouse click signal the PC received). From here, we had a baseline to compare the game rendering on the AMD integrated graphics versus the NVIDIA GTX 1080 through the FreeSync display, as well as a G-SYNC display (the ASUS ROG Swift PG279Q) with similar specifications hooked up natively to the NVIDIA GPU.
These latency results were averaged across 14 trials for each test setup and compared across setups.
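Since the camera records at 1000 FPS, each camera frame corresponds to one millisecond, so converting per-trial frame counts into an average latency is straightforward. The short sketch below shows the math with made-up sample counts rather than our measured data:

    // Convert high-speed-camera frame counts into an average latency figure.
    // At 1000 FPS, one camera frame equals one millisecond of elapsed time.
    #include <vector>
    #include <numeric>
    #include <cstdio>

    int main() {
        const double cameraFps = 1000.0;
        // Hypothetical per-trial frame counts (LED on -> muzzle flash), not real data.
        std::vector<int> framesElapsed = { 31, 33, 32, 30 };

        double avgFrames = std::accumulate(framesElapsed.begin(), framesElapsed.end(), 0.0)
                           / framesElapsed.size();
        double latencyMs = avgFrames * (1000.0 / cameraFps);  // 1 frame == 1 ms here
        std::printf("Average input-to-photon latency: %.1f ms\n", latencyMs);
        return 0;
    }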
It's difficult to compare these numbers directly due to the differences in relative GPU power and the different displays, but the overall result 32ms of input-to-photon latency for this workaround isn't a dealbreaker.
Despite the inevitable small increase in latency from copying the frame across GPUs, this solution, in its current state, is still quite playable. Most gamers will find it impossible to notice an additional 3 ms of input latency (compared to the APU alone) unless they are playing very high frame rate competitive games like Counter-Strike: Global Offensive, in which case you probably don't need to worry about variable refresh anyway.
Considering the almost $300 price difference between the Nixeus EDG-27 and the ASUS PG279Q, despite their very similar 2560×1440 144Hz IPS panels, I could see some gamers being willing to trade a little bit of additional input latency for the cost savings.
However, the backbone of this setup currently hinges on the user having an APU, which obviously not everyone does. If someone could figure out how to configure the devices for switchable graphics in Windows 10 (potentially through a registry edit), then this could be opened up to users who add an inexpensive, low-power discrete AMD GPU to their system and use that card solely for the FreeSync display output. In theory, with the right pieces in place, this could even be reversed to let AMD GPUs drive a G-SYNC display.
Since this isn't supported functionality, we have no idea if this will break with future/different driver revisions on either the NVIDIA or AMD side, so your experience may vary.
It's also worth noting that some games, like the aforementioned World of Warcraft, allow the rendering GPU to be switched in-game, which would mean this would work on two discrete GPU solutions. While games enabling this functionality are currently few and far between (Far Cry 5 is a notable recent title that allows it), it seems to be a feature that is growing in popularity.
Given that this is a core feature of Windows 10, it will be interesting to see NVIDIA's reaction to this workaround. While they might just let it slide, given the current limitation of needing an APU, it seems likely they will patch this workaround out via their driver (if possible) or work with Microsoft on a Windows 10 patch. This would be analogous to NVIDIA's past move of disallowing GeForce GPUs from being used as PhysX accelerators when an AMD GPU was detected.
Regardless, I wouldn't run out and buy an APU solely to use a FreeSync display on NVIDIA GPUs, but if you have the hardware it seems to work fine from our experience. This has been a fun glimpse into the future that could have been regarding NVIDIA supporting Adaptive-Sync, and I hope it's not the last.
“Switchable graphics, the idea of having two GPUs of varying performance levels in one system, has been a growing trend in the PC market.”
That's been around for ages on laptops for switching between the integrated graphics and the discrete mobile GPU.
I'm also thinking that it has something to do with Windows 10 and DX12's explicit heterogeneous multi-GPU adapter support in the DX12 API, which can allow 2 or more GPUs from any maker to be used for gaming. Maybe it's to do with DX12 knowing which GPU the user has selected as the main GPU to drive the display, with any other GPU plugged into the PC also working for gaming and the final output/frame buffer being driven from the GPU with a display attached.
The Vulkan graphics API does not currently support explicit heterogeneous multi-GPU like DX12 currently does, as Vulkan only supports homogeneous multi-GPU for multiple GPUs of the same make. So it's maybe DX12's heterogeneous multi-adapter ability that the games are calling on and not any CF/SLI (deprecated functionality).
Nvidia should not be able to disable this Windows 10/DX12 functionality as that's part of MS's DX12 standard that the games makers can call up via the DX12 graphics API. That should also allow for 2 GTX 1060's to be used at the same time on either DX12 or Vulkan (homogeneous multi-GPU). This is a graphics API feature that should not be able to be overridden by the DX12-enabled driver on Nvidia's cards. The DX12 driver is supposed to be only exposing the respective GPU maker's bare metal functionality, so the DX12 driver is a simplified driver anyway; ditto for the Vulkan graphics API GPU drivers.
Khronos is supposed to be working on getting Vulkan's heterogeneous multi-GPU adapter support working as well, so users can make use of both Nvidia's and AMD's GPUs in a similar fashion.
All I can say is this is basically embarrassing for Nvidia, so a driver patch will be out very soon to address some "stability" issues and remove this option.
But in the end it is just nice to see that technically it is possible, as many have expected. Sad that it will be gone soon 🙁
I don't understand why you think it's embarrassing?
This solution is not running FreeSync on the GeForce; it's the Radeon IGP that is doing that. It's just displaying the frames rendered by the GeForce at the correct rate on the Radeon and sending them with FreeSync to the display…
How is Nvidia supposed to get around this when both the DX12 and Vulkan graphics APIs are supposed to make use of that close-to-the-metal simplified driver model that the DX12/Vulkan graphics APIs are designed around?
I mean, really, does Nvidia expect that M$ will just roll over and play dead with that publicly declared DX12 feature of explicit heterogeneous GPU multi-adapter capability that's been available for games/gaming engine makers to make use of for some while now? M$ is the one that certifies each and every GPU that's able to be used under its Windows OS, that DX12 API, and M$'s Windows Display Driver Model (WDDM); ditto for the Linux kernel and its DRM/whatever the DC/other subsystem thingy that its kernel uses via the Vulkan or OpenGL/CL API/driver model is called.
So what's Nvidia going to do, have its GPU drivers look to see if there is a non-Nvidia GPU detected and then have its GPUs disable themselves? And how will Nvidia expect Intel and AMD to act if they decide not to whitelist any of Nvidia's cards on detection by AMD's/Intel's respective drivers for any Nvidia cards not whitelisted as compatible? What about integrated AMD or Intel graphics, with that graphics able to drive a display output?
Technically any DX12 Explicit heterogeneous GPU multi-adaptor capability should allow a GPU with physically no video outputs available to be paired with any make of GPU that does physically have display video outputs and use both GPUs for gaming.
Really, this explicit heterogeneous GPU multi-adapter capability should have been the way M$'s OSs were designed from the get-go, with the GPU hardware totally subservient to the OS and that OS's graphics API/graphics driver subsystem model. Who the hell in the original mainframe OS world could have ever possibly conceived of any other way for a proper OS not to have had this capability from day one?
The proper OS is supposed to be in control, and any hardware and display adaptors (GPUs/other) are supposed to be totally subservient to the OS that runs on the CPU! Unless you are in microprocessor PC OS Bizarro World, where the OSs on microprocessors have not really ever been proper OSs in the mainframe OS sense of what a proper OS should be.
GPU makers should be told to STFU and only supply the bare metal drivers that expose the GPU's basic BARE METAL functionality to the OS and then get the F–k out of the way of the OS/graphics API and games/gaming engine software. GPUs should only do what is commanded of the GPU/display adaptor by the OS/graphics API and game/gaming engine software.
Really, WTF sort of twisted Millennial psychopathically generated, narcissistically oriented mindset came up with the crazy notion that the GPU should be calling the shots when it is the OS running on the CPU that's the only reason any display adaptor (GPU/otherwise) is able to run on a computer.
WTF do the GPU makers think that they are, and in AMD's case WTF does that RTG division scum think that it is.
You Millennials are some very twisted folks if you think that any display adaptor/multi-display adaptors (GPU/otherwise) should not be available for games/other usage no matter who makes that Goddamn F-ing GPU.
M$ with its OSs (crappy OS products from day one for around 40 years now) has just now gotten around to making its OS properly control the hardware plugged into the damn PCIe slots. Really, just over the last few years of DX12, and it took M$ that F–king long to get those GPUs in their properly subservient place and doing what the OS/graphics APIs F-ing command.
Really you Millennials all need to be on that B-ARK Space Tub with the controls set for the heart of the Sun.
Little by little the night turns around.
Counting the leaves which tremble at dawn.
Lotuses lean on each other in yearning.
Under the eaves the swallow is resting.
Set the controls for the heart of the sun.
Over the mountain watching the watcher.
Breaking the darkness
Waking the grapevine.
One inch of love is one inch of shadow
Love is the shadow that ripens the wine.
Set the controls for the heart of the sun.
The heart of the sun, the heart of the sun.
Witness the man who raves at the wall
Making the shape of his questions to Heaven.
Whether the sun will fall in the evening
Will he remember the lesson of giving?
Set the controls for the heart of the sun.
The heart of the sun, the heart of the sun.
You sound mad af
Mad "af" at those damn Millennial Phab Phone FondleSlabers and their short attention spans with no sense of history. That Millennial cohort and their damn weird tendency of wandering about underfoot with their attention (short as it is) mostly focused on their Phab Phone's little rectangular screen, and so very damn oblivious to their surroundings.
Look at all the little Millennials so clueless as they are twittering on about this and that using their Phab Phone FondleSlabing finger gesticulations. Oh look at how very addictively those Millennial twerps text their chirpings so very hell bent with that narcissistic need of earning some twittering twerp following!
Damn Ya Millennial Twerps! clogging up the grocery store aisles in the way and so very damned hypnotized by those Phab Phone FondleSlab Svengali devices. It’s bad enough that the damn little Millennial corporate grocery store Management Twerps have gutted the store’s inventories of affordable staple goods and replaced that with some Mad Millennial Menagerie of specialized, via short trendy Millennial attention spans, “food” of the day crap that’s not fit for a maggot.
It's best not to read the comments section on PCPer anymore, as it's normally filled with versions of War and Peace from that guy.
Good, finally it pissed you off, and that's why I am here! It's just to piss you, "WhyMe", off! Lots-O-Text, in fact Walls-O-Text, specifically designed to PISS YOU OFF!
You Damn Millennial Inbreeder, How’s your Swife Brandine and that brood of Gilled/Webbed Appendaged ChildraNewts?
Your bloodline de-evolves with each passing generation, on the way to becoming single celled pond scum.
Really nice piece; good of you guys to be monitoring Reddit for this type of stuff.
To anyone interested in input lag in conjunction with G-SYNC/FreeSync, this website really hits the analysis out of the park: https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/
How long does NVIDIA think that G-Sync can compete against the combined might of AMD FreeSync, VESA AdaptiveSync, and HDMI VRR?
Sooner or later, whether or not G-Sync is truly “teh best”, they have to offer solutions to connect to all the new TVs and monitors that support these alternatives. I’m sure Nintendo Switch owners would like the option of connecting to HDMI VRR-capable TVs, right?
They can do it as long as they have a massive performance lead over AMD.
Why do you put AMD FreeSync in the list? VESA AdaptiveSync and HDMI VRR are standards. FreeSync is a gimmick.
FreeSync is exactly VESA DisplayPort Adaptive-Sync(TM), and FreeSync came from VESA's eDP laptop screen variable refresh standard. AMD, being a member of VESA (Nvidia and others also) and on many VESA working committees (ditto for Nvidia/others), could very well damn skippy ask VESA to take that eDP standard and extend it into the VESA DisplayPort 1.2a Adaptive-Sync (a name TMed by F-ing VESA) standard. And AMD could have called it Macaroni, but AMD's marketing monkeys decided on FreeSync (damn Millennial marketing gimmick!), yet it's not really anything but VESA's DisplayPort Adaptive-Sync(TM).
G-damn F–K, you Millennials (not necessarily you, the one whose post I am trolling) cannot possibly be that daft about how the independent industry standards bodies like VESA, USB-IF, and others work, and that these independent industry standards bodies are made up of the very same CPU/GPU/other processor and display companies.
WTF is with you damn Millennials and your Sriracha Sauce filled heads! Those synapses of yours are so very picante pickled as to be totally worthless for any cerebral usage but will do fine for that Zombie wrap special at Trending Mendy's Millennial Wrap Shack!
Watch out for that damn bus, you distracted Millennial multi-tasking Segway-scooting Phab-Slab fondler! Oh, too late…
Oh, this is going to make that dream machine so much cheaper.
Your Milage May Vary.
What is milage? A spellcheck fail.
I doubt Nvidia would disable G-SYNC for everyone with an AMD CPU with integrated graphics. It's one thing to block PhysX when people are using a low-end PhysX card with a main AMD card, but this would be the opposite.
More interesting would be if Intel enables freesync on their integrated graphics like they keep saying they’re going to.
You should have a look over at Level1Techs and their coverage of GPU passthrough into a virtual machine using "their" Looking Glass software. Apparently you can pass through an Nvidia GPU to the VM and copy the frame buffer over through Looking Glass into an AMD GPU used for the display of the host system. The guest OS will only ever see the one Nvidia GPU, so I don't think Nvidia could do anything to prevent this even if they wanted to.
I see nVidia shutting this down. They have too much invested in G-SYNC to allow this. Display manufacturers also have something to lose as they are invested in G-SYNC and have stock of G-SYNC monitors that will suddenly be unattractive to those of us who like to save money.
So a company is allowed to disable a feature, which isn't even a security threat, on hardware I own, without my consent? Which laws allow for this, and I wonder if Nvidia could be sued if they go ahead?
You consent to it when you install the update, ya ding dong.
This is like when I'm using the Conservative Morphological AA override on the Intel iGPU and using my 960M on my laptop for gaming. The Intel GPU handles AA and the 960M handles rendering. It works for me in some games, but it also generates a lot of heat.