It’s more than just a branding issue
Combine Ryan, Allyn and an analog oscilloscope – what do you get? SCIENCE!
As a part of my look at the first wave of AMD FreeSync monitors hitting the market, I wrote an analysis of how the competing technologies of FreeSync and G-Sync differ from one another. It was a complex topic that I tried to cover as succinctly as possible, given the time constraints and the fact that the article was focused on FreeSync specifically. To recap, I'm going to include a portion of that discussion here:
First, we need to look inside the VRR window, the zone in which the monitor maker and AMD claim that variable refresh will work without tears and without stutter. On the LG 34UM67, for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates over the 75 Hz maximum refresh rate. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target; in this example, that is 48 FPS.
AMD FreeSync offers more flexibility for the gamer than G-Sync around this VRR window. Both above and below the variable refresh range, AMD lets gamers choose between a VSync enabled or disabled setting, and that setting is handled just as it is today whenever your game's frame rate lands outside the VRR window. So, for our 34UM67 example, if your game is capable of rendering at 85 FPS, you will either see tearing on your screen (if you have VSync disabled) or you will get a static frame rate of 75 FPS, matching the top refresh rate of the panel itself. If your game is rendering at 40 FPS, below the bottom of the VRR window, then you will again see either tearing (with VSync off) or the potential for stutter and hitching (with VSync on).
But what happens below the window with this FreeSync monitor and a comparable G-Sync monitor? AMD's implementation gives you the option of enabling or disabling VSync. On the 34UM67, as soon as your game's frame rate drops under 48 FPS, you will either see tearing on your screen or you will begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns crop up again. At these lower frame rates (below the window), the artifacts actually impact your gaming experience much more dramatically than they do at higher frame rates (above the window).
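To summarize the FreeSync behavior described so far, here is a minimal sketch in Python, assuming the 34UM67's 48-75 Hz window. The function and its return values are our own illustration of the behavior above, not anything from AMD's driver:

```python
def freesync_behavior(fps, vrr_min=48.0, vrr_max=75.0, vsync=False):
    """Return (refresh_hz, artifact) for a FreeSync panel as described
    above (LG 34UM67, 48-75 Hz window). A simplification, not vendor code."""
    if vrr_min <= fps <= vrr_max:
        return fps, "none: refresh tracks frame rate"
    if fps > vrr_max:
        # Above the window: frame rate capped (VSync on) or tearing (off).
        return vrr_max, "capped at 75 FPS" if vsync else "tearing"
    # Below the window: the refresh rate stays at the panel minimum.
    return vrr_min, "stutter and judder" if vsync else "tearing"

print(freesync_behavior(85, vsync=True))   # (75.0, 'capped at 75 FPS')
print(freesync_behavior(40, vsync=False))  # (48.0, 'tearing')
```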
G-Sync treats this "below the window" scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen when the frame rate dips below the minimum refresh rate of the panel, which would otherwise be affected by flicker. So, on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate drops to 29 FPS, the display is actually refreshing at 58 Hz: each frame is "drawn" one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module drawing each frame four times, taking the refresh rate back up to 56 Hz. It's a clever trick that preserves the goals of VRR and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work; hence the current implementation in a G-Sync module.
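To make the frame multiplication concrete, here is a toy model in Python. The redraw_target value is purely our assumption, fitted to the three measurements above; NVIDIA has not published the real heuristic, and as we show in the video, the actual module also paces its redraws dynamically based on when it expects the next frame to arrive:

```python
import math

def gsync_like_refresh(fps, vrr_min=30.0, vrr_max=144.0, redraw_target=48.0):
    """Return (times_drawn, refresh_hz), a toy model of the module's behavior.

    redraw_target is our assumption, fitted to the numbers measured above;
    the real heuristic is unpublished and also predicts frame arrival times.
    """
    if fps >= vrr_min:
        # Inside the VRR window: one scanout per rendered frame.
        # (Measurements suggest multiplication actually begins slightly
        # above vrr_min, i.e. there is hysteresis around the transition.)
        return 1, min(fps, vrr_max)
    # Below the window: repeat each frame enough times to keep the
    # physical refresh comfortably above the panel's flicker threshold.
    times = math.ceil(redraw_target / fps)
    return times, times * fps

for fps in (29, 25, 14):
    times, hz = gsync_like_refresh(fps)
    print(f"{fps} FPS -> each frame drawn {times}x -> panel at {hz:.0f} Hz")
# 29 FPS -> 2x -> 58 Hz, 25 FPS -> 2x -> 50 Hz, 14 FPS -> 4x -> 56 Hz
```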
As you can see, the topic is complicated. So Allyn and I (and an aging analog oscilloscope) decided to take it upon ourselves to understand and explain the implementation differences with the help of some science. The heart of this story is in the video below, though I have some visual aids embedded after it.
Still not clear on what this means for frame rates and refresh rates on current FreeSync and G-Sync monitors? Maybe this will help.
This graph shows the relationship between frame rate and refresh rate for typical (and most popular) 40-144 Hz panel implementations of each technology. The bottom axis shows the game's frame output rate, what would be reported by a program like Fraps. You can see that from ~40 FPS to 144 FPS, both technologies offer pure variable refresh implementations where the refresh rate of the screen matches the game's frame rate. The quality and experience of the two technologies here are basically identical (and awesome). Above 144 FPS, both will go into a VSync state (as long as VSync is enabled on the FreeSync panel).
Below that 40 FPS mark, though, things shift. The red line shows how AMD's FreeSync and Adaptive Sync work: the refresh rate stays static at 40 Hz even as the frame rate dips below 40 FPS, to 35 FPS, 30 FPS, and so on. G-Sync works differently, doubling the refresh rate and inserting duplicate frames starting at around the 37 FPS mark. This continues until the game frame rate hits 19-20 FPS, where a third frame is inserted and the refresh rate is increased again. The result is the dashed line representing the effective experience of G-Sync.
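For what it's worth, running the toy gsync_like_refresh() model from earlier with a 40-144 Hz window reproduces the shape of this graph, including the third frame insertion near 19 FPS (again, the fitted threshold is our assumption, not a published number):

```python
for fps in (45, 38, 30, 19):
    times, hz = gsync_like_refresh(fps, vrr_min=40.0)
    print(f"{fps} FPS -> each frame drawn {times}x -> panel at {hz:.0f} Hz")
# 45 FPS -> 1x -> 45 Hz   (inside the window)
# 38 FPS -> 2x -> 76 Hz   (doubling begins)
# 30 FPS -> 2x -> 60 Hz
# 19 FPS -> 3x -> 57 Hz   (third frame inserted)
```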
Zoomed in on the area of interest, you get a better view of how G-Sync and FreeSync differ. Effectively, G-Sync has no bottom to its variable refresh window and produces the same result as if the display technology itself were capable of going to lower refresh rates without artifacting or flickering. It is possible that in the future, as display technologies improve, this kind of frame doubling algorithm will become unnecessary, but until we find a way to reduce screen flicker at low refresh rates, NVIDIA's G-Sync VRR implementation will have the edge in this scenario.
As we discuss in the video, it is possible that AMD could implement a similar algorithm for FreeSync at the driver level, without the need for an external module. A Radeon GPU knows what frame rate it is rendering at, and it could send out a duplicate frame at a higher refresh rate to trick the display and achieve the same effect. It would require a great deal of cooperation between the panel vendors and AMD, however, as each new monitor release would need a corresponding driver or profile update to go along with it. That hasn't been AMD's strong suit in the past several years, so it would require a strong commitment from them.
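A driver-side version might look something like the following sketch: a presentation loop that resends the previous frame whenever the panel's maximum hold time is about to expire. Every name here (wait_for_frame, scan_out) is a hypothetical stand-in, not a real driver API; the point is simply that the resend keeps the panel inside its VRR window, so the next real frame can still be scanned out the moment it arrives:

```python
def present_loop(wait_for_frame, scan_out, vrr_min_hz=40.0):
    """Hypothetical driver-side frame duplication for FreeSync.

    wait_for_frame(timeout) returns a newly rendered frame, or None if
    the timeout expires first; scan_out(frame) pushes a frame to the
    display. Both are illustrative stand-ins for real driver internals.
    """
    max_hold = 1.0 / vrr_min_hz       # longest the panel may hold one frame
    last = None
    while True:
        frame = wait_for_frame(timeout=max_hold)
        if frame is None:             # game is rendering slower than vrr_min...
            frame = last              # ...so duplicate the previous frame
        if frame is not None:
            scan_out(frame)           # each scanout restarts the panel's hold timer
            last = frame
```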
It's also important to note that the experience below the VRR window on a FreeSync panel today is actually worse in practice than in theory. Because the refresh rate stays at 40 Hz when your frame rates are low, you get a combination of stutter and frame tearing (if VSync is off) that is worse than if the refresh rate were higher, at say 60 Hz or even 144 Hz; at 40 Hz a torn frame persists on screen for 25 ms, while at 144 Hz it would be replaced in under 7 ms. No doubt complications would arise from an instantaneous refresh rate shift from ~40 Hz to 144 Hz, but some middle ground likely exists that FreeSync could implement to improve the low-FPS experience.
I hope you found this story (and the video!) informative and interesting; we spent a lot of time gathering the data and figuring out how best to present it. Please leave us feedback in the comments and we will try to answer as many questions as we can.
Thanks for reading!
Nice job covering this topic. I was wondering when you would share the "raw data" from your VRR investigation. This level of detail is why I started following your site.
I think that given the higher prices of G-Sync displays (due to supply and demand) and the low refresh rate problems of FreeSync, the smart choice is to wait on VRR at the moment. In 6-12 months, the 3440×1440 IPS 144 Hz VRR monitors or 2560×1440 versions will be out, and prices should drop by then. I think that until FreeSync is supported in EVERY monitor, you're better off waiting until the low-FPS problem is fixed. If you're spending more for VRR, why settle for a sub-optimal implementation when it will likely be fixed later?
The raw data is not very good for anything other than a chart, because the thresholds actually shift based on the rate of change of FPS. The G-Sync module does its best to get ahead of any frame rate changes in order to minimize any possible judder caused by a frame coming in during a redraw. My raw data is static in nature, while the actual progression is very dynamic.
At the end of the video you had a theoretical discussion of what AMD could do with a driver update to fix what happens below the low side of the monitor's window.
You (both) mentioned that AMD would have to do a driver update for each monitor that comes out in order to support the frame multiplication feature. Theoretically it would not have to be per-monitor: in the same way that the driver doesn't need to be updated for each monitor currently, it would just need to support VRR/FreeSync/Adaptive-Sync.
Point being, when the monitor starts up it reports its VRR window, and all the driver would have to do is set an FPS threshold at which to start multiplying frames to alleviate the issue.
Also, if AMD does do this, a FreeSync-labeled monitor should have a VRR cutoff that is less than half of the monitor's max refresh (or half minus 1 or 2 Hz, to make sure it doesn't tear). With the LG ultrawide's 48-75 Hz window, the frame multiplier would take a 44 FPS frame rate to an 88 Hz refresh, above the max refresh of the monitor, causing tearing.
Mr. Malventano, I do believe you're incorrect in the video when you state that G-Sync is pacing the low-FPS redraws in the manner you describe. Here is why: the whole point of doubling or tripling the refresh here is to minimize not just flickering, but also the amount of time the next frame in our animated sequence has to wait before it's displayed. For example, at 30 FPS, instead of scanning out every 33.3 ms, double up and scan out every 16.7 ms. This has the effect of reducing the maximum amount of time the next frame has to wait before being shown, from 33.3 ms in my example down to 16.7 ms. There's no clever algorithm pacing something as unpredictable as when the next frame is coming. This fits with what Mr. Petersen said about G-Sync when he likened it to double buffering: rendering into an A and a B buffer and scanning out of them alternately.
Incorrect. It paces the frame insertions centered within the incoming frames, as is plainly visible on the scope. If it worked as you describe, they would not be evenly spaced.
It's a kind of adaptive variable VSync then, doubling at 36 FPS and below and tripling at 18 FPS and below, which explains the even spacing. If it is as you say, then it has to know ahead of time when the incoming frames are going to arrive, and that points to additional buffering. Have you any latency testing in the offing?
For G-Sync, are you using the ROG Swift or the Acer panel? The video says you are using the Swift, but the graph shows the Acer. If it is the Acer, is it the new IPS G-Sync monitor? I'd love to see a review of that.
It's the new Acer XB270HU IPS 144 Hz monitor. Don't bother trying to get one right now; they are back-ordered till the end of NEXT month.
I got mine preordered. :/
We actually tested both and both behaved the same, moving into the frame doubling windows at the same points.
I don’t understand the thought process behind AMD requiring a profile for every monitor if they want to fix this problem.
The driver should already be given the minimum and maximum refresh rates of the monitor. After that, it's as simple as redrawing the last frame at a refresh rate that is a multiple of the FPS and falls between the minimum and maximum, until the new frame is ready. A sketch of that idea follows.
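A minimal sketch of the multiple-of-FPS selection this comment describes, assuming the driver only knows the panel's reported window (illustrative only, not AMD code):

```python
def pick_refresh(fps, vrr_min=48.0, vrr_max=75.0):
    """Smallest integer multiple of the frame rate that reaches the
    panel's reported window minimum. Illustrative only."""
    multiple = 1
    while multiple * fps < vrr_min:
        multiple += 1
    refresh = multiple * fps
    # As an earlier comment notes, multiplication can overshoot a narrow
    # window (2 x 44 = 88 Hz on a 48-75 Hz panel); fall back to the max.
    return min(refresh, vrr_max)

print(pick_refresh(44))   # 75.0 -- overshoots the window, clamped (would tear)
print(pick_refresh(30))   # 60   -- 2x lands inside the window
```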
I agree with that. Quick question: wasn't G-Sync VSyncing at 30 FPS and below before this doubling and tripling of refreshes was introduced a couple of months later?
No, but G-Sync did have an issue where it didn't handle quick oscillations around those frame doubling transition points very well, causing a kind of stutter when you were rendering at 36-39 FPS for a while.
That actually was more my old oscilloscope having a hard time triggering. It's an old scope. Cut it some slack 🙂
"As we discuss in the video, it is possible that AMD could implement a similar algorithm for FreeSync at the driver level, without the need for an external module. A Radeon GPU knows what frame rate it is rendering at and it could send out a duplicate frame, at a higher frame rate, to trick the display and have the same effect. It will require a great deal of cooperation between the panel vendors and AMD however,"
I'm not sure about it requiring a great deal of cooperation with panel vendors. If FreeSync truly is the mono-directional protocol they say it is, if the monitor is truly slave to the GPU and does as it's told, then the monitor should simply scan out whatever frame is sent, regardless of whether it's the same frame or not. The ghosting is what's going to require a lot of cooperation to get sorted. Interestingly, the higher your FPS on FreeSync, the less your ghosting, so doubling or even tripling the refresh rate at low FPS is definitely something AMD needs to look at implementing.
While INCREDIBLY tedious to test, it would be VERY interesting to see the results of running a colorimeter (or spectrophotometer, if available) on a G-Sync and a DP Adaptive-Sync display at different refresh rates, as well as hooking up the oscilloscope and measuring ghosting at different refresh rates. If the G-Sync module is doing some dynamic variation of panel driving voltage/timing that commodity panel controllers are not, then there may be variations in brightness and contrast (or even a slight colour shift) when driving a DP Adaptive-Sync display at different refresh rates.
I was thinking along similar lines. It seems like a 144 Hz panel would have a minimum, a maximum (144 Hz), and possibly a preferred refresh rate. Given how G-Sync seems to work, you could set a preferred refresh rate (maybe 120 Hz) that gives you good color accuracy and fewer overdrive artifacts, and just have the GPU send extra frames when necessary. The G-Sync system already behaves this way below the panel's minimum refresh rate, so why not set the target refresh rate higher?
Interesting. We'll try to discuss the testing process for something like this…
Color variations are not really suspect because, even though both panels are in VRR mode, the draw speed is still at the max rate (each frame is drawn in 1/144th of a second, top to bottom). Color consistency issues normally present when changing to higher scan rates, not differing refresh rates; not in this context, at least.
Awesome work, guys!
One of the things I noticed initially with G-Sync was that if I turned VSync on in some games (F1 2014 was one example), the game felt like it had more latency compared to VSync off.
Based on how FreeSync is implemented, is it possible to have VSync off in the game settings but force it on in the driver, so the game doesn't try to apply its own frame limiting or other problematic VSync-like behavior to a VRR target?
The second point I wanted to make is that I feel the best of both worlds would be VSync on at the maximum of the VRR window and VSync off at the minimum, but at the maximum refresh rate, so you get updates as quickly as possible. Neither solution actually goes there, but I can't say I have been unhappy with G-Sync's approach; it's actually been really amazing (the ROG Swift not so much, that monitor is a disaster, I'm on my 4th RMA).
articles like this are why pcper is the best
Alright, let me get this straight. What you guys are basically saying here is that, literally, the Gay-Sync module has a built-in cache for double/triple buffering, which it uses only at low FPS. It doesn't produce stutter at lower FPS when it doubles/triples the refresh frequency, because it stocks up several cloned frames ahead of time which it then uses to smooth things out. To put it simply: when you think it renders one frame at low FPS, it actually renders one WHILE storing up to three identical ones in its built-in cache at the same time. And it ups the refresh frequency simply to move out those stocked-up frame clones as fast as possible, which is why there's no stuttering. It's basically double/triple buffering, but, unlike typical frame buffering methods, this one is instant since the stocked-up clone frames are ready to be moved out instantly thanks to the doubled/tripled refresh frequency. I hope I got this right.
Anyway, either way, I think I still prefer FreeSync so far, simply because it's: 1) open, 2) cheaper to implement with no price premium, 3) technically supports much lower refresh rates than the Gay-Sync module would ever allow without all this "triple buffering + higher refresh frequency" voodoo mumbo-jumbo magic. I feel like nGreedia is simply trying to make us buy their proprietary crap again, and no matter how much better it may look at lower FPS, I still think that correctly applied FreeSync would be the better route to go, simply because of how much cheaper and more versatile it is. There are just way too many restrictions with nGreedia's stuff, almost always. I want to be FREE.
Master Chen, you really come across like an ignorant homophobic hater. And unless someone gives you the hardware for free, there's nothing free about FreeSync despite its name.
Cry more, brainwashed marketing victim sheep noVideot. You make me laugh.
Tone it down, children. Attack the site, the tech, and the companies all you like, but keep the personal stuff to a minimum.
Unless it is a really good zinger, those always get to stay.
You truly amuse me, Jeremy. You’re one to talk here about “companies”, lol…NOT.
Great video, and it explains a lot about why gaming is so smooth on the Swift. Can you confirm, to settle an argument, that 30 FPS on the Swift is actually running at 60 Hz with an extra frame being sent, so in effect it is 60 FPS?
You're not serious, are you? April Fools?
What I'm taking away from this is that the ball is really in the panel makers' court. They need to get the lower end of the VRR window down to 30 Hz; beyond that, if your FPS is dropping lower than that, lower your settings. That makes any screen tearing and artifacts you get with FreeSync effectively an alarm. 😉
A few questions.
1) If the Asus ROG Swift is showing frame doubling at 36 Hz, why is G-Sync rated at 30-144 Hz? Has anyone bothered to ask about the discrepancy?
2) Power consumption differences: the BenQ is rated at 65 W while the Asus ROG Swift is rated at 90 W. What are the additional 25 W doing? ROG Swift owners have said in forums that they can see a green LED that is always on inside the enclosure. What's that about?
3) If G-Sync is doing some sort of PSR, why can't it do it at, say, 60 Hz, 120 Hz, or 144 Hz? Why only when it drops out of VRR?
I’m confused. What changed? I’ve been gaming for decades, and we never needed some overpriced monitor to get 60 fps. That’s really all I care to experience, as I honestly don’t think I can tell the difference above 60 fps (yes, I’ve taken the tests…). So what changed? Is this the result of lazy engineering? Pushing the performance burden off to the monitors? Or are the GPU manufacturers hyping up this new tech and playing a branding game, or what? Or are resolutions becoming so high that they cannot keep up with fps demand, even up to (my) golden 60 frames? I mean, gosh, doesn’t anyone but me play in 1280×1024 anymore?? Now we have 4k out there… I’m going to need to go get my glasses prescription updated just to see the difference. Such inflation of standards. *sigh* Sorry this turned into a “back in my day” post. Yes, I’m getting old. 🙂
I think you need to read about what G-Sync and FreeSync actually do in order to understand that this is not just about getting a monitor to 60 or 144 Hz.
You don't talk about the disadvantages of G-Sync: no multiple inputs, DisplayPort only, restricted menu settings due to G-Sync, additional cost, and so on.
Manufacturers could add more ports and OSD features if they wanted, but you would still need to use DisplayPort in order to enable G-Sync (and this applies to AMD's FreeSync too).