What is FreeSync?
We have our first set of AMD FreeSync monitors in the office and are ready to talk about the variable refresh experience they offer.
FreeSync: What began as merely a term for AMD’s plans to counter NVIDIA’s launch of G-Sync (and a mocking play on NVIDIA’s trade name) has finally come to fruition, keeping the name – and the attitude. As we have discussed, AMD’s Mantle API was crucial in pushing the industry in the necessary direction of lower level APIs, and NVIDIA’s G-Sync deserves the same credit for recognizing, and demonstrating, the need for a move to variable refresh display technology. Variable refresh displays can fundamentally change the way PC gaming looks and feels when they are built correctly and implemented with care, and we have seen that time and time again with the many different G-Sync enabled monitors that have come through our offices. It might finally be time to make the same claims about FreeSync.
But what exactly is FreeSync? AMD has been discussing it since CES in early 2014, claiming that it would bypass the idea of a custom module inside the monitor to support VRR and instead go the route of open standards, using a modification to DisplayPort 1.2a from VESA. FreeSync is based on AdaptiveSync, an optional portion of the DP standard that enables a variable refresh rate by expanding the vBlank timings of a display, and it also provides a way of updating EDID (display ID information) to communicate these settings to the graphics card. FreeSync itself is simply the AMD brand for this implementation, combining the monitors with correctly implemented drivers and GPUs that support the variable refresh technology.
A set of three new FreeSync monitors from Acer, LG and BenQ.
Fundamentally, FreeSync works in a very similar fashion to G-Sync, using the vBlank timings of a monitor to change how and when it updates the screen. The vBlank signal is what tells the monitor to begin drawing the next frame, representing the end of the current data set and marking the beginning of a new one. By varying how long this vBlank interval lasts, you can force the monitor to wait any amount of time necessary, allowing the GPU to end the vBlank instance exactly when a new frame is done drawing. The result is a variable refresh rate monitor, one that is in tune with the GPU render rate rather than opposed to it (there is a small timing sketch after the quoted explanation below). Why is that important? I wrote in great detail about this previously, and it still applies in this case:
The idea of G-Sync (and FreeSync) is pretty easy to understand, though the implementation method can get a bit more hairy. G-Sync (and FreeSync) introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating to the PC what rate this refresh occurs at, the graphics card now tells the monitor when to refresh in a properly configured G-Sync (and FreeSync) setup. This allows a monitor to match the refresh rate of the screen to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.
Gamers today are likely to be very familiar with V-Sync, short for vertical sync, an option in your graphics card’s control panel and in your game options menu. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory this works well, and the image is presented to the gamer without artifacts. The problem is that games played and rendered in real time rarely hold a very specific frame rate. With only a couple of exceptions, a game’s frame rate will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or falling building. Instantaneous frame rates can vary drastically, from 30, to 60, to 90 FPS, yet the image can only be displayed at set fractions of the monitor’s refresh rate, and that causes problems.
If a frame takes more than the standard refresh time of a monitor to draw (if the frame rate drops low), then you will see a stutter, or hitch, in the gameplay caused by the display having to re-draw the same frame for a second consecutive interval. Any movement tracking across the screen suddenly appears to stop – and then quickly “jumps” to the next location faster than your mind thought it should. V-Sync also inherently introduces input latency to games when these lags and stutters take place.
The common alternative for gamers worried about latency and stutter is to disable V-Sync in the control panel or in game. This solves the stutter and latency issues (kind of) but causes another, much more noticeable issue called “tearing”. With V-Sync disabled, the graphics card sends a new frame to the monitor at any point in the monitor’s refresh cycle, even if the LCD is currently drawing. The result is that, at some point down the screen, the user sees the previous frame above and the current frame below, split at a sharp line. You will literally be seeing an image where the geometry is no longer lined up and, depending on the game and scene, it can be incredibly distracting.
Monitors with refresh rates higher than 60 Hz reduce this tearing by having more frequent screen refreshes, and thus a tear is less likely to occur in any single refresh cycle, but the tearing is impossible to remove completely.
NVIDIA G-Sync (and FreeSync) switches things up by only having the monitor refresh its screen when a new frame is ready from the GPU. As soon as the next frame is drawn, it can be passed to the display and drawn on the screen without tearing. If the next frame is ready in 16ms, it can be sent immediately. If it takes 25ms or only 10ms, it doesn’t matter; the monitor will wait for information from the GPU before drawing the new frame. The result is an incredibly smooth and fluid animation that doesn’t stutter and doesn’t tear.
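To put rough numbers on that explanation, here is a minimal sketch in Python. The frame times are hypothetical and the model ignores buffering details, so treat it as an illustration of the timing math rather than either vendor’s actual implementation: with V-Sync a finished frame waits for the next fixed 60 Hz tick, while a variable refresh display simply extends vBlank until the frame arrives.

```python
import math

REFRESH_MS = 1000 / 60                      # fixed 60 Hz refresh interval

def vsync_present_times(frame_times_ms):
    """V-Sync: each frame appears at the next fixed refresh boundary."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft                             # GPU finishes rendering at time t
        shown.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)
    return shown

def vrr_present_times(frame_times_ms):
    """VRR (G-Sync/FreeSync): vBlank is held until the frame is ready."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        shown.append(t)                     # refresh starts the moment the frame is done
    return shown

frames = [12.0, 25.0, 20.0, 10.0]           # hypothetical render times in ms
print([f"{t:.1f}" for t in vsync_present_times(frames)])  # ['16.7', '50.0', '66.7', '83.3']
print([f"{t:.1f}" for t in vrr_present_times(frames)])    # ['12.0', '37.0', '57.0', '67.0']
```

With these sample times, V-Sync holds the first frame on screen for two full refresh intervals (the visible hitch described above), while the variable refresh timeline tracks the render times exactly.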
So since we know what G-Sync provides, what does AMD FreeSync want to do differently? The company hangs its hat on three distinct keys: no proprietary hardware, no closed standards and no licensing fees. Let’s address the licensing fee first. I have it from people I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into their monitors. What NVIDIA does charge is a price for the G-Sync module, a piece of hardware that replaces the scaler that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged; instead, the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic.
That leads us to AMD’s claim that FreeSync doesn’t require proprietary hardware, and clearly that is a fair and accurate benefit of FreeSync today. Displays that support FreeSync can use one of a number of certified scalers that support the DisplayPort AdaptiveSync standard. The VESA DP 1.2a+ AdaptiveSync feature is indeed an open standard, available to anyone that works with VESA, while G-Sync is only offered to those monitor vendors that choose to work with NVIDIA and purchase the G-Sync modules mentioned above.
Unfortunately, not all AMD Radeon graphics cards today will be able to run FreeSync monitors in their variable refresh state. For discrete GPUs, only the R9 290X, R9 290, R9 285, R7 260X and R7 260 have the ability to properly communicate with the FreeSync displays and offer users the advantages of VRR. If you own an HD 7000 series card or even an R9 280 or R9 270, you are going to need to upgrade your GPU in addition to your monitor to take advantage of the technology. For NVIDIA users though, G-Sync is supported on GTX 900-series, 700-series and 600-series cards nearly across the board, in part thanks to the implementation of the G-Sync module itself.
Let’s take a look at this AMD-created table above that compares FreeSync to G-Sync. We have already discussed the module requirement, open standards and licensing fees, but what about the other areas where AMD claims to hold the advantage? Because FreeSync monitors will use standard scalers from existing companies, they will have the full gamut of features that you expect in an LCD monitor. That includes audio output, internal scaling (for resolution changes), color processing, more input options and more. It’s been obvious when reviewing G-Sync monitors that some of these concerns stand out – only getting a single DisplayPort input is frustrating because there will likely be cases where you want to use your VRR capable monitor in a non-VRR configuration with an HDMI cable or DVI connection. You cannot do that with current generation G-Sync displays; it’s DisplayPort or nothing. FreeSync monitors will continue to offer a range of input options at the discretion of the display vendor.
The published refresh rate is more than a bit misleading. AMD claims that FreeSync is rated at 9-240 Hz while G-Sync is only quoted at 30-144 Hz. There are two issues with this. First, FreeSync doesn’t determine the range of variable refresh rates; AdaptiveSync does. Second, AMD is using the maximum and minimum refresh range published by the standards body, not an actual implementation or even a planned implementation from any monitor or display. The 30-144 Hz rating for G-Sync is actually seen in shipping displays today (like the ASUS PG278Q ROG Swift). The FreeSync monitors we have in our office are either 40-144 Hz (TN, 2560×1440) or 48-75 Hz (IPS, 2560×1080), neither of which comes close to the 9-240 Hz seen in this table.
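To see what those numbers mean in practice, it helps to convert each quoted range into the window of frame times a display can actually serve without falling back to fixed-refresh behavior. A quick sketch using only the figures cited above:

```python
def vrr_window_ms(min_hz, max_hz):
    """Return the (shortest, longest) frame time a VRR range can display."""
    return 1000 / max_hz, 1000 / min_hz

ranges = [
    ("AdaptiveSync spec limits", 9, 240),   # theoretical ceiling AMD quotes
    ("G-Sync (ASUS ROG Swift)", 30, 144),   # shipping today
    ("FreeSync TN 2560x1440", 40, 144),     # sample in our office
    ("FreeSync IPS 2560x1080", 48, 75),     # sample in our office
]
for name, lo, hi in ranges:
    fast, slow = vrr_window_ms(lo, hi)
    print(f"{name}: {lo}-{hi} Hz -> frames from {fast:.1f} to {slow:.1f} ms")
```

The spec’s ceiling covers everything from 4.2 ms to 111.1 ms frames, but the IPS sample’s 48-75 Hz window spans only 13.3 to 20.8 ms – a far narrower band in which variable refresh is actually active.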
We’ll touch on the implications of a performance degradation on G-Sync later in the story.
If I already have a 144hz monitor, does it make any sense to reinvest into gsync/freesync?
This is what Anandtech says, do you agree?
“…something that has become more noticeable to me over the past few months. While G-SYNC/FreeSync can make a big difference when frame rates are in the 40~75 FPS range, as you go beyond that point the benefits are a lot less clear. Take the 144Hz ASUS ROG Swift as an example. Even with G-SYNC disabled, the 144Hz refresh rate makes tearing rather difficult to spot, at least in my experience. Considering pixel response times for LCDs are not instantaneous and combine that with the way our human eyes and brain process the world and for all the hype I still think having high refresh rates with VSYNC disabled gets you 98% of the way to the goal of smooth gaming with no noticeable visual artifacts (at least for those of us without superhuman eyesight).”
I think the advantage you’ll see with 144Hz + VRR vs only 144 Hz depends a lot on the FPS you target while gaming. I have a 144Hz 1080p G-Sync display + GTX970. For games like Dragon Age Inquisition, I change settings to get 55-75 fps w G-Sync. At this fps, I don’t notice any fps changes while playing. If I had SLI 970s I could maybe target close to 144 Hz. Then Triple buffer V-Sync could still look quite good (144 Hz to 60 Hz would be tough to see). I could also turn off V-sync and deal with lessened screen tearing that 144Hz brings.
If you play CS:GO, VRR is less worth it as you can already hit 144Hz easily even with a single GPU. You wouldn’t see much increase in quality in that case. Also, if you play games in borderless-window mode, G-Sync won’t work with that. Games have to be in a true fullscreen mode.
If money is no object, G-Sync is always worth upgrading to. The image quality really is much better with zero-screen tearing and judder. Smoothly panning in 1st person and 3rd person games feels very good. I can notice tearing quite easily (I don’t know how other people can’t), so if you REALLY don’t care about tearing, then wait to upgrade. If I were you, I’d wait for the 2560×1440 144Hz IPS or 3440×1440 144Hz IPS monitors that should be released soon.
it was always obvious that pcper were nvidia fan boys, but this latest “ghosting” video just confirms it.
It’s obvious that you are BLIND to the truth. Just because someone conducts a thorough review of a specific product or function and finds negative results does NOT mean they are a fanboy. They are simply providing us with their findings. Go ahead and ignore them. Go buy that FreeSync monitor. We just want you to come back crying about buyer’s remorse, because all you will get is a resounding “WE TOLD YOU SO!”. Learn how to be open minded. It’s not that GSYNC is superior to FreeSync (which it is, by the way), it’s that FreeSync will have to go through growing pains before it is as good as GSYNC.
You’re a fucking ignorant dumb ass, off with you AMDbutthurt fanboy
And you’re an obvious uninformed ignorant dumb ass AMD fanboy!
Allyn can’t even tell the difference in panels.
All he had to do was look at the numbers since he has them both there.
I am laughing at all the AMD fanbois….
PCPer was all over the AMD FCAT debacle. They were right then and they are right now. The truth is coming. I hope you AMD fanbois can handle it. I own AMD 290s in CFX and they work fine, but AMD is a dying horse. Unless the 300 series is epic. If it is, great, but I wouldn’t hold my breath.
The only thing keeping AMD alive is their graphics cards, and they are barely breaking even on their horribly crappy APUs. If Nvidia were to wake up one day and decide to cut all of their graphics card prices to match AMD’s prices (per range of graphics cards by performance), AMD would go bankrupt in 6 months. Intel could do exactly the same thing to AMD in the CPU market. But alas, competition works out for both parties in the end.
Double post – sorry!
I really haven’t seen it mentioned in my glancing through the comments… but do you guys at PCPER think this ghosting might have been a downside that Nvidia knew about while they explored this option, and that’s why they decided on doing their own scaler?
I don’t think so. Nvidia has been working on GSYNC for over 2 years now and there was no way they could predict whether or not ghosting would become a factor in monitors 1 to 2 years down the road. Pixel response times have gone down since then. TN panels still have the very best pixel response times and are thus the best choice for implementing GSYNC, just for the fact that they already have reduced latency when drawing the rendered frames in near-realtime. There are IPS-based panels (i.e. AHVA) that are getting close to TN panel response times, but are not quite there just yet. The new Acer XB270HU is an IPS panel with a 144Hz refresh, GSYNC, and a whopping 4ms response time. Retail MSRP is rumored to be at around $800, which is friggin’ awesome and will effectively bring down the price of the ROG Swift into the ~$650 price range. The Acer XB270HU is about as close as you will get to the Asus ROG Swift in terms of overall performance and quality in a dynamic refresh rate monitor. Expected availability is 15 April… I can’t wait!
I have to wonder, considering AMD has a certification process, who decided these monitors were ready to release to market. Somethin’ smells like the dead fishes, Lucy…
Lt.Ripple *burp* signing off
This is by far the very best article I have read that provides a thorough explanation of how both GSYNC and FreeSync work as well as the differences (pros & cons) between the two technologies. Job well done to PC Perspective and Ryan Shrout!
Thank you for covering this and in such depth! This site is my new replacement for Anandtech!
I’ve had an upgraded (DIY kit) Asus VG248QE since last summer. I haven’t noticed any issues with flicker, but it could be that I’m simply not noticing it.
Battlefield Hardline locks the framerate to 30FPS during the cut scenes, so perhaps I should go back to take a peek. Is there an easy way to spot what you’re describing? It’s likely I will buy the Acer 1440P IPS at some point to replace the 24″ Asus, and I’m curious about the improvements in the gsync implementation.
Also, I’ll be helping a friend set up his BenQ XL2730Z sometime in the next couple of days. It will be interesting to see if I can spot a ghosting issue in person. Is there any chance the issue is due to a preproduction review unit?
Allyn, is your BenQ a press review sample or a retail box?
What’s your experience with the gsync versus “just” 144hz? How noticeable is the difference?
The first few months after I swapped from a 60hz panel to a 144hz panel, I noticed an immediate quality of life improvement from a couple things:
1) Tracking targets in FPS games was easier, more akin to a high-end CRT back in the day. I believe this was due both to the higher refresh rate and the reduced ghosting of a faster panel.
2) When flying Empire Specific Fighters in Planetside 2, I found it much easier to perceive my plane’s sense of momentum.
After installing the gsync DIY kit, benefit #2 above became far more pronounced, as the reduced tearing (I never played with vsync on) made everything feel more fluid.
My roommate, who plays on a 1920×1200 60hz IPS, has said countless times that he’s jealous of my monitor after watching me play, despite his display’s much better color quality. He intends to order the Acer 144hz IPS 1440P model the day it becomes available in the US, regardless of the price.
I went to a friend’s house for a 4-person Battlefield Hardline LAN this last weekend. When watching another player’s monitor, I could see the screen tearing from across the room.
The bottom line is, I would never willingly go back to a fixed refresh display after living with a variable refresh monitor for the last 7-8 months. Even if I were to replace the monitor I’ve already invested $500 into with a 27″ 1440P version this year, I would still feel the money was well spent simply to experience the benefit for the year prior. It’s that good from my perspective.
It’s a definite step up from “just 144hz”. No question in my mind. If you can afford the cost to buy a variable refresh display, do it. You won’t look back.
Subjectively, the improvement from 60hz to 144 and then from 144hz to 144hz gsync was equal in terms of overall benefit. I would never consider going back after 7-8 months of game time with a variable refresh display.
Forbes just posted an interesting interview with Tom Petersen regarding why he thinks GSYNC is superior to FreeSync – it is a great read and tells us that PC Perspective got a lot right with this article.
An nvidia rep claiming nvidia’s better o_O
quelle surprise!
Brad Chacos from PCWorld denied that his LG 34UM67 shows ghosting. The fact that no computer tech site – outside PCPer – noticed ghosting raises doubts about your findings. Looking at the surface of the wings in the footage, I see different contrast settings on the different monitors. Can you specify the panel settings at which you tested ghosting? This would allow confirmation of your observations.
http://www.pcworld.com/article/2900901/acers-500-amd-freesync-monitor-drastically-undercuts-nvidia-g-sync-pricing.html
Wrong, you uninformed idiot. Guru3D also mentions the ghosting, as do some other EU tech sites.
Hmmm Selective frame grabbing eh pcper?
Easy to find the ROG Swift ghosting too if you simply play and pause the video at random intervals…
Try it people, they ALL do it.
Love all these haters and trolls bashing on PCPer all the time. I hope you get paid for it or you’re just a waste of life and bandwidth.
Hope your IPs get banned by the site soon.
“The contrast and AMA settings on the BenQ are set far too high; because it’s a competitive gaming monitor, they reduce all the brightness and colour to make it easy to spot people. My older BenQ was the same.”
DITTO… This is exactly what I was going to say. The brightness, contrast and colour are different from panel to panel. The one with Nvidia G-Sync is clearly set lower, judging from the blacks and whites.
Ghosting would most likely appear on the nvidia g-sync monitor too if you put the brightness, contrast and colour at the same levels as the FreeSync monitors in this case.
Please test: brightness and contrast
The reds are darker on the g-sync panel vs the free-sync panels. So maybe we are not seeing ghosting on nvidia g-sync because the brightness and contrast are set lower. We need to put the contrast up and see what happens. Thanks
TFTCentral put out an update to the panels used in this review on their twitter account. See below
TFTCentral
@TFTCentral
Confirmation that the BenQ XL2730Z is using an AU Optronics M270DTN01.0 TN Film panel. Different to the Asus ROG Swift PG278Q (M270Q002 V0)
From my understanding gsync does revert to vsync-on behavior when you hit the max refresh of the monitor… but I’ve been told by several Swift owners that things have changed with driver updates last year, and now nvidia is limiting fps to just below the refresh cap so that gsync is always in use when enabled, avoiding the added latency of vsync by sacrificing that last 1fps.
Have you guys at pcper not tested or run into this during your time with gsync? I think this was relayed to me at or just before the 340 series drivers launched.
I can confirm that my gsync VG248QE will not hit 144FPS in Diablo 3 regardless of the display settings in the game (on a 780ti and 4790k @4.4ghz). It seems to cap a few frames below 144, but I honestly thought it was due to D3. Maybe it is a driver thing…
Gsync in my experience, and from what I heard from others, does not work in D3. Which sucks, because if one game could really benefit from gsync, it was D3.
So when is the promised follow-up to this “preview” then? Looking forward to the PixPerAn testing and hopefully game examples of this ghosting. Need to see if it’s game-breaking or not.
I configured a BenQ xl2730z tonight for a friend’s system, w/ a R9 290x. To my eye, it looked great. I looked at some of the TestUFO animations and didn’t see anything alarming in the 15-20 minutes I messed with it. It seems like a great value at the price.
Planning on amending the review in light of the fact that the panels aren’t the same as you stated?