What is FreeSync?
We have our first set of AMD FreeSync monitors in the office and are ready to talk about the variable refresh experience they offer.
FreeSync: What began as merely a term for AMD’s plans to counter NVIDIA’s launch of G-Sync (and a mocking play on NVIDIA’s trade name) has finally come to fruition, keeping the name – and the attitude. As we have discussed, AMD’s Mantle API was crucial in pushing the industry in the necessary direction of lower level APIs, and NVIDIA’s G-Sync deserves the same credit for recognizing, and demonstrating, the need for variable refresh display technology. Variable refresh displays can fundamentally change the way PC gaming looks and feels when they are built correctly and implemented with care, and we have seen that time and time again with the many different G-Sync enabled monitors that have passed through our offices. It might finally be time to make the same claims about FreeSync.
But what exactly is FreeSync? AMD has been discussing it since CES in early 2014, claiming that it would bypass the idea of a custom module inside the monitor to support VRR and instead go the route of open standards, using a modification to DisplayPort 1.2a from VESA. FreeSync is based on AdaptiveSync, an optional portion of the DP standard that enables a variable refresh rate by extending the vBlank timings of a display, and it also provides a way to update EDID (display ID information) so these capabilities can be communicated to the graphics card. FreeSync itself is simply the AMD brand for this implementation, combining such monitors with correctly implemented drivers and GPUs that support the variable refresh technology.
A set of three new FreeSync monitors from Acer, LG and BenQ.
Fundamentally, FreeSync works in a very similar fashion to G-Sync, using the vBlank timings of a monitor to change how and when it updates the screen. The vBlank signal is what tells the monitor to begin drawing the next frame, representing the end of the current data set and marking the beginning of a new one. By varying how long this vBlank interval lasts, you can force the monitor to wait any amount of time necessary, allowing the GPU to end the vBlank instance exactly when a new frame is done drawing. The result is a variable refresh rate monitor, one that is in tune with the GPU render rate rather than opposed to it. Why is that important? I wrote in great detail about this previously, and it still applies in this case:
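To make the timing concrete, here is a minimal sketch of that scheduling idea in Python (purely illustrative, with made-up frame times – actual scaler and driver logic is far more involved):

    # Illustrative only: a VRR panel follows the GPU's frame times,
    # clamped to the panel's supported refresh window.
    MIN_INTERVAL = 1 / 144   # fastest the panel can refresh (144 Hz ceiling)
    MAX_INTERVAL = 1 / 40    # longest it can hold a frame (40 Hz floor)

    def next_refresh(render_time):
        """Seconds until the panel refreshes, for a frame that took render_time."""
        interval = max(render_time, MIN_INTERVAL)  # can't refresh faster than 144 Hz
        return min(interval, MAX_INTERVAL)         # can't extend vBlank past the floor

    for ms in (10, 16, 25):  # hypothetical GPU frame times
        print(f"{ms} ms frame -> panel refreshes after {next_refresh(ms / 1000) * 1000:.1f} ms")

Within the supported window the panel simply refreshes the instant a frame is done; only at the edges of the range does it fall back to clamping.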
The idea of G-Sync (and FreeSync) is pretty easy to understand, though the implementation method can get a bit more hairy. G-Sync (and FreeSync) introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating to the PC what rate this refresh occurs at, the graphics card now tells the monitor when to refresh in a properly configured G-Sync (and FreeSync) setup. This allows a monitor to match its refresh rate to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.
Gamers today are likely to be very familiar with V-Sync, short for vertical sync, which is an option in your graphics card’s control panel and in your game options menu. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory this works well, and the image is presented to the gamer without artifacts. The problem is that games played and rendered in real time rarely settle at one specific frame rate. With only a couple of exceptions, a game’s frame rate will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or a falling building. Instantaneous frame rates can vary drastically, from 30, to 60, to 90 FPS, yet V-Sync forces each image to be displayed only at set fractions of the monitor’s refresh rate, which causes problems.
If a frame takes more than the standard refresh time of a monitor to draw (if the frame rate drops low) then you will see a stutter, or hitch, in the gameplay that is caused by the display having to re-draw the same frame for a second consecutive interval. Any movement being tracked on the screen suddenly appears to stop – and then quickly “jumps” to the next location faster than your mind thinks it should. V-Sync also inherently introduces input latency to games when these lags and stutters take place.
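As a quick worked example of that quantization (simple arithmetic, not a measurement): with V-Sync on a 60 Hz panel, a finished frame can only be shown on the next 16.7 ms boundary, so effective rates snap to 60, 30, 20 FPS and so on:

    import math

    REFRESH_HZ = 60                  # fixed-rate panel with V-Sync enabled
    INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh

    for render_ms in (15, 17, 25, 34):              # hypothetical render times
        waits = math.ceil(render_ms / INTERVAL_MS)  # wait for the next boundary
        shown_ms = waits * INTERVAL_MS
        print(f"{render_ms} ms render -> displayed every {shown_ms:.1f} ms "
              f"({1000 / shown_ms:.0f} FPS effective)")

Note how a 17 ms frame, just barely missing the 16.7 ms window, halves the effective rate to 30 FPS – that discontinuity is the stutter described above.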
The common alternative for gamers worried about latency and stutter was to disable V-Sync in the control panel or in game. This solves the stutter and latency issues (kind of) but causes another, much more noticeable issue called “tearing”. With V-Sync disabled, the graphics card sends a new frame to the monitor at any point in the monitor’s refresh cycle, even if the LCD is currently drawing. The result is that, at some point down the screen, the user sees the previous frame above and the current frame below, split at a sharp line. You will literally be seeing an image where the geometry is no longer lined up, which, depending on the game and scene, can be incredibly distracting.
Monitors with refresh rates higher than 60 Hz reduce this tearing by having more frequent screen refreshes, and thus a tear is less likely to occur in any single refresh cycle, but the tearing is impossible to remove completely.
NVIDIA G-Sync (and FreeSync) switches things up by only having the monitor refresh its screen when a new frame is ready from the GPU. As soon as the next frame is drawn it can be passed to the display and drawn on the screen without tearing. If the next frame is ready in 16ms, it can be sent immediately. If it takes 25ms or only 10ms, it doesn’t matter; the monitor will wait for information from the GPU to draw the new frame. The result is an incredibly smooth and fluid animation that doesn’t stutter and doesn’t tear.
So since we know what G-Sync provides, what does AMD FreeSync want to do differently? The company hangs its hat on three distinct keys: no proprietary hardware, no closed standards and no licensing fees. Let’s address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scaler that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged; instead the vendor pays a fee in the range of $40-60 for the module that handles the G-Sync logic.
That leads us to AMD’s claim that FreeSync doesn’t require proprietary hardware, and clearly that is a fair and accurate benefit of FreeSync today. Displays that support FreeSync can use one of a number of certified scalers that support the DisplayPort AdaptiveSync standard. The VESA DP 1.2a+ AdaptiveSync feature is indeed an open standard, available to anyone that works with VESA, while G-Sync is only offered to those monitor vendors that choose to work with NVIDIA and purchase the G-Sync modules mentioned above.
Unfortunately, not all AMD Radeon graphics cards today will be able to run FreeSync monitors in their variable refresh state. For discrete GPUs, only the R9 290X, R9 290, R9 285, R7 260X and R7 260 have the ability to properly communicate with the FreeSync displays and offer users the advantages of VRR. If you own an HD 7000 series card or even an R9 280 or R9 270, you are going to need to upgrade your GPU in addition to your monitor to take advantage of the technology. For NVIDIA users though, G-Sync is supported on GTX 900-series, 700-series and 600-series cards nearly across the board, in part thanks to the implementation of the G-Sync module itself.
Let’s take a look at this AMD-created table above that compares FreeSync to G-Sync. We have already discussed the module requirement, open standards and licensing fees, but what about the other areas where AMD claims to hold the advantage? Because FreeSync monitors will use standard scalers from existing companies, they will have the full gamut of features that you expect in an LCD monitor. That includes audio output, internal scaling (for resolution changes), color processing, more input options and more. It’s been obvious when reviewing G-Sync monitors that some of these concerns stand out – only getting a single DisplayPort input is frustrating because there will likely be cases where you want to use your VRR capable monitor in a non-VRR configuration with an HDMI cable or DVI connection. You cannot do that with current generation G-Sync displays; it’s DisplayPort or nothing. FreeSync monitors will continue to offer a range of input options at the discretion of the display vendor.
The published refresh rate is more than a bit misleading. AMD claims that FreeSync is rated at 9-240 Hz while G-Sync is only quoted at 30-144 Hz. There are two issues with this. First, FreeSync doesn’t determine the range of variable refresh rates, AdaptiveSync does. Second, AMD is using the maximum and minimum refresh range published by the standards body, not an actual implementation or even a planned implementation from any monitor or display. The 30-144 Hz rating for G-Sync is actually seen in shipping displays today (like the ASUS PG278Q ROG Swift). The FreeSync monitors we have in our office are either 40-144 Hz (TN, 2560×1440) or 48-75 Hz (IPS, 2560×1080); neither of which is close to the 9-240 Hz seen in this table.
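For reference, the ranges discussed in this review, side by side (panel attributions based on the units in our office – the TN models are the Acer and BenQ, the IPS is the LG):

    Implementation                          Variable refresh range
    AdaptiveSync spec limits (AMD's table)  9-240 Hz (theoretical)
    G-Sync (ASUS PG278Q ROG Swift)          30-144 Hz (shipping)
    FreeSync TN 2560×1440 (Acer, BenQ)      40-144 Hz
    FreeSync IPS 2560×1080 (LG)             48-75 Hz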
We’ll touch on the implications of a performance degradation on G-Sync later in the story.
How do you not have a table comparing the refresh rate ranges on these FreeSync monitors???
I’m appalled at the number of reviews out there that are only testing the MAX refresh limit of FreeSync monitors. PCPer doesn’t stoop to such levels, and that’s why your reviews are some of the best of the best.
I’m also disappointed at how AMD has misled its potential customers by claiming 40-144Hz ranges on monitors that have 48Hz (LG) and 56Hz (BenQ) minimums, just because the non-FreeSync supported ranges go down to 24Hz on those monitors. 40Hz just so happens to fall between 24Hz and 56Hz, meaning minimums probably won’t be getting any lower [than 48Hz] for a while. *shakes finger @ AMD*
Ryan, your review confirmed a lot of the concerns I had after reading into FreeSync, after it was revealed that the BenQ XL2730Z was launching in the EU before North America.
I would buy a G-Sync monitor or just plain old 144Hz before touching a FreeSync monitor.
Yeah, it is better to buy defective monitors with better ranges that give you flickering because the ranges are crap.
Huh? You are not making any sense. What?
Turn FreeSync on and off and see if it has the same ghosting issue, or even use a non-FreeSync-compatible port and see if it has the same ghosting issue; otherwise it’s pure speculation due to lack of a methodical approach and nothing more.
If you turn on blur reduction the BenQ drops out of FreeSync mode, so you can't have both at this point.
Is it possible for Nvidia to implement FreeSync in the future into their existing GPUs like the 900 series, if they chose to?
Yes, it would be a simple implementation by providing DisplayPort 1.2a (or better; i.e. DisplayPort 1.3) and updating the Nvidia drivers to support the VESA Adaptive Sync standard. However, Nvidia has stated that they have no plans to do so. I for one believe that GSYNC is superior to the initial implementation of the Adaptive Sync standard, as is proven by the hardware reviews showing the difference in real world performance between the two technologies. This is mainly due to the fact that the GSYNC module has onboard memory to allow for frame buffering and frame rendering multiplication (re-drawing frames when the frame rate falls below the panel’s minimum native refresh rate), whereas the Adaptive Sync standard does not take this into account and cannot replicate that process at the driver level alone.
Is the DP in GTX 970 1.2? Otherwise it won’t be possible to do this.
…What no youtube video review?
(Seriously lol I’m not being sarcastic)
Ryan and Ken were both out covering GTC. We are going to do a video today.
Hi @Ryan Shrout
Did you read this? Please try to test AMA.
http://forums.overclockers.co.uk/showpost.php?p=27798666&postcount=170
Just tried it here. AMA options are off, high, and premium. Default is high. Changed between all three settings with the FreeSync windmill demo running and saw no change in ghosting at all (it still does it).
Who should I believe? Your comment or the forum post?
Regardless of that, your test implies the ghosting issue is affected by neither FreeSync nor the AMA options. That would mean the panel itself is at fault, not FreeSync.
The ghosting issue is present on all three FreeSync monitors tested and each of them uses a different panel. The panel’s fault? I think not. I smell something funny in the AMD camp…smells like rotten FreeSync 😛
Well the guy in the forum post would not have had a FreeSync driver (it was not out yet), so it's doubtful that he was in VRR mode.
The driver was released on 3/19 on AMD’s website. He made his post on 3/19 at 22:08. He had almost an entire day to play with it.
Has AMD mentioned anything about future support for this on their APUs? I could see this fitting in that ecosystem. I’m going to assume there is no support for the current lineup of APUs because none of them use the GPU architectures in the support list in the article.
Is it really that tough to use Google? It took me 1 minute to find this:
http://shop.amd.com/en-us/promo?k=freesyncprocessor&promo=fsapu
well 100+ posts down no-one is probably gonna read this, but I HAVE NEVER SEEN RYAN AND ALLYN ANSWER SO MUCH.
Possibility #1: this is an important issue and they are watching the comments to clarify, ORRRRRRRRRR
Possibility #2: there is some job at the office that needs to get done BUTTTT it’s a super tedious boring job and distractions are manna from heaven.
I’m guessing #2.
Ryan was traveling or he would have been more active. We have to stay active in posts that are technical and that also bring out the folks who are adamant about their particular / favorite display technology. It helps minimize the spread of misinformation.
Beautiful. That’s why I love this place.
Probability #1
They are brave and willing to confront all the reviewers that gave flying colors to this half-baked AMD s*th.
Why so mad?
Go get a hobby
Why are you trolling? GET A LIFE!
I’m disappointed in the LG panel’s performance. I really only like large displays and would like to get one. I’m currently on a 32 inch IPS 1080p TV, and could switch to one of the larger IPS monitor variants, but not if adaptive sync only works between 48-75Hz like the LG. I’m sure the IPS monitors will get better, but I might just go with a 40-45 inch 4K TV. It’s hard to decide when it’s going to be my new living room TV and my PC monitor.
Question, or possible edit: “For discrete GPUs, only the R9 290X, R9 290, R9 285…” It was my understanding that the R9 295×2 was also on that list. Is that no longer the case, or are you not including it because it’s not a single GPU (does FreeSync work in CrossFire at this point?)
More reading…….
Crossfire is not yet supported, so the latter.
Does this improve on making it less of an eye strain while looking at a computer display for long hours at a time?
Will it help people that use Excel, Documents, etc?
Not really. All of these displays remain at their max rate on the desktop. TN panels generally have less contrast so for desktop stuff I would recommend sticking with IPS (even despite the added ghosting of moving objects – IPS updates slower than TN).
Another way to look at it is:
• Regular screen: Keep the fps at 60+ (or whatever the monitor refresh is) or experience stutter or screen tearing.
• FreeSync: Keep the fps at 40+ (or whatever the screen manufacturer set as the minimum refresh rate of their panel) or experience stutter or screen tearing, plus ghosting due to no voltage control in the scaler for VRR. No extra charge for the adaptive refresh function since every new monitor will have this implemented from now on; the price right now is a consequence of market decisions, not technical ones.
• G-Sync: Keep the fps at 40+ or experience flicker due to optimistic frame refresh of the panel (no panel to date can refresh at 30 Hz without issues, no, really…). But you still get the advantage of no stutter, no screen tearing and no ghosting at 30 fps.
• G-Sync: fps below 30 and you still have a better than totally crappy experience… About 200+ more to purchase the monitor, and it only works with an Nvidia graphics card. That is not for a technical reason, since the logic for this is in the module and not a function in the graphics card. Another word for this is DRM.
Is this about right?
Can we also have POWER CONSUMPTION differences between FreeSync and G-Sync, as power seems to be a top priority for this site and most people? So let’s have it then.
These are LED backlit panels for the most part. Power draw is a small fraction of a single GPU, and likely lower than CPU usage as well.
Nvidia is clearly behind the curtain (remember FCAT?) on many FreeSync reviews/websites. I think AMD needs to step up and return the favour. I’m surprised AMD hasn’t done this already (they had a year to do it).
Poor showing from the AMD marketing/PR as always (let the heads roll)
The FCAT issue was an AMD fiasco and NOT Nvidia’s. AMD can’t step up because they are already getting caught in several lies:
1) Using the “FreeSync” moniker and company name to take false ownership and credit for the VESA Adaptive Sync standard.
2) Advertising monitors as having refresh rate ranges that are actually not supported by the panel itself (i.e. the BenQ XL2730Z has a minimum refresh rate of 56Hz, not the advertised 9Hz FreeSync minimum refresh rate).
3) Very limited support of FreeSync by existing AMD graphics cards because FreeSync actually is a proprietary hardware solution, just like Nvidia’s GSYNC (but FreeSync is nowhere near as good!).
4) Crossfire mode does NOT support FreeSync – SLI does support GSYNC!
5) FreeSync and Blur reduction can NOT be used at the same time – just like GSYNC and Ultra Low Motion Blur (ULMB) can not be used at the same time!
So, it is not looking good at all in the AMD camp right now. I expect several people from AMD to be fired over this whole debacle called FreeSync very soon.
So, in a 30-144 Hz G-Sync monitor, we have measured that when the frame rate actually gets to 29 FPS, the display is actually refreshing at 58 Hz, each frame being “drawn” one extra instance to avoid flicker of the pixels but still maintains a tear free and stutter free animation.
How can you describe redrawing the same frame twice because the next is not ready as “stutter free”? Is not the very act of showing the same frame, no matter how many times you refresh it, the very definition of stutter? Do we have a placebo effect going on here? Are your eyes really fooled that easily?
Are you that stupid?!?
@Mac Can confirm your idea of stuttering from Tom Petersen explaining G-Sync: https://www.youtube.com/watch?v=ZzJl2Ul1x_M
I really don’t understand what G-Sync is doing when the framerate is beyond its variable refresh range.
Yes I am, are you that clever? If so please enlighten my stupid self.
Pixels drift when left unrefreshed for extended periods of time. Drawing an additional identical frame in the window of time where no other frame is incoming does not cause any additional stutter. It's just keeping the panel current.
According to this article, G-Sync doubles the refresh rate when the framerate drops below the VRR threshold.
So technically there are 2 refreshes for one new frame.
According to Tom Petersen in this G-Sync explanation video(https://www.youtube.com/watch?v=ZzJl2Ul1x_M), if you refresh the same image twice (or multiple times), you induce stutter.
So keeping the panel current with (aka drawing) the same/identical image twice (or more than once) causes stutter. Am I right?
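For what it’s worth, here is a sketch of the arithmetic the article describes, assuming a 30-144 Hz panel (the module’s exact logic isn’t public, so treat this purely as an illustration):

    import math

    PANEL_MIN_HZ = 30    # the panel can't hold a frame longer than 1/30 s
    PANEL_MAX_HZ = 144

    def gsync_scanout(fps):
        """Repeats per frame and resulting panel rate at a given game fps."""
        if fps >= PANEL_MIN_HZ:
            return 1, fps                        # normal 1:1 variable refresh
        repeats = math.ceil(PANEL_MIN_HZ / fps)  # redraw enough to stay above the floor
        return repeats, min(repeats * fps, PANEL_MAX_HZ)

    for fps in (29, 20, 12):
        n, hz = gsync_scanout(fps)
        print(f"{fps} FPS -> each frame scanned {n}x, panel runs at {hz} Hz")

The repeated scans show a frame you are already looking at, inserted while waiting for the next one; the new frame is still displayed the moment it is ready, so no motion step is delayed – which is why it isn’t perceived as stutter.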
I’d like to use VRR to turn up the knobs on graphics features and have a smooth experience around 40-60 and handle when the frames drop to 20ish in a particularly intense scene. I suppose my main gripe with both is that you’re locked in to a certain system when you buy in today.
This is my summary/understanding of both systems as of now:
Freesync system (this includes AMD and their monitor partners as Freesync is all the software/hardware combined to provide a better experience)
Today:
+ lower cost
– currently AMD only
– 1 gpu only
– certain GCN GPUs only
– ghosting issues
+ smooth gameplay within VRR range (what it’s supposed to do)
+ allows vsync off over VRR range which doesn’t add lag for twitch games
– only vsync off/on under VRR range, creates jarring effect if dropping in and out of VRR range (I see this as the main issue)
? do we see flickering when the game engine drops suddenly to 1fps?
Future updates should include: (this is a personal list that I would like to see to make the system perfect)
* crossfire support
* overdrive enabled during VRR to reduce/eliminate ghosting
* better handling of refresh below VRR range or panels that have a much lower min refresh rate.
Gsync system (this includes Nvidia and their monitor partners as Gsync is all the software/hardware combined to provide a better experience)
Today:
– costs hundreds more (This would be my main issue)
– only Nvidia
– kepler and newer
+ smooth gameplay within VRR range (also what it’s supposed to do)
– only vsync above VRR range, would need to limit frame rate to less than the max panel refresh rate to eliminate vsync latency
+ smoothly handles frame rates below VRR range in retail products, beta diy units had an issue of judder below 20ish-30fps
– flickering when fps suddenly drops to 1fps like in certain menus
– most of these monitors only have dp input
Future updates should include: (this is a personal list that I would like to see to make the system perfect):
* lower the price
* allow licensing at least
* allow option of vsync off over VRR range
* wider selection of monitors with more inputs
* handle sudden drops in fps or have game developers aware that they shouldn’t do this
Bottom line: if someone was forcing me to buy one system today I’d have to buy G-Sync. It has quirks that have been worked around due to the amount of time it’s been on the market. The main issue in the FreeSync system is a deal breaker for me because it’s one of the main reasons to get a VRR system. We also don’t have as much info simply because it’s only just been launched.
However as of now I’m still going to wait for gen2 of these products. I haven’t experienced VRR so I don’t know what I’m missing yet. Perhaps one system will address all of these issues.
I bought the 144 Hz 24″ G-Sync panel from AOC a few months ago when I did a system upgrade. I wanted to see what >60 Hz gaming and VRR tech was like as I had no way of seeing it in person before purchasing. It really is a huge upgrade in visual quality in games that support it. AMD’s (mostly) successful launch of FreeSync is great as it will drive prices down and bring this tech to AMD users also.
One of the downsides of VRR implementations today that is rarely mentioned, is that not all popular games support it. To get G-Sync, you have to play in fullscreen mode. Blizzard games (Diablo 3, Starcraft 2) especially do not correctly use fullscreen mode, preventing G-Sync.
The other downside is that color calibration profiles, like used by the Spyder4Express, do not work in fullscreen mode. So you can only have G-Sync by sacrificing color accuracy. ***please correct me if I’m wrong on this***
That said, Dragon Age Inquisition looks amazing with G-Sync + GTX 970 @60-70 fps. Panning is buttery smooth. It really is a huge upgrade over non-VRR tech. Dota 2 with G-Sync @144 Hz solid looks super good too. It’s not as necessary though, as tearing and judder matter more in first and third person games that are slower paced. If you’re gawking at graphics in Dota 2 or SC2 you’re doing it wrong!
Ghosting on the BenQ… Is there ghosting on the BenQ? With or without FreeSync, on stock factory settings? Yes..
Is it possible to fix it with a simple tweak? Yes..
The contrast and the AMA setting on the BenQ are set far too high; because it’s a competitive gaming monitor they reduce all the brightness and colour to make it easy to spot people.. My older BenQ was the same..
Some tweaking of the contrast, lowered down to 40%, and changing AMA from premium to high… Ghosting is now gone.. I also switched from FPS mode to standard.