What is FreeSync?
We have our first set of AMD FreeSync monitors in the office and are ready to talk about the variable refresh experience they offer.
FreeSync: What began as merely a term for AMD’s plans to counter NVIDIA’s launch of G-Sync (and a mocking play on NVIDIA’s trade name) has finally come to fruition, keeping the name – and the attitude. As we have discussed, AMD’s Mantle API was crucial to pushing the industry in the necessary direction of lower level APIs, and NVIDIA’s G-Sync deserves the same credit for recognizing, and demonstrating, the need for a move to variable refresh display technology. Variable refresh displays can fundamentally change the way that PC gaming looks and feels when they are built correctly and implemented with care, and we have seen that time and time again with the many different G-Sync enabled monitors that have passed through our offices. It might finally be time to make the same claims about FreeSync.
But what exactly is FreeSync? AMD has been discussing it since CES in early 2014, claiming that it would bypass the need for a custom module inside the monitor to support VRR and instead go the route of open standards, using a modification to DisplayPort 1.2a from VESA. FreeSync is based on AdaptiveSync, an optional portion of the DP standard that enables a variable refresh rate by expanding the vBlank timings of a display, and it also provides a way of updating EDID (display ID information) to communicate these capabilities to the graphics card. FreeSync itself is simply the AMD brand for this implementation, combining the monitors with correctly implemented drivers and GPUs that support the variable refresh technology.
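At its core, what a display has to communicate upstream is just a supported refresh range. As a rough illustration, here is a minimal sketch of that idea in Python; the field names are my own shorthand, not the actual EDID byte layout:

```python
from dataclasses import dataclass

@dataclass
class VrrRange:
    """Stand-in for the refresh range a display advertises via
    EDID under DisplayPort AdaptiveSync (illustrative only)."""
    min_hz: int  # slowest refresh the panel can hold
    max_hz: int  # fastest refresh the panel supports

    def matches(self, fps: float) -> bool:
        # The display can refresh in lockstep with the GPU only
        # while the frame rate stays inside this window.
        return self.min_hz <= fps <= self.max_hz

# One of the FreeSync review units advertises a 40-144 Hz window.
panel = VrrRange(min_hz=40, max_hz=144)
print(panel.matches(55))  # True: the refresh follows the GPU
print(panel.matches(32))  # False: below the panel's floor
```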
A set of three new FreeSync monitors from Acer, LG and BenQ.
Fundamentally, FreeSync works in a very similar fashion to G-Sync, using the vBlank timings of a monitor to change how and when it updates the screen. The vBlank signal is what tells the monitor to begin drawing the next frame, marking the end of the current data set and the beginning of a new one. By varying how long this vBlank interval is held, you can force the monitor to wait any amount of time necessary, allowing the GPU to end the vBlank instance exactly when a new frame is done drawing. The result is a variable refresh rate monitor, one that is in tune with the GPU render rate rather than opposed to it. Why is that important? I wrote about this in great detail previously, and it still applies in this case:
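To make that mechanism concrete, here is a toy model of vBlank stretching in Python. The scanout time and frame times are invented numbers for illustration; this is a conceptual sketch, not driver code:

```python
# Conceptual model: after scanning out a frame, the panel idles in
# vBlank until the GPU signals the next frame is done, so vBlank
# length absorbs all the frame-to-frame variation.
SCANOUT_MS = 1000 / 144  # assume the panel scans out at its max rate

frame_render_ms = [16.7, 25.0, 10.0, 12.5]  # hypothetical frame times

for ft in frame_render_ms:
    vblank_ms = max(ft - SCANOUT_MS, 0.0)  # idle time spent in vBlank
    print(f"render {ft:5.1f} ms -> vBlank held open {vblank_ms:5.1f} ms")
```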
The idea of G-Sync (and FreeSync) is pretty easy to understand, though the implementation method can get a bit more hairy. G-Sync (and FreeSync) introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating to the PC what rate this refresh occurs at, in a properly configured G-Sync (and FreeSync) setup the graphics card now tells the monitor when to refresh. This allows a monitor to match its refresh rate to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.
Gamers today are likely to be very familiar with V-Sync, short for vertical sync, an option in your graphics card’s control panel and in your game options menu. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory this works well, and the image is presented to the gamer without artifacts. The problem is that games rendered in real time rarely hold a specific frame rate. With only a couple of exceptions, a game's frame rate will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or a falling building. Instantaneous frame rates can vary drastically, from 30, to 60, to 90 FPS, yet V-Sync forces each frame to be displayed only at set fractions of the monitor's refresh rate, which causes problems.
If a frame takes more than the standard refresh time of a monitor to draw (if the frame rate drops low), you will see a stutter, or hitch, in the gameplay, caused by the display having to re-draw the same frame for a second consecutive interval. Any movement being tracked on the screen suddenly appears to stop – and then quickly “jumps” to the next location faster than your mind expects. V-Sync also inherently introduces input latency when these lags and stutters take place.
The common alternative for gamers worried about latency and stutter is to disable V-Sync in the control panel or in game. This solves the stutter and latency issues (kind of) but causes another, much more noticeable issue called "tearing". With V-Sync disabled, the graphics card sends a new frame to the monitor at any point in the monitor's refresh cycle, even if the LCD is in the middle of drawing. The result is that, at some point down the screen, the user sees the previous frame above and the current frame below, split at a sharp line. You are literally seeing an image where the geometry no longer lines up, which, depending on the game and scene, can be incredibly distracting.
Monitors with refresh rates higher than 60 Hz reduce the impact of tearing: with more frequent screen refreshes, each torn frame spends less time on screen, but the tearing is impossible to remove completely.
NVIDIA G-Sync (and FreeSync) switches things up by having the monitor refresh its screen only when a new frame is ready from the GPU. As soon as the next frame is drawn, it can be passed to the display and drawn on the screen without tearing. If the next frame is ready in 16ms, it can be sent immediately. If it takes 25ms or only 10ms, it doesn’t matter; the monitor waits for information from the GPU before drawing the new frame. The result is an incredibly smooth and fluid animation that doesn’t stutter and doesn’t tear.
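A short sketch of the two scheduling policies shows that difference directly. Assume a fixed 60 Hz panel for the V-Sync case and a 40-144 Hz window for the variable refresh case; this models the logic only, not any real driver code:

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz panel for the V-Sync case

def vsync_display_ms(render_ms: float) -> float:
    """With V-Sync, a frame waits for the next fixed refresh tick,
    so its on-screen arrival is quantized to the refresh interval."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def vrr_display_ms(render_ms: float,
                   lo_ms: float = 1000 / 144,
                   hi_ms: float = 1000 / 40) -> float:
    """With G-Sync/FreeSync, the refresh fires the moment the frame
    is ready, clamped to what the panel supports (40-144 Hz here)."""
    return min(max(render_ms, lo_ms), hi_ms)

for render in (10, 16, 25):
    print(f"{render} ms frame -> V-Sync waits {vsync_display_ms(render):.1f} ms, "
          f"VRR waits {vrr_display_ms(render):.1f} ms")
```

The 25 ms frame is the telling case: V-Sync holds it until the 33.3 ms tick, producing the stutter described above, while the variable refresh display simply draws it at 25 ms.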
So, since we know what G-Sync provides, what does AMD FreeSync want to do differently? The company hangs its hat on three distinct claims: no proprietary hardware, no closed standards and no licensing fees. Let’s address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What NVIDIA does charge is a price for the G-Sync module, a piece of hardware that replaces the scaler that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged; instead, the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic.
That leads us to AMD’s claim that FreeSync doesn’t require proprietary hardware, and clearly that is a fair and accurate benefit of FreeSync today. Displays that support FreeSync can use one of a number of certified scalers that support the DisplayPort AdaptiveSync standard. The VESA DP 1.2a+ AdaptiveSync feature is indeed an open standard, available to anyone that works with VESA, while G-Sync is only offered to those monitor vendors that choose to work with NVIDIA and purchase the G-Sync modules mentioned above.
Unfortunately, not all AMD Radeon graphics cards today will be able to run FreeSync monitors in their variable refresh state. For discrete GPUs, only the R9 290X, R9 290, R9 285, R7 260X and R7 260 have the ability to properly communicate with the FreeSync displays and offer users the advantages of VRR. If you own an HD 7000 series card or even an R9 280 or R9 270, you are going to need to upgrade your GPU in addition to your monitor to take advantage of the technology. For NVIDIA users though, G-Sync is supported on GTX 900-series, 700-series and 600-series cards nearly across the board, in part thanks to the implementation of the G-Sync module itself.
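For quick reference, the discrete-GPU support described above reduces to a simple lookup; this helper is hypothetical, not part of any AMD tool:

```python
# Discrete Radeon GPUs with FreeSync support at launch, per the
# list above; the R9 280/270 and HD 7000 series are notably absent.
FREESYNC_CAPABLE = {"R9 290X", "R9 290", "R9 285", "R7 260X", "R7 260"}

def can_drive_freesync(gpu: str) -> bool:
    return gpu in FREESYNC_CAPABLE

print(can_drive_freesync("R9 285"))  # True
print(can_drive_freesync("R9 280"))  # False: a GPU upgrade is needed too
```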
Let’s take a look at this AMD-created table above that compares FreeSync to G-Sync. We have already discussed the module requirement, open standards and licensing fees, but what about the other areas where AMD claims to hold the advantage? Because FreeSync monitors will use standard scalers from existing companies, they will have the full gamut of features that you expect in an LCD monitor. That includes audio output, internal scaling (for resolution changes), color processing, more input options and more. It has been obvious when reviewing G-Sync monitors that some of these concerns stand out – only getting a single DisplayPort input is frustrating because there will likely be cases where you want to use your VRR-capable monitor in a non-VRR configuration over an HDMI cable or DVI connection. You cannot do that with current generation G-Sync displays; it’s DisplayPort or nothing. FreeSync monitors will continue to offer a range of input options at the discretion of the display vendor.
The published refresh rate is more than a bit misleading. AMD claims that FreeSync is rated at 9-240 Hz while G-Sync is only quoted at 30-144 Hz. There are two issues with this. First, FreeSync doesn’t determine the range of variable refresh rates; AdaptiveSync does. Second, AMD is using the maximum and minimum refresh range published by the standards body, not an actual implementation or even a planned implementation from any monitor or display. The 30-144 Hz rating for G-Sync is actually seen in shipping displays today (like the ASUS PG278Q ROG Swift). The FreeSync monitors we have in our office are either 40-144 Hz (TN, 2560×1440) or 48-75 Hz (IPS, 2560×1080), neither of which is close to the 9-240 Hz seen in this table.
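Putting those quoted windows side by side makes the gap plain. The ratio of ceiling to floor is a rough measure of how much frame rate swing a monitor can absorb before falling out of its VRR range:

```python
# VRR windows in Hz: the spec's paper maximum vs. shipping panels,
# using the figures quoted in this review.
windows = {
    "AdaptiveSync spec (paper)": (9, 240),
    "G-Sync (ASUS PG278Q)":      (30, 144),
    "FreeSync TN 2560x1440":     (40, 144),
    "FreeSync IPS 2560x1080":    (48, 75),
}

for name, (lo, hi) in windows.items():
    # A wider ratio means more headroom for fps fluctuation.
    print(f"{name}: {lo}-{hi} Hz (ratio {hi / lo:.1f}x)")
```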
We’ll touch on the implications of a performance degradation on G-Sync later in the story.
Glad to see the current state of things with these, hopefully both AMD and Nvidia are able to improve their technologies based on feedback such as this. Thanks for all the time you guys put into these stories!
Allyn, am I missing something…? Why doesn’t the GPU just downclock to avoid going over the high ceiling of the VRR window?
It would have been nice if you had actually played some games in your “Gaming with FreeSync” test. It looks like the only game played was the Windmill tech demo. That must have been awesome.
Seriously though, white on black at super high contrast is the perfect stress test for ghosting. I’d like to have seen some real-world testing, and some actual thoughts on real gaming performance.
What it turns out to be is that BenQ and Acer updated the firmware, and now overdrive can be used in conjunction with FreeSync, just like G-Sync. The FreeSync technology had NOTHING to do with ghosting.
TOTALLY UNACCEPTABLE THAT PC PERSPECTIVE IS GIVING AMD’s FREESYNC A BAD NAME!
RECTIFY THIS ARTICLE!
Are you guys going to retest with the firmware fix? Overdrive wasn’t working over DisplayPort; that has now been fixed with a BenQ XL2730Z firmware update. The version starts with V002. I’ve updated mine, and only after the update did overdrive work, even with FreeSync off. That is what was causing the ghosting.
The problem with this monitor is the low 48 Hz cutoff point. That is what makes it of extremely limited use for people who seek FreeSync; the 75 Hz FreeSync ceiling isn’t really the issue here.
This means you will realistically need a minimum of at least 50 fps in games.
In most games on AMD cards the frame rate fluctuates heavily, and the higher the fps, the more fluctuation there is. You can drop from 100 fps to 60 fps, for example, or from 70 fps into the low 40s.
To get a stable 50 fps in Metro 2033 at this screen’s resolution (on a 290 overclocked to 1250/1450, faster than 390X speed), you have to drop the settings to medium, or run DX9 with very high settings instead of DX11 medium; trade-offs that are not worth it in my opinion. Especially since FreeSync and G-Sync would really shine in the 30-60 fps range, which is pretty much unused with this monitor.
Because why let it push out of the FreeSync range and get tearing when you bought the screen specifically for the no-tearing part? That’s why the 50 fps minimum will be needed.
Another example: The Witcher 3 has to run at a mix of medium/high/cut-down ultra settings to get a rock-solid 50 fps.
In my view, the games that will really shine on this screen are titles like Path of Exile, MMOs, Diablo or League of Legends. Tearing is quite noticeable in those games, and 48 fps is not hard to hit at all; locking the frame rate at 75 fps and probably getting 60+ fps lows, it will surely be a solid, tear-free experience with a lot more screen space to work with.
But for new games, this monitor simply isn’t going to work.
Just bought a BenQ XL2730Z; FreeSync on or off doesn’t change my screen tearing at all… bad monitor?