It finally happened, later than I had expected: we got hands-on time with nearly-ready FreeSync monitors! That's right, AMD's alternative to G-Sync will bring variable refresh gaming technology to Radeon gamers later this quarter, and AMD had the monitors on hand to prove it. On display were an LG 34UM67 running at 2560×1080 on IPS technology, a Samsung UE590 with a 4K resolution and AHVA panel, and a BenQ XL2730Z 2560×1440 TN screen.
The three monitors on display at the AMD booth showcase the wide array of units that will be available this year using FreeSync, possibly even this quarter. The LG 34UM67 uses the 21:9 aspect ratio that is growing in popularity, along with solid IPS panel technology and a 60 Hz top refresh rate. However, there is a new specification to be concerned with on FreeSync as well: minimum frequency. This is the refresh rate the monitor needs to maintain to avoid the artifacting and flickering that would be visible to the end user. For the LG monitor that floor is 40 Hz.
What happens below that lower limit and above the upper one differs from what NVIDIA has decided to do. For FreeSync (and the Adaptive Sync standard as a whole), when a game renders at a frame rate above or below this VRR window, the V-Sync setting is enforced. That means on a 60 Hz panel, if your game runs at 70 FPS, you will have the option to enable or disable V-Sync; you can either force a 60 FPS top limit or allow 70 FPS with screen tearing. If your game runs under the 40 Hz bottom limit, say at 30 FPS, you get the same option: V-Sync on or V-Sync off. With it off you get tearing but optimal input/display latency; with it on you reintroduce the frame judder that appears as you cross between V-Sync steps.
There are potential pitfalls to this solution though; what happens when you cross into that top or bottom region can cause issues depending on the specific implementation. We'll be researching this very soon.
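To make that behavior easier to picture, here is a minimal sketch of the decision described above, written as illustrative Python rather than anything AMD has published; the 40/60 Hz window matches the LG panel, and the function and flag names are my own.

```python
# Illustrative sketch only: how a FreeSync panel with a 40-60 Hz VRR window
# behaves based on the description above. Not a real driver API.
VRR_MIN_HZ = 40
VRR_MAX_HZ = 60

def display_behavior(game_fps: float, vsync_enabled: bool) -> str:
    if VRR_MIN_HZ <= game_fps <= VRR_MAX_HZ:
        # Inside the window the panel refreshes the moment a frame is ready.
        return f"variable refresh at {game_fps:.0f} Hz (no tearing, no added lag)"
    if vsync_enabled:
        if game_fps > VRR_MAX_HZ:
            # Above the window with V-Sync on: capped at the panel maximum.
            return f"capped to {VRR_MAX_HZ} FPS (no tearing, some input lag)"
        # Below the window with V-Sync on: classic V-Sync judder returns.
        return "locked to a divisor of the max refresh (judder, no tearing)"
    # V-Sync off outside the window: frames go out as soon as they finish.
    return "frames shown immediately (tearing, lowest latency)"

for fps in (30, 45, 70):
    for vsync in (True, False):
        print(f"{fps} FPS, V-Sync {'on' if vsync else 'off'}: {display_behavior(fps, vsync)}")
```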
Notice this screen shows FreeSync Enabled and V-Sync Disabled, and we see a tear.
FreeSync monitors have the benefit of using industry-standard scalers, and that means they won't be limited to a single DisplayPort input. Expect to see a range of inputs, including HDMI and DVI, though the VRR technology will only work over DisplayPort.
We have much more to learn and much more to experience with FreeSync, but we are eager to get one into the office for testing. I know, I know, we say that quite often, it seems.
PC Perspective's CES 2015 coverage is sponsored by Logitech.
Follow all of our coverage of the show at https://pcper.com/ces!
Are you sure the LG 34UM67 has a top frequency of 60 Hz? AMD's press release says 75 Hz…
Thanks for the first impressions guys. I wanted to see if I should wait for FreeSync before committing to G-Sync. Turning V-Sync on at a frame rate above or below the variable refresh rate window is a deal breaker for me, because input lag and stuttering are not going away with this solution if your FPS is not always inside the VRR window.
Also, can you remind me what G-Sync does if your FPS is below or above the VRR window?
If what you say about FreeSync is true and G-Sync has a better solution, then I think the only thing I have to wait for is a good IPS 144 Hz G-Sync monitor.
Also, can you remind me what G-Sync does if your FPS is below or above the VRR window?
I'm pretty sure G-Sync does the same: above the refresh rate it V-Syncs, and below it I think it lets go or V-Syncs as well.
I think Nvidia did mention it in one of the slides. In that condition it changes the refresh rate of the screen to stay with the FPS. That way latency remains constant and you get whatever your FPS is.
If the screen refresh rate is no longer equal to the FPS below 30 FPS, THEN YOU ARE NOT GETTING "WHATEVER YOUR FPS IS".
Well, there are two options.. either it uses V-Sync (with double buffering of course.. an on-the-fly switch to triple buffering is not possible since that buffer would have to be written first), meaning from then on you play at 15 FPS but without tearing (like that's going to help), or it disables everything and you'll get tearing but your proper FPS.
Thinking about this again I think a seamless transition to triple buffering V-sync below the minimum should actually be possible.. my bad.
The same thing would be true for Freesync however.
I’m not sure what you think GSync is doing in the same situation. It can’t magically make a monitor refresh faster or slower than it is capable of.
If you have an up to 60Hz panel and 80fps, you are either going to wait for the next refresh to show a full frame, or you’re going to have tearing, there’s no way around it.
Seems like with FreeSync you get to make that choice for yourself, while with G-Sync it seems the decision is made for you.
This is probably why most of the GSync monitors are 144hz – less likelihood of going over that, and less waiting time when you do go over. And a larger VRR window of course.
And you’d be wrong, G-Sync does handle this better because at no point do you need to enable V-Sync to avoid tearing above max refresh, thus undoing much of the benefit of this tech to begin with.
At higher than monitor refresh rates, the monitor continues to update at its max refresh rate with no input lag and no tearing, because the G-Sync module with the onboard DRAM (y'know, the same magic stuff AMD and their fanboys thought was extraneous and unnecessary) actively compares, holds, and renders each frame, using the DRAM as a lookaside buffer.
So, for any frames above what the monitor can refresh, the G-Sync module holds and compares them and chooses whether to display the new frame, hold it, or throw it out.
I guess all that pricey, proprietary hardware was justified after all! 😀
But yes it is nice to see AMD finally show us something with FreeSync beyond slidedecks, fixed refresh windmills and empty promises, but now they have shown it, they have also confirmed what many of us expected:
G-Sync isn't going anywhere, it's better than FreeSync, and whatever premium Nvidia is charging will be justified.
Above the max refresh rate G-Sync behaves like V-Sync with double buffering enabled (which doesn't suffer any stuttering or the more pronounced latency issues of triple buffering)… the rate of pictures drawn cannot exceed the maximum refresh rate anyway, and if the display is not ready for the next refresh the GPU will have to wait with the already rendered image until it can display it.. meaning by the time this image reaches the monitor it will already be outdated by a couple of milliseconds…
If I disable this behaviour the image will be drawn exactly when it’s ready.. meaning that I’ll get tearing, but I’ll get a glimpse of the next image before it *should* actually be displayed (which would be on the next refresh)
Now this may or may not be a tiny advantage in fast paced online shooters .. I wouldn’t know… but it doesn’t hurt to leave the choice up to the user.
Sorry, this is incorrect. Where did you read this?
G-Sync above max refresh has the option to hold a frame and wait for the next frame after the monitor refreshes, effectively reducing the actual frame rate output of the monitor, because the G-Sync module is dynamically changing and controlling the monitor's refresh rate. Unlike typical refresh and V-Sync, where the GPU is still slave to the monitor's refresh and the next frame must be rendered, regardless of how old it is, based on the timing of the monitor's refresh.
So in a worst case scenario, with uneven frametimes, you might see 6-7ms of input lag/latency on a 120Hz monitor (8.33ms between frame refresh).
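A quick back-of-the-envelope check of the numbers quoted above (my own arithmetic, not a measurement):

```python
# On a 120 Hz panel the refresh interval bounds how long a finished frame can
# be held before it is shown, which is where the quoted "6-7 ms" worst case sits.
refresh_hz = 120
refresh_interval_ms = 1000 / refresh_hz   # ~8.33 ms between refreshes
print(f"Refresh interval: {refresh_interval_ms:.2f} ms; "
      f"worst-case added latency is bounded by just under one interval.")
```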
Also the GPU can only discard if it’s rendering twice as fast as the panel refresh.. everything else will have to be put on hold and then displayed. Same behaviour as V-sync with double buffering, which always halves the refresh rate if the full rate cannot be achieved.
Again, incorrect, the onboard DRAM on the G-Sync PCB acts as a lookaside buffer that allows the G-Sync module to compare, hold, and send frames to display. 768MB is capable of storing a whole lot of 4-20MB frames.
All that awesome knowledge Tom Petersen dumped on us in interviews on this very site starting to pay off!
Guess that G-Sync module and DRAM wasn’t extraneous and unnecessary after all!
Spot on!
G-Sync is effectively V-Sync on, so you max out at 144 Hz, and when you go below 30 FPS the screen refreshes at 144 Hz and the frame rate is then V-Synced onto the display. Because it's a very high refresh rate it's not as big of a concern, and of course once you go below 30 FPS the problem this causes is less important because it's already an unplayable frame rate anyway.
But with some of the FreeSync monitors having minimums of 40 it's much more of a concern what they choose to do and whether they are V-Syncing, especially if the peak is only 60 Hz; then the stutter introduced will be more than double that of the G-Sync 144 Hz monitors.
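The "more than double" claim is easy to sanity-check; this is just my own arithmetic on the refresh intervals, not anything from the article:

```python
# Rough size of a V-Sync "judder step": a frame that misses its refresh is held
# for one extra refresh interval, so the step scales with the panel's max rate.
for hz in (60, 144):
    step_ms = 1000 / hz
    print(f"{hz:>3} Hz panel: a missed refresh adds ~{step_ms:.1f} ms")
# ~16.7 ms vs ~6.9 ms, i.e. roughly 2.4x larger on the 60 Hz panel.
```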
Then it would be a good idea to buy a gaming monitor with a wide range.. such as 30Hz-144Hz as presented in the video.. don’t you think?
you only get the stuttering and high latency with triple buffering .. and above the max refresh rate you cannot get the stutter at all, regardless of the number of buffers
I doubt that's what's being used here since you can't just start writing a third buffer whenever the FPS exceeds the refresh rate; that would create a transition time.
also there would be zero advantage to using triple buffering v-sync above the maximum refresh rate since it would behave exactly like double buffering but with added latency
Thanks for the video Ryan. Most Nvidia fanboys all over the internet, paid or not, lately have been saying literally that FreeSync doesn't really exist, or that it is buggy and not going to be supported. Funny but also true.
And how will you get a FreeSync monitor? AMD fanboys will pay for a free open standard.
Oh, and Tomb Raider running without TressFX.
No open source Mantle as promised.
AMD fail.
Here we go!!!
Stupidity at its finest. As a brave anonymous post.
No, most Nvidia fans figured once AMD got around to finally showing something it would be half-baked and worse than what Nvidia already brought to market, as was the case so many times before. And we were right.
Back to enjoying G-Sync, anyone who didn’t bother waiting on AMD FreeSync can certainly feel better in their decision making as a result of this demonstration.
I watched a guy fix a foreign transmission oncet!
Also,… just put up a global fps cap 1fps below the max refresh rate and YOU ARE GOOD FOREVER.
Never experiencing vsync and never seeing any tearing, as long as you stay above 40fps, which you want to do ANYWAY.
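For what it's worth, the frame-cap idea above is simple enough to sketch. This is only an illustration of the concept (in practice you would use RivaTuner or a driver-level limiter); the names are mine and the 75 Hz figure comes from the LG number quoted elsewhere in the comments.

```python
# Concept sketch of capping FPS just below the panel's maximum refresh rate so
# a FreeSync/G-Sync monitor never leaves its variable refresh window at the top.
import time

MAX_REFRESH_HZ = 75
CAP_FPS = MAX_REFRESH_HZ - 1        # stay 1 FPS under the ceiling
FRAME_BUDGET_S = 1.0 / CAP_FPS

def run_capped_frames(n_frames: int) -> None:
    last = time.perf_counter()
    for _ in range(n_frames):
        # ... render the frame here ...
        elapsed = time.perf_counter() - last
        if elapsed < FRAME_BUDGET_S:
            time.sleep(FRAME_BUDGET_S - elapsed)   # hold the frame back
        last = time.perf_counter()

run_capped_frames(10)
```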
Load of bullcrap.
As long as there’s no syncing happening you *will* see tearing, no matter what your refresh rate or FPS are at.. educate yourself on the issue before making tall claims.
I think he was referring to if you are using a Freesync (or even G-sync, really) monitor, as a method to avoid the issue of going higher than the FPS window.
Thanks for digging up the additional details on how Freesync will work 🙂 I’m really interested to see whether and how the addition of VSync will compromise the experience inside the “Freesync-Window”, or whether traditional VSync will actually be disabled inside the window and only be activated below 30/40Hz and above the 60/144Hz max refresh rate (effectively acting as an fps cap at max.)
If VSync is not disabled inside the “Freesync-Window”, I see two possibilities as usual: double vs. triple buffered VSync.
With double buffering, actual frames rendered are a divider of the max refresh rate, which would enable the screen to actually still refresh at, for example, its max rate – which then would allow for black frame insertion on the BenQ Z for example 😀 Blur reduction in this manner is normally not available for logical reasons in conjunction with Freesync / GSync variable VBlank, but with double buffering VSync it still could be.
With triple buffering, the screen refresh rate can be tied to the actual frames rendered thanks to Freesync, just as it could with Freesync without any additional VSync. But you may incur a latency penalty, which would make triple buffering actually worse than double buffering in some ways in these hypothetical scenarios. Or there is the possibility "triple buffering VSync + Freesync" would mean that the panel still refreshes at its max refresh rate for some unoptimized reason. And that then could introduce judder as mentioned in the article. Hm…
So again, I'm really interested in an actual thorough investigation into how the combination VSync + Freesync works in practice, not only educated guesses or claims, such as the judder mentioned in the article seems to be.
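The double-buffering point above (the displayed rate snapping to a divisor of the max refresh) can be illustrated with a tiny sketch; this is my own steady-state model of that behavior, not code from any driver.

```python
# Steady-state model of double-buffered V-Sync: if the GPU can't sustain the
# panel's full refresh rate, the displayed rate drops to the next integer
# divisor of the maximum refresh rate.
def double_buffered_vsync_rate(gpu_fps: float, max_refresh_hz: int) -> float:
    divisor = 1
    while max_refresh_hz / divisor > gpu_fps:
        divisor += 1
    return max_refresh_hz / divisor

for fps in (150, 100, 70, 50, 35):
    shown = double_buffered_vsync_rate(fps, 144)
    print(f"GPU at {fps} FPS on a 144 Hz panel -> {shown:.1f} Hz displayed")
```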
If you read some other comments before posting you will see that you are incorrect, and that the only issue there is can be solved with a simple FPS cap.
WTF, with 4K having a min of 40 FPS it's even more pointless than 30 with G-Sync.
So, FreeSync works only in the range where I don’t see a ton of tearing anyway? :-/
Seems like this tech will mostly be good for 4k then, to smooth out frame variances.
I don’t know if I understand here – if you have to turn on VSYNC, does that mean you get the input lag associated with VSYNC throughout your experience, or only at the max refresh rate? A good example of a game with a radical amount of input lag with VSYNC would be Titanfall. I wonder how this reacts.
Which is exactly how Gsync works too. Only there you don’t have the option to disable Vsync, it’s always on.
I think what a lot of people are overlooking, or maybe just ignoring, is that these are all early models. Don’t think for a moment that these first models are what all future models will go by.
While I do want one of these VRR monitors, whether it is either flavor doesn't matter, I'm more interested in Gen 2 or 3 of them and just how much better those are as compared to what early adopters get, meaning I'm going to wait.
I think those 40-60 Hz monitors are actually using old display controllers that only had firmware updated to support FreeSync. AMD said half a year ago that a number of existing monitors could support FreeSync with only a firmware update, and they even had a demo of one such monitor at the time.
I expect actual purpose-built FreeSync-capable controllers designed after adoption of the standard will support much wider ranges than 40-60 Hz, much like that 30-144 Hz one already does.
hahahah “freesync” 40hz-60hz……Gsync 30hz-144hz
no brainer
You are so very wrong, did you even bother to watch the video? At about 1:10 into it he introduces the BenQ XL2730Z 2560×1440 running at 30hz – 144hz.
you are a no brainer!
sure did. “free”sync only goes down to 40hz and up to 75hz
no brainers. Don't get the monitor's refresh capabilities confused with "free"sync buddy
You are completely wrong h1tman. Freesync can go from 9Hz to 240Hz, as long as the monitor can.
LOL, I’m afraid you are the one confused, or just refuse to understand how both Gsync and freesync works.
Both of these technologies work in a wide range of refresh capabilities. What this means is that both techs are LIMITED to what the monitor is capable of doing. If the monitor maker makes their monitor to work only in the range of 30 – 60hz, 40 – 60hz, or even 30 – 144hz, then that is the range it is limited to and not some random range you came up with.
The three monitors that they looked at have the following ranges:
BenQ – 30 – 144hz
Samsung 40 – 60hz
LG – 40 – 60hz
What this means is that freesync will work in the range of 30 – 144hz on the BenQ monitor, and from 40 – 60hz on each of the other two monitors.
And furthermore, not all gsync monitors operate in the range of 30 – 144hz; there are some new 4K models that are limited to 30 – 60hz.
Seriously, you need to do better research before commenting.
Wrong. The LG actually has 40-75hz
Ryan was MISTAKEN!
hes a fucking troll, … why do you reply to a retard like that?
I find it surprising, in this implementation of FreeSync, how AMD chose to deal with the situation when the GPU is running at higher frame rates than the monitor supports. I expected they would use their already established DFRC (Dynamic Frame Rate Control) to choke the speed of the GPU down to the maximum rate of the monitor. This would allow for power savings, as well as possibly reduce fan and coil noise.
http://www.radeonpro.info/features/dynamic-frame-rate-control/
http://wccftech.com/amds-reveals-gpu-power-scaling-technology-dfrc-dynamic-frame-rate-control/
Instead AMD chose to use V-Sync to deal with high GPU frame rates, and I think that is a less than desirable choice from a heat, noise and latency perspective.
Personally, my advice to Radeon and FreeSync users is to disable the V-Sync option when using FreeSync, and use AMD's DFRC technology to simply cap their GPU to the maximum frequency their FreeSync monitor supports.
AMD might have chosen this method of leaving it up to the user due to the issues Gsync has with some older games or powerful enough setups that can exceed panel refresh rate. Gsync is vsync on all the time below or above refresh window.
Once you go above the G-Sync refresh rate you experience latency issues, so an FPS cap below panel max is suggested. With FreeSync the user has the option for that same experience with V-Sync on, or to run it with V-Sync off. Once you go past the panel threshold you won't experience latency.
You won't see much tearing because the monitor isn't showing every single frame at that point. Like G-Sync users experience a stutter step, the same will happen, but you will have the option of disabling the latency that comes with that stutter step if your system is exceeding the monitor's max refresh rate.
FreeSync, unlike the G-Sync module, won't be creating a backlog like Tom Petersen described when it goes beyond max monitor refresh, if you choose to run it with V-Sync off.
Ye, exactly.
LG is a max 75hz screen. Download rivatuner and put up a global fps cap of 74fps.
And you WILL NEVER encounter V-Sync. You will have 100% lag/stutter/tearing and claustrophobia (since it's 21:9) free gaming for EVER.
If your FPS drops below 40 then you won't see flicker like on G-Sync, but you will just see massive stutters.
But let's be honest… fast FPS dips we can deal with, but no one really should play below 40 FPS.
I also suspect that the variable refresh range of the panels is determined by the integrity the manufacturers want to provide. The more you overdrive a panel to get lower response time, the more you screw up backlighting and color integrity.
Gamers like TN panels because they don’t care much for image or color quality in a monitor.
You can see and hear JJ explain it at 2:30+ in this video
https://www.youtube.com/watch?v=yTQMdsLFj8M
I think it's a smart move to have a certification process and leave the refresh rate window to the vendor. A Samsung or LG is less likely to want to provide an IPS or PLS panel with high overdrive, making the visual experience awful when it's active compared to when it's not. Whereas an ASUS, BenQ or Acer is more comfortable overdriving their panels higher to get lower response time at the cost of image/color quality.
When G-Sync is enabled, there is no V-Sync. It's disabled through the Nvidia control panel, and the card is throttled just enough to stay a few frames away from the refresh rate of the monitor, so you are permanently in the range of G-Sync. I'd know, I have an ROG Swift.
No you don’t.
The card isn't throttled; it's driver-capped 1 Hz below max. If you're going over that 144 Hz you're creating a backlog in the lookaside buffer Tom Petersen was talking about.
EXACTLY! I have the Swift and the Acer G-Sync monitor. G-Sync, true game changer forever!
Does I wants VHS or Beta, paper or plastic? I think I'll wait till the dust settles, and the monitor manufacturers get the kinks worked out of the DP standard, and V-Sync of the less costly variety. Gaming monitors should come with enough buffering memory to give the GPU hardware enough time to tweak things on the fly, while canning all this proprietary stuff. Let the FreeSync and G-Sync fight continue, but display monitors need to be GPU agnostic or things will just continue to drain gamers of more money. Start building monitors with PCI/other standard slots for the market, and let the GPU maker(s) supply their proprietary hardware in pluggable card form, but for sure at least have a no/little extra cost version of V-Sync built into all LCD displays and the DP standard that all GPUs have to be able to support, in addition to their proprietary stuff.
So fun to see chizow sweating, and all the lies he’s writing in stress 😀
I don't like either of these solutions. The interface to the display should not be proprietary. The refresh rate is a pretty basic part of the display standard, right? It seems to me that if you make panels self-refreshing, then most of these issues just go away. If the panel is designed to run at 144 Hz, then it is best for it to run at 144 Hz all of the time, from its own buffer. You would need 2 buffers or some other "clever" circuitry to avoid artifacts from partial frames though.
Since g-sync includes a buffer of some kind, why can’t it display at the panels preferred refresh rate all of the time? If the gpu is not producing frames fast enough, then g-sync seems to drop down to minimum frame rate refresh rather than the preferred or max refresh rate. Why? If you have the frame in a local buffer, it seems you could drive it at whatever refresh rate you want. A fully self refreshing panel seems like it would resolve the g-sync issue of having only a single DP input also. The buffer could be at the display side of the input circuitry, so it could take input from any source. The only difference would be that some sources (DP or whatever) could allow asynchronous/variable frame updates. It was my understanding that panel self refresh was desirable anyway for power savings in mobile; everything but the panel can be shut down when you are sitting at a static screen (like reading a web page).
You don’t seem to understand the issues.
For example, you say “if the GPU is not producing frames fast enough, then g-sync seems to drop down to a minimum…”
No.
When G-Sync is working the GPU outputs a new frame THEN the monitor displays that a very small time later (rather than at a fixed interval).
G-Sync doesn’t work properly below 30Hz for example due to a limitation in current panel technology which we won’t discuss.
There are several PROBLEMS which G-Sync fixes.
Screen tearing – happens if you can’t synch monitor and GPU (because normal monitors update at a FIXED interval)
LAG – VSYNC is used to fix screen tearing but this causes lag because there's a delay caused by buffering the GPU output to match the monitor's update cycle
STUTTER – happens if you have VSYNC on but are outputting below the refresh rate
*So you can fix screen tearing by enabling VSYNC but then get lag (or lag and stutter) or disable VSYNC but get screen tearing.
G-SYNC fixes all these issues AT THE SAME TIME. The only real issue is staying above 30FPS which isn’t a huge deal and even that will be fixed with newer PANELS (a panel limitation, not a G-Sync limitation).
**Above max?
I believe this is where G-SYNC is superior but it’s hard to confirm. It’s my understanding that G-Sync’s added fast memory as a “lookaside” buffer allows G-Sync to stay enabled when the GPU can output more than the monitor’s maximum.
Thus the monitor still updates ONLY as directed by the GPU software which keeps things SMOOTH. So you can basically stay capped at 60FPS on a 60Hz monitor this way.
FREE-SYNC however, as a limitation of the monitor hardware (no proprietary hardware with a lookaside buffer), seems forced to DISABLE the asynchronous method and go back to the normal fixed update by the monitor.
This is going to be a HUGE ISSUE for Free-Sync. So you have to stay in the RANGE to make it work (i.e. above 40FPS and below 60FPS). Not as big a deal for 30FPS up to 144FPS.
So basically G-SYNC seems to work almost perfectly and Free-Sync is problematic especially on 60Hz monitors.
(Worse is playing a game with 60FPS average on a 60Hz monitor. You’d keep going above and below the 60FPS mark meaning FREE-SYNC would turn on and off.)
Don’t flame me if this is incorrect, but I’ve done a lot of reading and I think it’s correct.
(Also not sure how AMD can fix this without introducing a proprietary scaler with lookaside buffer like NVidia’s solution.)
I think I understand the issues quite well. If you do not want tearing, then you can not modify or swap buffers in the middle of a scan. If you are running at 30 Hz on a display capable of 144 Hz, then I assume that the display will scan very quickly, and then just do nothing for the rest of the 1/30th of a second. If the gpu produces another frame, then it can display it right away on a gsync display, so the refresh rate is no longer 30 Hz. The display refresh rate is still in sync with the gpu refresh rate, it is just not a fixed value. If you go over 144 Hz, then some frames will get dropped to avoid tearing. This is the case for both display techniques. At 144 Hz, you might as well drop back into a vsync type mode. There isn’t much point in producing frames faster than they can be displayed since they will just be dropped.
My point is, gsync has a mode of operation where it will refresh the display from a buffered frame if a new one is not available (<30 Hz from gpu). Only doing this below 30 Hz seems to cause issues, especially on 144 Hz displays where the pixel value decays very fast (decays towards white rather than black as on a CRT). Since you have a buffer on board, why not run asynchronously and scan out of it at the display's maximum (or "preferred") refresh rate independent of what you are getting from the gpu? You cannot interrupt in the middle of a frame regardless of refresh rate without causing tearing, so this does not seem to cause much added latency. If you drop too low in frame rate, it will still turn into a slide show; 24 fps is about the minimum to maintain the illusion of motion. Scanning at maximum would still maintain image integrity though, since you are scanning fast enough to avoid decaying pixel values.
I don't know if anyone will read comments on this old a story. Your post does not answer my question at all. Gsync has the ability to scan out of the on-board buffer and therefore run asynchronously at <30 Hz. Why not just do this all of the time? Perhaps it makes low frame rates from the gpu more noticeable? Some other implementation detail? It doesn't seem like running asynchronously is a problem if you are running at a high refresh rate on the display side.
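To make the proposal in the comment above concrete, here is a small sketch of scanning out of a display-side buffer at a fixed rate and simply repeating the last completed frame when the GPU hasn't delivered a new one. This is only an illustration of the commenter's idea, with made-up names; it is not a description of how the G-Sync module actually behaves.

```python
# Sketch: a panel that always scans out at its preferred rate from a local
# buffer, repeating the previous frame whenever no new frame has arrived.
class PanelBuffer:
    def __init__(self):
        self.last_frame = None          # most recently completed frame

    def on_new_frame(self, frame):
        self.last_frame = frame         # GPU delivers frames whenever they finish

    def scanout(self):
        # Called once per fixed refresh interval (e.g. every 1/144 s):
        # always repaint, reusing the old frame if nothing new arrived.
        return self.last_frame

panel = PanelBuffer()
panel.on_new_frame("frame 1")
print(panel.scanout())   # frame 1 shown
print(panel.scanout())   # no new frame yet, frame 1 repeated (no pixel decay)
```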
@PCPER crew, especially RYAN, there has been an awful lot of commenting and posting about the ways that people THINK FreeSync and G-Sync work. Could you guys please do an article fully explaining the similarities and differences between the technologies? I would have suggested just updating this article or a similar one, but this is a big topic that really needs to be fleshed out so everyone else will stop spouting misinformation.
When you do the article, it would be most helpful to include the following bullet points:
* what happens when either ‘sync’ goes below or above the panel’s VRR window. Include information on exactly how vsync is used(or not used) by either ‘sync’ tech when going above or below the VRR window of that monitor.
* explain how either ‘sync’ is limited to the refresh rate range of the monitor, and that just because a monitor is made to operate from 30 – 60hz, this in no way means that all ‘sync’ monitors are limited to that, just that particular monitor model is.
* explain the actual range each ‘sync’ can go from and to, i.e., freesync is limited to 9hz to 240hz
*etc
I don’t really want any random commenter to reply to this with what they THINK they know, the idea here is to have a reliable real source, which is why we come to PCPer in the first place, right?
Freesync superior to Gsync nuff said
Uh, no. GSYNC is far superior in every way. But don’t you worry. You can go ahead and buy an inferior crappy FreeSync monitor and enjoy all of the worst default screen tearing and stuttering. LMAO!!