It’s more than just a branding issue
Combine Ryan, Allyn and an analog oscilloscope – what do you get? SCIENCE!
As a part of my look at the first wave of AMD FreeSync monitors hitting the market, I wrote an analysis of how the competing technologies of FreeSync and G-Sync differ from one another. It was a complex topic that I tried to cover as succinctly as possible, given the time constraints and the fact that the article was focused on FreeSync specifically. I'm going to include a portion of that discussion here, to recap:
First, we need to look inside the VRR window, the zone in which the monitor and AMD claim that variable refresh should work without tearing and without stutter. On the LG 34UM67, for example, that range is 48-75 Hz, so frame rates between 48 FPS and 75 FPS should be smooth. Next we want to look above the window, at frame rates above the 75 Hz maximum refresh rate of the window. Finally, and maybe most importantly, we need to look below the window, at frame rates under the minimum rated variable refresh target; in this example that would be 48 FPS.
AMD FreeSync offers more flexibility for the gamer than G-Sync around this VRR window. Both above and below the variable refresh area, AMD allows gamers to continue to select a VSync enabled or disabled setting. That setting will behave just as it does today whenever your game's frame rate extends outside the VRR window. So, for our 34UM67 monitor example, if your game is capable of rendering at a frame rate of 85 FPS then you will either see tearing on your screen (if you have VSync disabled) or you will get a static frame rate of 75 FPS, matching the top refresh rate of the panel itself. If your game is rendering at 40 FPS, lower than the bottom of the VRR window, then you will again see either tearing (with VSync off) or the potential for stutter and hitching (with VSync on).
But what happens with this FreeSync monitor and a theoretical G-Sync monitor below the window? AMD’s implementation means that you get the option of disabling or enabling VSync. For the 34UM67, as soon as your game frame rate drops under 48 FPS you will either see tearing on your screen or you will begin to see hints of stutter and judder as the typical (and previously mentioned) VSync concerns rear their heads again. At lower frame rates (below the window) these artifacts will actually impact your gaming experience much more dramatically than at higher frame rates (above the window).
G-Sync treats this “below the window” scenario very differently. Rather than reverting to VSync on or off, the module in the G-Sync display is responsible for auto-refreshing the screen when the frame rate dips below the minimum refresh of the panel, which would otherwise be affected by flicker. So, on a 30-144 Hz G-Sync monitor, we have measured that when the frame rate gets down to 29 FPS, the display is actually refreshing at 58 Hz, with each frame “drawn” one extra time to avoid pixel flicker while still maintaining tear-free and stutter-free animation. If the frame rate dips to 25 FPS, then the screen draws at 50 Hz. If the frame rate drops to something more extreme like 14 FPS, we actually see the module drawing each frame four times, taking the refresh rate back up to 56 Hz. It’s a clever trick that preserves the goals of VRR and prevents a degradation of the gaming experience. But this method requires a local frame buffer and logic on the display controller to work; hence the current implementation in a dedicated G-Sync module.
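To make that behavior concrete, here is a minimal sketch of the frame-multiplication idea in Python. It is not NVIDIA's actual algorithm (which isn't public); the TARGET_MIN_HZ threshold is an assumption chosen so the output reproduces the rates reported above.

```python
# Minimal sketch of frame multiplication, NOT NVIDIA's actual algorithm.
# TARGET_MIN_HZ is a hypothetical threshold chosen so the output matches the
# measurements above: 29 FPS -> 58 Hz, 25 FPS -> 50 Hz, 14 FPS -> 56 Hz.

PANEL_MAX_HZ = 144
TARGET_MIN_HZ = 48        # assumed lower bound the module tries to stay above

def gsync_like_refresh(frame_rate_fps: float) -> float:
    """Redraw each frame enough times to keep the physical refresh above
    TARGET_MIN_HZ without exceeding the panel's maximum."""
    multiplier = 1
    while frame_rate_fps * multiplier < TARGET_MIN_HZ:
        multiplier += 1                      # draw the same frame once more
    return min(frame_rate_fps * multiplier, PANEL_MAX_HZ)

for fps in (29, 25, 14):
    print(f"{fps} FPS -> {gsync_like_refresh(fps):.0f} Hz")
```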
As you can see, the topic is complicated. So Allyn and I (and an aging analog oscilloscope) decided to take it upon ourselves to understand and explain the implementation differences with the help of some science. The heart of this story is in the video below, though I have some visual aids embedded after it.
Still not clear on what this means for frame rates and refresh rates on current FreeSync and G-Sync monitors? Maybe this will help.
This graph shows typical (and most popular) 40-144 Hz panel implementations of each technology and the relationship between frame rate and refresh rate. The bottom axis shows the game's frame output rate, what would be reported by a program like Fraps. You can see that from ~40 FPS to 144 FPS, both technologies offer pure variable frame rate implementations where the refresh rate of the screen matches the game's frame rate. The quality and experience between the two technologies here are basically identical (and awesome). Above 144 FPS, both will go into a V-Sync state (as long as V-Sync is enabled on the FreeSync panel).
Below that 40 FPS mark, though, things shift. The red line shows how AMD's FreeSync and Adaptive Sync work: the refresh rate stays static at 40 Hz even as the frame rate dips below 40 FPS, to 35 FPS, 30 FPS, etc. G-Sync works differently, doubling the refresh rate and inserting duplicate frames starting at around the 37 FPS mark. This continues until the game frame rate hits roughly 19-20 FPS, where a third frame is inserted and the refresh rate is increased again. The result is the dashed line representing the effective experience of G-Sync.
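If it helps, here is a short sketch that generates the same relationship the graph shows, for a hypothetical 40-144 Hz panel. The window edges and the repeat logic are simplifications of the behavior described above, not published specifications.

```python
# Sketch of refresh rate vs. frame rate for a hypothetical 40-144 Hz panel.
# The FreeSync line pins to the window edges; the G-Sync line repeats frames
# below the window. Thresholds are assumptions, not published values.

PANEL_MIN, PANEL_MAX = 40, 144

def freesync_refresh(fps):
    # Inside the window: 1:1. Outside: refresh pins to the nearest window edge.
    return max(PANEL_MIN, min(fps, PANEL_MAX))

def gsync_refresh(fps):
    # Repeat each frame until the physical refresh is back inside the window.
    m = 1
    while fps * m < PANEL_MIN:
        m += 1
    return min(fps * m, PANEL_MAX)

for fps in (20, 30, 37, 50, 100, 144):
    print(f"{fps:>3} FPS  FreeSync {freesync_refresh(fps):>5.1f} Hz   "
          f"G-Sync {gsync_refresh(fps):>5.1f} Hz")
```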
Zoomed in on the area of interest, you get a better view of how G-Sync and FreeSync differ. Effectively, G-Sync has no bottom window for variable refresh and produces the same result as if the display technology itself were capable of going to lower refresh rates without artifacting or flickering. It is possible that in the future, as display technologies improve, this kind of frame doubling algorithm will become unnecessary, but until we find a way to reduce screen flicker at low refresh rates, NVIDIA's G-Sync VRR implementation will have the edge in this scenario.
As we discuss in the video, it is possible that AMD could implement a similar algorithm for FreeSync at the driver level, without the need for an external module. A Radeon GPU knows what frame rate it is rendering at, and it could send out a duplicate frame at a higher rate to trick the display and achieve the same effect. It would require a great deal of cooperation between the panel vendors and AMD, however, as each new monitor release would need a corresponding driver or profile update to go along with it. That hasn't been AMD's strong suit in the past several years, so it would require a strong commitment from them.
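As a purely illustrative sketch of what that driver-level approach might look like: the frame_ready() and present_frame() calls below are hypothetical stand-ins, not a real driver API, and the loop simply re-sends the last completed frame whenever the game has not produced a new one within the panel's longest allowed frame time.

```python
# Illustrative sketch only: re-present the previous frame from GPU memory when
# no new frame arrives in time, keeping the panel above its minimum refresh.
# frame_ready() and present_frame() are hypothetical stand-ins.

import time

PANEL_MIN_HZ = 48                        # e.g. the LG 34UM67's lower VRR bound
MAX_FRAME_INTERVAL = 1.0 / PANEL_MIN_HZ  # longest the panel may go between flips

def vrr_present_loop(frame_ready, present_frame):
    last_frame = None
    last_flip = time.monotonic()
    while True:
        new_frame = frame_ready()            # non-blocking: a frame or None
        now = time.monotonic()
        if new_frame is not None:
            last_frame, last_flip = new_frame, now
            present_frame(new_frame)         # normal variable-refresh flip
        elif last_frame is not None and now - last_flip >= MAX_FRAME_INTERVAL:
            last_flip = now
            present_frame(last_frame)        # repeat the old frame
        time.sleep(0.001)                    # avoid a busy loop in this sketch
```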
It's also important to note that the experience below the VRR window on a FreeSync panel today is actually worse in practice than in theory. Because the refresh rate stays at 40 Hz when your frame rates are low, you get a combination of stutter and frame tearing (if V-Sync is off) that is worse than if the refresh rate were higher, say at 60 Hz or even 144 Hz. No doubt complications would arise from an instantaneous refresh rate shift from ~40 Hz to 144 Hz, but some middle ground likely exists that FreeSync could use to improve the low-FPS experience.
I hope you found this story (and the video!) informative and interesting; we spent a lot of time gathering the data and figuring out how best to present it. Please leave us feedback in the comments and we will try to answer as many questions as we can.
Thanks for reading!
Aging analog oscilloscopes
Aging analog oscilloscopes are the best kind.
Just like people, right? 🙂
To complement G-Sync we need
To complement G-Sync we need the (see below) feature that reduces clock speeds when frame rates go above a user-selectable limit. This would greatly help in decreasing power consumption.
NVIDIA has such a feature on laptops. They really need to bring it over to desktops.
I think AMD should give users
I think AMD should give users an option so that when frame rates drop below the FreeSync window, the refresh rate of the monitor jumps to its maximum. It's not as good as what G-Sync does, but it's a lot better than what they are doing now.
More fine work Ryan and
More fine work Ryan and Allyn, it seems as if PCPer is once again leading the tech industry’s coverage on these cutting edge hot button topics. You guys are now my go-to for asking the hard questions and getting the answers that matter from vendors.
Looking forward to more testing on input lag/ghosting if you all find the time, thanks!
Also some ideas for you all,
Also, some ideas for you all, since you seem to be open to looking at this exciting new tech (VRR) from different angles. I suggested this on AT, but Jarred seems unwilling or unable to perform more detailed testing.
You all hinted at this as well in your video, and a lot of people seem to be dismissing the importance of a working VRR window in that 25-40 FPS range. People seem to forget that the range from 30-60 FPS was kind of that VSync no man's land where you had to choose between double and triple buffering or hit that steep 60-to-30 FPS cliff on 60 Hz panels. So naturally, VRR is immensely important here, especially since minimums are going to regularly drop into this range even if someone AVERAGES 60+ FPS.
It would be interesting for you to demonstrate 2560×1440 and 4K during a staged, moderately demanding test run and map out the % of time various graphics cards spend at various refresh rates, i.e. the % in each bucket (below VRR, in VRR, above VRR). The test suite can be broad strokes here, like (GTX 770/280), (290/780/970), (290X/780Ti/980), (Titan X), with 1 or 2 in each bucket to get a wide sample. I think this would be a real eye-opener for many, showing just how often they would experience jarring drops out of the VRR window.
Also, I am not so sure AMD can just cure this problem via the driver; let's not forget they invented a spec that relies on VBlank, and creative driver solutions would undoubtedly throw a wrench in all of this and may very well mean violating their own spec. The various VRR scalers may also pose a limitation here, so I guess we will have to wait and see what AMD's response is, but I don't think the fix will be quick, or easy, and certainly not Free. 🙂
I think you guys make a
I think you guys make a fundamental mistake in saying that it is FreeSync's fault that tearing happens below X FPS.
It's not FreeSync's fault but that of the maker of the video decoder chip in the monitor, which does not do the same job the G-Sync module does.
So it's probably something a new model of monitor video decoder chip could fix in new monitors, as it's a monitor hardware problem and not a FreeSync signal problem.
GJ on creating that refresh
GJ on creating that refresh monitoring setup. What sensor do you have attached to the monitor?
I would like if you done
I would like it if you tested input lag at these low frame rates. Doesn't adding these extra frames add more input lag?
Just like triple buffering, for example.
You’re already down in the
You're already down in the region where the next frame isn't ready, so there is no frame with newer input available. The refresh rate is adaptively timed so that when it thinks a new frame should be available, it is – just after two refresh cycles rather than one.
Good analysis. If AMD
Good analysis. If AMD specifies that the maximum-to-minimum refresh rate ratio for FreeSync monitors must be at least 2:1, a driver solution could work just as well as G-Sync at lower cost.
Though even with the LG monitor and its 48-75 Hz range, only frame rates between 38 and 47 FPS are going to be a problem (see the quick check below). Below that the driver can frame-double; above that the monitor can accept it natively.
So even with existing monitors a driver update can fix the worst cases.
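A quick sanity check of that 38-47 claim, under the assumption that a driver could repeat frames any whole number of times on a 48-75 Hz panel like the LG's. With tripling and beyond allowed, everything below 38 FPS can be rescued, so only 38-47 FPS remains out of reach, which matches the comment.

```python
# Which frame rates below a 48-75 Hz window cannot be rescued by repeating
# frames some whole number of times? (Simple model, not a vendor algorithm.)

WINDOW_MIN, WINDOW_MAX = 48, 75

def fixable(fps):
    m = 2
    while m * fps <= WINDOW_MAX:
        if m * fps >= WINDOW_MIN:
            return True                # m repeats land us back inside the window
        m += 1
    return False

unfixable = [fps for fps in range(1, WINDOW_MIN) if not fixable(fps)]
print(unfixable)                       # -> [38, 39, 40, 41, 42, 43, 44, 45, 46, 47]
```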
Except the drive CAN’T just
Except the driver CAN'T just double frame rates; that's what the local buffer on the G-Sync module does, y'know, the same hardware AMD wondered why NVIDIA needed.
Bottomline is Nvidia already did a lot of work and answered a lot of questions and came up with solutions with their G-Sync module.
It does not appear AMD asked and answered these same questions when haphazardly throwing together their FreeSync solution, which is why they end up with the inferior solution.
They can. All the GPU needs
They can. All the GPU needs to do is send another refresh, as if it had a new frame. And with the information available at the driver level, they can do the prediction even better than G-Sync.
I am pretty sure nVidia will internally have the solution sooner than AMD :D.
As a downside, unlike G-Sync, it will have a small performance penalty, as gfx card memory needs to be read. Given the memory bandwidth of current GPUs, the impact would likely be pretty minor, though.
This is why i like this site
This is why I like this site over others: you guys go the extra mile.
Amen
As usual, amazing stuff.
As usual, amazing stuff. Making the whole PC hardware industry move forward.
I can’t wait to see alot more
I can't wait to see a lot more updates throughout the year on this topic: driver updates, AMD 300 series GPUs, a wider selection of monitors. Then we can see where the true problem lies.
As they always say it’s much
As they always say, it's much easier said than done. Knowing AMD's track record of rather infrequent driver releases over the last decade relative to their competitor's, I wouldn't get my hopes up too high for them to suddenly start releasing frequent driver updates. G-Sync monitors may always cost slightly more than FreeSync monitors, but you get what you pay for. I think this video helps to prove that G-Sync is still the superior technology if you take cost out of the equation. Great work, guys, on this article/video; clearly a cut above most other tech sites such as Guru3D, TechSpot, etc.
decade!? amd released at
Decade!? AMD released at least one driver a month for a decade, until April 2012, while NVIDIA has had stretches of a few months without a driver (like AMD now) within just the last 3 years.
skip the beta, they haven’t
Skipping the betas, they haven't released a driver since December.
Dude, AMD releases Drivers
Dude, AMD releases drivers every month, and thank god too. I fricking hate the NVIDIA control center popping up and telling me I have an update to play some crap game with the most FPS.
“G-sync monitors may always cost slightly more than Freesync monitors”
Slightly? How about a minimum of $200. You can go and buy an AMD card with the money you save from not buying into G-Sync. It’s an added bonus that the Free-Sync monitor isn’t locked to only an AMD card as well.
AMD ditched their monthly
AMD ditched their monthly driver release ball and chain some time ago, probably for the best since it didn’t really indicate you were getting anything other than whatever was ready on Friday at 4pm PST on the last work day of the month.
There is a premium to G-Sync, but as we can see from PCPer's work on it, the difference is justified because G-Sync actually does what it sets out to do, while FreeSync falls short in a number of areas. You get what you pay for.
Also, for any Nvidia user who has a Kepler 600 series or newer, they don’t have to spend anything other than the G-Sync monitor itself to experience it.
Meanwhile, most existing AMD users will have to go out and buy a new GCN1.1 capable card (R9 285, R9 290/X, bonaire) or a handful of underpowered APUs, so yeah, cost savings there for Nvidia users and an additional expense tacked on for any AMD fan who doesn’t already have one of those cards.
Can you test this on the
Can you test this on the laptop that could have G-Sync forced on without the module? Does it get the motion blur?
Doing this would give more complete info, as you will have a comparison of what the panel is doing with and without the module, thus better showing what that module contributes.
We don’t have that laptop
We don't have that laptop anymore…
Ryan- can you run a test on
Ryan, can you run a test on the frame time for both G-Sync and FreeSync below the minimum frame threshold? While NVIDIA is doubling or quadrupling the frame, wouldn't it also be increasing the frame time? Like it wouldn't be a smooth experience?
The extra redraws are
The extra redraws are invisible to the game and graphics card setup, so no.
if the redraws are done on
If the redraws are done on the GPU side, we would get less latency, right?
In other words, would my mouse input get more responsive?
Thanks
When FPS is that low, no it
When FPS is that low, no, it wouldn't. The G-Sync module doubles the refresh rate of the monitor, so input latency at 20 FPS on G-Sync would be the same as on FreeSync.
As they stated, the reason the panel does that is to prevent things like flickering and even damage to the panel.
Visual artifacts due to no
Visual artifacts due to not updating the panel often enough are one thing, but damaging the panel, that's just ridiculous.
at for example 20fps, do you
At, for example, 20 FPS, do you really care about latency at that point? The frame rate is too low, and the high ms per frame IS latency.
All it's doing is raising the refresh rate within the sync-enabled min-max range, which is the most logical thing to do if you're trying not to tear.
Well console gamers have
Well, console gamers have enjoyed 20-30 FPS games for the last few years, so it must be somewhat tolerable 😀
As long as the display has a
As long as the display has a maximum refresh rate of at least double its minimum—which is true for the existing TN panels, but not the IPS ones—I see no reason that AMD shouldn’t be able to solve this minimum framerate problem in software.
I don’t think that they would need monitor-specific driver updates either, since the driver should already be aware of the maximum and minimum refresh rates as soon as you connect the display.
At the same time, I am not particularly concerned about the minimum refresh rates on the existing panels. Gaming below 50 FPS is still a bad experience even with G-Sync, so this seems like more of a theoretical problem to me.
And it is worth pointing out that doubling frames can in fact result in stutter if your display has anything less than 100% persistence.
It could prove to be a problem on displays which use PWM-controlled backlights for example.
A good demonstration of this would be comparing 60 FPS at 120Hz on a flicker-free monitor, and 60 FPS at 120Hz with ULMB enabled.
On the flicker-free display, 60 FPS @ 120 Hz should be perfectly smooth—though there will be a lot of motion blur due to the high persistence.
ULMB will greatly reduce the motion blur, but those repeated frames will cause awful judder and double-images to appear.
This is not the fault of ULMB though. If you display 60 FPS at 60 Hz on a strobed display, or even an old CRT, you will also eliminate the motion blur, but it won’t judder. The downside to this is that low persistence at only 60Hz will flicker a lot.
As for the overdriving issues, I suspect that is up to the panel manufacturer more than anything else. Existing electronics are probably only tuned for a single refresh rate. Updated electronics would probably do a better job with this.
But really, the main problem there is that we’re still using crappy LCD panels with 100% persistence, which all require some degree of overdriving to achieve decent response times. Even on the best displays, you get a lot of ugly artifacting when overdriving is enabled. What I’m hoping to see is the next generation of OLED displays from LG adding Adaptive-Sync support. I will be buying one of those the instant they become available if that happens.
I think that every g-syn
I think that every G-Sync display is flicker-free automatically. At least every single one tested so far was.
LCDs are pretty shit
LCDs are pretty shit really.
OLED brings the promise of a TRUE fps=Hz display that can go as low as 0 Hz, if the FPS really drops below 1 FPS (in load screens or something).
OLED is good enough to HOLD an image at 0 Hz, while on an LCD it would degrade instantly. As far as I know.
Then you could get a G-Sync display that refreshes only once for each frame. A 100% fps=Hz, 100% flicker-free panel. With a max cap still, but we're fine with that 🙂
No offense, but most of this
No offense, but most of this post is rubbish. Have you used a G-Sync monitor long enough to back the claim that “Gaming below 50 FPS is still a bad experience even with G-Sync, so this seems like more of a theoretical problem to me”? Because every single review of the Acer 4K G-Sync panel would disagree with you: it is limited to 60 Hz, most graphics cards and solutions of that time (and even now) were having a hard time pushing anywhere close to a 60 FPS cap, and it was still a splendid experience by all accounts.
The MAIN benefit of G-Sync or any VRR solution *SHOULD* be its ability to reduce input lag, eliminate tearing and stutter at low FPS, and as an ROG Swift owner I can attest to the fact the G-Sync solutions do this wonderfully even at low FPS in the 20s. There is no sense of stutter, tearing or input lag, just low FPS.
In any case, Nvidia solves this low FPS problem by having low latency local memory on the G-Sync module that acts as a frame cache and lookaside buffer, which simply repeats frames as needed until the next live frame arrives. AMD may be able to do similar by allocating some local memory on their graphics card, but this will undoubtedly be higher latency than having that frame buffer local on the monitor itself.
I don’t care if this post is
I don't care if this post is years old. I keep seeing it all over the internet. Stop this meme. VRR doesn't make low FPS look any better. All VRR does is allow you to run vsync but without input lag and at variable instantaneous frame rates. 50 FPS will look exactly as good at 50 Hz with vsync. The only truth behind this meme is that 50 FPS at 60 Hz with vsync will cause the frame rate to halve, while with VRR it will stay at 50. But that's only because you're misusing vsync in a situation where you can't reach FPS=Hz.
LG OLED TV’s suffer from
LG OLED TVs suffer from motion blur as well. It's a by-product of the “sample and hold” technique that's naturally used in both LCDs and OLEDs. There is, as far as I know, no way to get around this problem today without inserting a black frame between the actual pictures being displayed. And at this time, there is no solution for using that combined with either G-Sync or FreeSync. It's just too complicated to make that work when the frame rate varies all the time. The screen would constantly shift in brightness if you did.
The only site the made AMD
The only site that made AMD fix its CrossFire problem by demonstrating the problem. PCPer has done it again, showing the problem with high-FPS G-Sync and low-FPS FreeSync.
Keep on keeping them honest bros!
yup, and the only kinda
Yup, and the only kind of thanks AMD fans will give them is verbal attacks over it.
Yup…. sad state of affairs.
Haha yes it makes you wonder,
Haha, yes, it makes you wonder; it is as if AMD fans DON'T want AMD to fix these problems and produce better solutions for them.
If it takes “hostile” press like PCPer and some NVIDIA fans to force AMD to fix their product, so be it; at least it will keep AMD fans who claim their solution is just as good honest in the meantime.
In the video you state that
In the video you state that AMD could try to replicate similar behaviour in the drivers for the lower gap. The problem with that is that if you send the repeated frame at the same moment you finish the next frame, you will introduce stutter, because you will have to wait for the repeated frame to go completely to the monitor before freeing that framebuffer and starting the next frame. The only way to fix that is by using triple buffering, but that will increase memory usage, which is not good.
If that precise thing
If that precise thing happens, you're right, you do have to wait for that redraw on the screen to finish first. But if you are smart about the implementation, you should be able to predict to a reasonable degree when the next frame will finish. A good algorithm will never see more than a 30-40% delay compared to the current frame time. That's a lot, but it should be very rare. GPUs are very good at knowing how long it will be until the next frame is ready.
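A hedged sketch of the kind of prediction described here: a simple running estimate of frame time plus a check that a repeat refresh can finish before the next real frame is expected. The class, names, and 0.25 smoothing factor are illustrative assumptions, not AMD's or NVIDIA's actual logic.

```python
# Illustrative frame-time prediction sketch; names and constants are assumed.

class FrameTimePredictor:
    def __init__(self, smoothing: float = 0.25):
        self.smoothing = smoothing
        self.estimate = None                 # estimated seconds per frame

    def observe(self, frame_time: float) -> None:
        """Feed in the measured render time of the last frame."""
        if self.estimate is None:
            self.estimate = frame_time
        else:
            self.estimate += self.smoothing * (frame_time - self.estimate)

    def should_repeat_now(self, since_last_flip: float, scanout_time: float) -> bool:
        """Repeat the old frame only if a full redraw should finish before the
        next real frame is predicted to arrive."""
        if self.estimate is None:
            return False
        return since_last_flip + scanout_time < self.estimate

# Example: a ~25 FPS game on a panel that takes ~7 ms to scan out a frame.
p = FrameTimePredictor()
for ft in (0.040, 0.041, 0.039, 0.040):
    p.observe(ft)
print(p.should_repeat_now(since_last_flip=0.020, scanout_time=0.007))  # True
```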
hum… It didn’t occur to me
Hum… It didn't occur to me that AMD could do such prediction. Still, it would be tricky; i.e., if you predicted that you could finish a frame at about the same time the next image goes up and you miss that time window, you'll have stutter. In G-Sync that will never be a problem, since the previous frame is stored in the module's buffer, so I tend to believe that FreeSync will never be as good as G-Sync even with major driver tweaks, IMHO.
So would the previous frame
So the previous frame would be stored in the graphics card's VRAM in the case of AMD doing it in software.
In Nvidia’s case, two components along the display chain have memory that is able to fit and retain the previous frame: VRAM and the G-Sync module RAM. The G-Sync module does more than just retain the last frame; however, why would Nvidia pay to include RAM on the module instead of storing that frame in VRAM?
Plus, the minimum refresh rate below which the graphics driver would take care of multiple refreshes per frame could be a certain margin above the minimum refresh rate reported by the display, as long as double that threshold is still below the maximum refresh rate. I don't see why AMD would have to implement a different solution for every single monitor that gets released with Adaptive Sync support (and FreeSync certification) if most or all monitors would exhibit the same issue, i.e. when going below the minimum reported refresh rate plus a safety margin.
What exactly do you think
What exactly do you think NVIDIA's G-Sync module does? It has to take a guess and decide whether to start a redraw from the on-board buffer, or wait for the next frame to be delivered. It could suffer from the same issues if the next frame is earlier than expected; it could be in the middle of a refresh which cannot be interrupted without tearing. This isn't a big issue because the refresh doesn't take very long for something like a 144 Hz panel.
No, the G-Sync doesn’t need
No, the G-Sync module doesn't need to guess, because it is taking commands directly from the GPU, which lets it know when the next refresh is available, while repeating frames when a new refresh is not. This was all covered years ago when G-Sync first launched: the bi-directional communication AMD claimed they did not need, not understanding why NVIDIA needed an expensive FPGA module for it.
I guess now we have a better understanding why Nvidia chose the route they did! 🙂
I’m guessing two other
I’m guessing two other solutions would be:
1) The graphics driver refrains from rendering the next frames after such frame rate dips any faster than 1/x frame time, where x is slightly higher than the current frame rate. That is, render the frame, then hold it if it has finished early. The following frames will be rendered with a shifted timeframe, and object positioning would certainly be more accurate than with a sharp frame time reduction (think Sleeping Dogs' rolling average). It's certainly better than allowing stutter to occur, replacing it with a frame rate ramp (sketched below). Given that variable refresh rate technology allows highly variable FPS to be smoothed out, the ramp doesn't have to be that slow.
2) Same as 1, but the game takes care of it. The game engine has to be aware that it's running on a variable refresh rate monitor via the driver reporting such a feature as enabled or disabled.
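A small sketch of the ramp idea in option 1: cap how quickly the presented frame interval may speed back up, holding frames that finish early. The 10% cap is an illustrative assumption, and only the speed-up direction can be ramped, since a late frame cannot be shown before it is rendered.

```python
# Illustrative frame-pacing ramp; the 10% per-frame cap is an assumption.

MAX_STEP = 0.10   # presented interval may shrink by at most 10% per frame

def ramped_intervals(raw_intervals):
    """Given raw render times (seconds per frame), return presentation
    intervals that speed back up gradually by holding early frames."""
    out = []
    prev = raw_intervals[0]
    for raw in raw_intervals:
        fastest_allowed = prev * (1 - MAX_STEP)  # limit how fast we speed up
        presented = max(raw, fastest_allowed)    # hold a frame that finished early
        out.append(presented)
        prev = presented
    return out

# A sudden dip from 60 FPS to 30 FPS and back:
raw = [1/60] * 3 + [1/30] * 3 + [1/60] * 3
print([round(1/i) for i in ramped_intervals(raw)])
# -> [60, 60, 60, 30, 30, 30, 33, 37, 41]  (gradual recovery instead of a jump)
```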
G-sync module implements
The G-Sync module implements the t-buffer (triple buffering) in hardware, so that's another reason to appreciate NVIDIA's hardware solution: not only better low FPS, but better responsiveness.
If I got them right, though,
If I got them right, though, it IS essentially triple buffering. The G-Sync module has built-in memory that's being used for this particular type of caching only, it looks like. How much, I don't know, but it looks like it's still enough to at least perform a full-blown triple buffering job.
As for FreeSync… there's no physical module, so it really boils down to three roads they could take: use a cache in system memory, use a cache in the video card's memory, or… hope some monitor manufacturer puts more cache in their panels by default.
Hi!
So far that we have an
Hi!
So far the conclusion is that a driver may be able to fix this issue for AMD. Well, I say "may" because so many other factors come into play at the same time: brightness and so on. The monitor manufacturers have to invest more to step out of this shoe, not just AMD with its driver. It sounds simple to fix the whole issue, but it is more complicated at the other end.
For AMD to implement a
For AMD to implement a similar technology, the panel would need to have a relatively wide range of usable refresh rates. The panel would need to be able to (at least) double its minimum refresh rate.
There are panels released already that have too narrow of a range for this to work.
For example, the LG that Linus just reviewed (34UM67) has a refresh window of 48-75 Hz. If it dips below 48 Hz, it would have to be able to double 47 Hz (94 Hz), which it can’t.
I wonder why the LG has such a high minimum refresh rate…
Also don’t forget guys. What
Also don't forget, guys, what else this could lead to that might fix some of the issues.
AMD mentioned that Freesync
AMD mentioned that FreeSync will also support video playback and the power-saving features that were originally built for laptops. Would playing videos at, let's say, 24 FPS cause it to also judder and tear if the refresh window is 40 to 144 Hz?
That is correct. If anyone
That is correct. If anyone were to write a full-screen video player app that behaved like a game engine and drew at the native video frame rate (24 FPS), all FreeSync panels would tear or judder, while G-Sync panels would simply frame-double and remain at an effective 24 FPS, without tearing or judder.
What about working in
What about working at different resolutions?
Can either one work outside full-screen mode?
Do you always have to be at the monitor's native resolution for it to work?
Outside of full screen gaming
Outside of full screen gaming modes, the display reverts to the Windows setting.
Which display? The gsync,
Which display? The G-Sync, the FreeSync, or both?
What about outside the native resolution?
Both displays would revert to
Both displays would revert to the native Windows refresh rate. As for resolution, that shouldn't matter as long as the game is full screen without borders.
Do either Gsync or FreeSync
Does either a G-Sync or a FreeSync display work in full-screen mode at resolutions other than the native panel resolution?
Fixed refresh source material
Fixed-refresh source material shouldn't be a big issue if the coder were aware of the limitations; i.e., the player could just double the frames and present a 48 FPS video, effectively doing in software what G-Sync does. A 120 Hz panel should be able to play 24 FPS material as intended by replaying the same image 5 times. However, variable source material like games is a different story, since you need to predict the time to the next frame.
If a player was doubling
If a player were doubling frames, the LG 34UM67's VRR minimum of 48 Hz would enable smooth playback, I would think.
Did you guys contact Benq for
Did you guys contact BenQ for the specs?
In the video you said they are the same, but as people mentioned in the comments on the last FreeSync article, they aren't. Googling the specs for the models shows they aren't.
Curious as to why Ryan and Allyn continue to say they are the same while monitor sites like TFTCentral have them listed as different models.
The resolution, max refresh,
The resolution, max refresh, panel type (TN), grey to grey response, are the same for both panels. They may not be the exact same part number, but for the purposes of comparison, they are close enough for us to consider them 'the same panel'.
That’s an issue because if
That's an issue because, if you frequent monitor sites, you know an advertised 1 ms GtG never translates to an actual 1 ms response time. The Asus Swift has a 6.9 ms GtG response time with OD off and 2.4 ms GtG with OD Extreme, with 10.5% RTC overshoot.
How can you say they are close enough without even testing such differences in the panels to begin with?
Good work guys!
I actually
Good work guys!
I actually think that the ghosting thing is a bigger issue than the below VRR thing.
Even though the animations will be a lot smoother at 35 FPS on those G-Sync monitors, I still wouldn't want to be that low when playing a game. I just lower a setting or two and keep the FPS above the minimum refresh rate. Hey, the extra dollars I save I could spend on a faster graphics card 🙂
It would be interesting to see if that new Asus 144 Hz IPS FreeSync monitor has the same ghosting problem as that BenQ you tested.
If you pause the video in the
If you pause the video in the previous FreeSync article in random spots, you will see the ROG Swift ghosting also. Or save the video and advance through it frame by frame and you will see the ROG Swift ghosting when the LG and BenQ aren't.
The exact opposite of what you see in the frame grab PCPer used.
This so called ghosting issue is a little blown out of proportion IMO.
They all do it.
That’s not ghosting. That’s
That's not ghosting. That's the high speed video catching (within a single captured video frame exposure) the actual scan of the Swift. Once the scan is done, there is only one blade, therefore no ghosting.
What if desktop displays
What if desktop displays start to support panel self refresh? It makes sense for saving power in mobile devices, but perhaps it can provide another approach to smoothing out sub-optimal frame rates.
The idea of just repeating the last frame when a new one isn’t ready seems obvious and simple. Panel self refresh could perhaps allow for the same sort of frame multiplication when frame rates drop below the true VRR window. The panel control logic should already know how to manage overdrive or ULMB parameters to keep brightness/flicker and ghosting in check. At some point, though it seems like eventually this will be effectively integrating all the functionality of a G-Sync module into the display control logic.
*EDIT* After thinking a bit more, I wonder why a panel with up to 144 Hz refresh rates, which also has some sort of frame rate multiplication capability, would ever bother running at refresh rates significantly below 1/2 the max refresh rate. For example, 24 FPS could be doubled and shown at 48 Hz to keep in the supported range. But it could just as easily be shown at 72 Hz, 96 Hz, 120 Hz, or even 144 Hz refresh rates. Shouldn't a higher effective refresh rate help reduce things like ghosting and flicker? I understand that it is important to avoid relatively large deltas in refresh rates, but there seems to be plenty of room in a 40-144 Hz range for more aggressive frame multiplication.
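A tiny sketch of that aggressive-multiplication idea for a hypothetical 40-144 Hz panel: repeat each frame as many times as will still fit under the panel's maximum refresh, rather than just enough to clear the minimum.

```python
# Pick the largest frame-repeat count that still fits under the panel maximum.

PANEL_MIN, PANEL_MAX = 40, 144

def max_multiple_refresh(fps):
    m = PANEL_MAX // fps                     # largest repeat count that fits
    return (fps * m, m) if fps * m >= PANEL_MIN else (None, None)

for fps in (24, 30, 36):
    hz, m = max_multiple_refresh(fps)
    print(f"{fps} FPS -> {m}x = {hz} Hz")    # 24 -> 144 Hz, 30 -> 120 Hz, 36 -> 144 Hz
```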
Interpolate
That is what
Interpolate.
That is what G-Sync is doing below 40 FPS, just like TVs do motion interpolation.
I suspect that's why people in forums and reviews notice a stutter and a jitter, what TV viewers call the soap opera effect.
Tom Petersen calls it
Tom Petersen calls it “Look-a-side” instead of motion or frame interpolation
https://www.youtube.com/watch?feature=player_embedded&v=Z_FzXxGVNi4#t=1481
I sincerely hope there is no
I sincerely hope there is no frame interpolation or motion estimation going on. I despise frame interpolation and turn it off on my TVs because of the soap opera effect. I have noticed it almost immediately on any TV that has this feature.
Frame multiplication is NOT the same as frame interpolation. Interpolation is injecting artificially created frames where none existed in the original content. To many people this is an unnatural artifact that creates an undesirable experience. Frame multiplication on the other hand only displays the actual frames of the content but repeats those frames to avoid judder from things like 3:2 pull down when frame rate and refresh rates are not in sync. For example, a 120Hz TV can repeat each frame of a 24FPS movie 5 times to match the native refresh rate of the TV without “creating” any new frames. Similarly, 30FPS and 60FPS content maps evenly by quadrupling and doubling frames respectively.
There is no interpolation. It
There is no interpolation. It redraws the exact same frame.
Which one is better upgrade
Which one is the better upgrade for a GeForce 660 owner: a G-Sync monitor or a GeForce 980?
GSYNC
Everyone talks about amd not
Everyone talks about AMD not having a driver update since Omega, but fails to mention they release several beta drivers between WHQL releases. WHQL means nothing, and I've had just as many problems with signed drivers as I have with beta drivers from both AMD and NVIDIA, which is very few. Also, readers are constantly reminded of AMD's failure to provide a working CrossFire driver for Far Cry 4, yet it's Ubisoft that was responsible for the broken CrossFire profiles. AMD stated that they had disabled CrossFire in Far Cry 4 due to issues on Ubisoft's end and would enable the profile when they fixed the issue; Ubisoft never once argued against that claim. Go figure, a GameWorks title with 6 patches containing fixes for NVIDIA SLI and not a single fix for CrossFire. That is a story PCPer needs to be digging into; I would read that article. To be fair, I don't know whether PCPer has ever brought this specific issue up or not, but it has been thrown around by every other tech site I know of. PCPer also gets their fair share of cheap shots in about drivers, though.
let’s talk 2014
for the whole
Let's talk 2014.
For the whole year, AMD only had 3 WHQL driver updates: 14.4, 14.9 and 14.12.
Yes, between those there were BETA drivers and RCs, but not even one per month to cover the whole year – and the problem with those is that they broke some stuff or weren't generally stable.
Now, imagine manufacturers release just 10 monitors in a month; on top of optimizing for specific game(s), AMD also needs to add those monitors in.
AFAIK AMD have no driver
AFAIK AMD had no driver release between Omega (14.12) and the current 15.3 beta. Before 15.3 came out, AMD had no CF profiles for games that came out in 2015, not just FC4.
In 2015 this is first driver
In 2015, this is the first driver since the Omega release at the end of 2014.