We have talked about G-Sync for what seems like years now and we got our first hands-on with AMD's FreeSync monitors earlier this week at CES, but the new ASUS MG279Q is in an interesting place: it is the first display that publicly supports Adaptive Sync and DP 1.2a+ but does not have an affiliation with either branded variable refresh rate technology. As it turns out though, that isn't bad news.
First, let's talk about the hardware. The screen is a 27-in 2560×1440 display with IPS panel technology and a maximum refresh rate of 120 Hz. High refresh rate IPS monitors are brand new and we are glad to see that ASUS is bringing one to the market so we can finally combine great color, great viewing angles and great refresh rates. The monitor supports DP 1.2a+ and Adaptive Sync which leads us too…
…the fact that this monitor will work with AMD Radeon graphics cards and operate at a variable refresh rate. After talking with AMD's Robert Hallock at the show, he confirmed that AMD will not have a whitelist/blacklist policy for FreeSync displays: as long as a monitor adheres to the DP 1.2a+ standard, it will operate in the variable refresh rate window defined by the display's EDID.
So, as described by the ASUS reps on hand, this panel will have a minimum refresh of around 40 Hz and a maximum of 120 Hz, leaving a sizeable window for variable refresh to work its magic.
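Since the variable refresh window is defined by the display's EDID, it can be read directly from the EDID's Display Range Limits descriptor (tag 0xFD). Below is a minimal sketch of that lookup; the byte values in the fake EDID are illustrative placeholders, not the MG279Q's actual EDID data.

```python
# Sketch: reading the min/max vertical refresh rates from a base EDID block.
# In EDID 1.3, the four 18-byte descriptor slots sit at offsets 54/72/90/108;
# a Display Range Limits descriptor starts 00 00 00 FD 00 and stores the
# minimum and maximum vertical rates (in Hz) in its next two bytes.

DESCRIPTOR_OFFSETS = (54, 72, 90, 108)

def refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the range-limits descriptor, or None."""
    for off in DESCRIPTOR_OFFSETS:
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]  # min and max vertical rate, in Hz
    return None

# Minimal fake EDID: zeros everywhere except one range-limits descriptor
# describing a 40-120 Hz panel like the one discussed here.
edid = bytearray(128)
edid[54:64] = bytes([0, 0, 0, 0xFD, 0, 40, 120, 30, 160, 17])
print(refresh_range(bytes(edid)))  # (40, 120)
```

A real driver would also validate the block checksum and handle EDID 1.4's offset flags, but the window itself comes from those two bytes.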
Even better? The price! ASUS said this panel will ship in late Q1 of this year for just $599!
PC Perspective's CES 2015 coverage is sponsored by Logitech.
Follow all of our coverage of the show at https://pcper.com/ces
Asus is probably afraid to call it FreeSync enabled or get it certified because Nvidia might hike up GPU/Gsync module prices on them.
Awesome monitor–checks all the right boxes for a freesync monitor– for a reasonable price. Thanks Asus.
I hope LG, Samsung, BenQ make a similar one and knock the price below $500.
Good time ahead.
so this one also has a minimum refresh rate of 40Hz for FreeSync to work? isn’t it that FreeSync can go as low as 9Hz?
FreeSync can support refresh rates as low as 9Hz, but clearly this LCD panel cannot. Expect this to be the norm. You will get various panels with various abilities. For example, FreeSync supports up to 240Hz, but there are no panels on sale that can achieve it yet.
I believe the minimum of the specification is 24hz. But it depends on the monitor (panel type, electronics, etc) how low it can actually go.
I wish they’d make a screen that will be 24-x Hz, cause Arma 3 likes to dip into the high 20s sometimes
Quick answer: AMD has stated, FOR NOW, that the maximum and minimum variable refresh rates will be determined by the monitor, not by the standard. Outside of those limits it’s the same old v-sync on-or-off question
When you drop the refresh rate below 30Hz (on some panels already in the high thirties) the display starts flickering, because for each refresh the backlight is turned off and on again. When it happens that slowly, your eye can actually see it as a distracting “buzz” on the screen. That can easily give you a headache and is way more problematic than tearing. So they decided to rather have tearing than flickering. 😉
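The behavior the comments above describe can be sketched as a simple decision: inside the panel's VRR window the refresh tracks the frame rate, and outside it the driver falls back to ordinary v-sync behavior. The function and names below are illustrative, not an actual driver API.

```python
# Illustrative sketch of the in-window vs. out-of-window behavior
# described above, using the MG279Q's reported 40-120 Hz window.

def pick_refresh(frame_rate_hz, window=(40, 120), vsync_on=True):
    lo, hi = window
    if lo <= frame_rate_hz <= hi:
        return ("variable", frame_rate_hz)   # refresh tracks the frame rate
    if frame_rate_hz > hi:
        # Above the window: cap at the max rate (v-sync on) or tear (off)
        return ("vsync", hi) if vsync_on else ("tearing", hi)
    # Below the window: the panel can't refresh that slowly without
    # visible backlight flicker, so it's back to plain v-sync on or off
    return ("vsync", lo) if vsync_on else ("tearing", lo)

print(pick_refresh(75))                    # ('variable', 75)
print(pick_refresh(28))                    # ('vsync', 40)
print(pick_refresh(150, vsync_on=False))   # ('tearing', 120)
```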
Can I use the variable refresh with my Nvidia GTX 980 video card ?
The card does support adaptive sync and has a display port.
The answer as to whether NVidia will support the Adaptive Sync standard that has been added to DP 1.2a has been a definitive no.
If the market “standardizes” around DP Adaptive Sync then G-Sync becomes yet another compelling, but proprietary niche technology like LightBoost, 3D vision, PhysX, CUDA, etc. Up to this point, nVidia has invested a considerable amount of effort to create an exclusive G-Sync ecosystem and they are going to keep on riding it (and cashing in) until they feel the need to publicly acknowledge and support DP Adaptive Sync. That being said, I would be shocked if nVidia doesn’t already have a plan in place for incorporating DP 1.2a+ support when they feel the time is right.
Asus appears to have been careful not to advertise DP Adaptive Sync as a feature; instead, they had to be asked about it directly. Is it going too far to guess that there are nVidia exclusivity agreements with the likes of Asus which may prevent them from openly marketing and endorsing AMD FreeSync for now? Since DP 1.2a+ is a VESA standard, the spec can be quietly included, just without specific mention of a competing variable refresh technology. This could be why they are conspicuously absent from the official list of vendors with “FreeSync certified” displays.
No, it does not support adaptive-sync. That was added to the DisplayPort specification in DP 1.2a. The GTX 980 only has DP 1.2. NVIDIA will also have to support it with their software even if they produce cards with DP 1.2a ports.
but to my knowledge adaptive sync is optional in the spec. nvidia can use DP 1.3 in the future but still leave out the adaptive sync aspect if they want to. i think it was the same with monitors. also, why does FreeSync need DP 1.2a? when they first demoed the concept on a laptop, DP 1.2a didn’t exist. why can’t FreeSync work on DP 1.2?
Laptops use eDP, not DP standards. eDP standards have more features than current DP standards have.
The laptops AMD showcased weren’t running over DisplayPort, they were using their built-in displays, which are connected via an internal connection called eDP (embedded DisplayPort).
Adaptive-Sync is an optional feature of eDP
Adaptive-Sync is an optional feature of DP 1.2a
Adaptive-Sync is NOT an optional feature of DP 1.2
NVIDIA cards only support up to DP 1.2
So, NVIDIA cards like the GTX 980 do NOT support Adaptive-Sync.
If NVIDIA uses DisplayPort 1.3 in the future then good for them, the GTX 980 will still be using DP 1.2 and will still not support Adaptive-Sync.
i know they were using eDP. but the feature was not called adaptive sync. initially it was something built to save power, not so much about preventing screen tearing and reducing input latency. if that specific feature were indeed called Adaptive Sync, i believe VESA would have mentioned bringing the Adaptive Sync feature that was only available in eDP over to DP. but no, the spec of DP 1.2a was proposed by AMD and was later named Adaptive Sync by VESA.
and when AMD demoed that laptop they said they had had the necessary hardware inside their GPUs for 3 generations already, but as more info came out, it turns out only cards with GCN 1.1 are capable of running Adaptive Sync in games.
i’m very well aware the 900 series is limited to DP 1.2. even if nvidia adopts DP 1.3 later in the future, that still doesn’t guarantee they will support Adaptive Sync, since it is optional in the spec. and for their part they have already said they have no plan to support Adaptive Sync and will further improve G-Sync. just like AMD’s commitment to continue development of Mantle even if DX12 becomes the norm
Nvidia cards come with DP 1.2, NOT 1.2a, because Nvidia wants to be ABSOLUTELY SURE that, if you want something like Adaptive Sync, you WILL PAY THEM AGAIN, by buying a G-Sync monitor that costs $150-$200 more.
with gamenub instructions it will be possible I think
Wait, can someone confirm my excitement?
variable refresh 40hz to 120hz
??? is this real? is there something horrible I’m missing?
I don’t think so, other than it only being a dollar under $600 before tax or shipping if those apply.
As the hype of this display increases I believe the release price will go up as well.
If all I have to do is abandon NVidia to gain access to this product’s features (and price point) so be it. I might consider an NVidia upmarked product if LightBoost & G-Sync & 3D vision all worked at once (not that I really think it could anyways), but short of that the extra cost doesn’t seem worth it.
It’s been 15 years since I have had an ATI card. The experience was so horrible that I vowed to never buy one again. I have been NVIDIA since then. I just might pick up a 290x and one of these monitors and be a convert. I have never been so tempted. Variable refresh rates have been way, way too long in coming. I was once again disappointed to see NVIDIA developing a closed system. I was again disappointed by only TN panels in the variable refresh rate stable. IPS and variable refresh rate are finally here.
Even outside of the Freesync.. and IPS with 120hz refresh rate.. NICE
We need Intel to embrace DP 1.2a; then it is a done deal, and Adaptive Sync will be everywhere.
AMD will have it on GPUs and CPUs, Intel on CPUs; hopefully Nvidia will jump in at that point, and we all benefit.
Meh…Ultra wide 34 inch 1440p 120hz with free sync…Until then I will keep my 24 inch 1080p.
But IPS with 120hz is pretty nice.
When is Intel going to support adaptive sync?
oh wow i just got that intel is the only one not supporting adaptive sync, hell even AMD apus are supporting it, maybe amd will let them use their standard
Not too long ago Intel approached AMD about leveraging FreeSync. Not sure if those negotiations are going well or not, but they’re at least showing interest. Hopefully Nvidia will at least pull their heads out of their asses and at least allow people to use whatever technology they want on their nvidia cards. Seriously, they could support G-Sync and FreeSync and have even more marketing copy to put on their boxes.
I think this will have to be my next monitor. Hopefully the price will drop a bit, so I won’t have to bend over and touch my toes when I click the “buy” button.
If G-Sync was better than FreeSync, I’m pretty sure there would be leaks showing that already from Nvidia, and vice versa. Maybe they are the same, or maybe no one has tested yet… maybe