The Waiting Game
The first retail-ready NVIDIA G-Sync monitor, the ASUS PG278Q ROG Swift, is finally here for review.
NVIDIA G-Sync was announced at a media event held in Montreal way back in October, and promised to revolutionize the way the display and graphics card worked together to present images on the screen. It was designed to remove hitching, stutter, and tearing — almost completely. Since that fateful day in October of 2013, we have been waiting. Patiently waiting. We were waiting for NVIDIA and its partners to actually release a monitor that utilizes the technology and that can, you know, be purchased.
In December of 2013 we took a look at the ASUS VG248QE monitor, the display for which NVIDIA released a mod kit allowing users who already owned it to upgrade to G-Sync compatibility. It worked, and I came away impressed. I noted in my conclusion that, “there isn't a single doubt that I want a G-Sync monitor on my desk” and, “my short time with the NVIDIA G-Sync prototype display has been truly impressive…”. That was nearly 7 months ago, and I don’t think anyone at the time really believed it would be THIS LONG before the real monitors began to show up in the hands of gamers around the world.
Since NVIDIA’s October announcement, AMD has been on a marketing path with a technology they call “FreeSync” that claims to be a cheaper, standards-based alternative to NVIDIA G-Sync. They first previewed the idea of FreeSync on a notebook device during CES in January and then showed off a prototype monitor in June during Computex. Even more recently, AMD has posted a public FAQ that gives more details on the FreeSync technology and how it differs from NVIDIA’s creation; it has raised something of a stir with its claims on performance and cost advantages.
That doesn’t change the product that we are reviewing today of course. The ASUS ROG Swift PG278Q 27-in WQHD display with a 144 Hz refresh rate is truly an awesome monitor. What did change is the landscape, from NVIDIA's original announcement until now.
What is G-Sync? A Quick Refresher
Last year, I spent a lot of time learning about the technology behind NVIDIA G-Sync and even spoke with several game developers in the build up to the announcement about its potential impact on PC gaming. I wrote an article that looked at the historical background of refresh rates and how they were tied to archaic standards that are no longer needed in the world of LCDs, entitled: NVIDIA G-Sync: Death of the Refresh Rate. We also have a very in-depth interview with NVIDIA’s Tom Petersen that goes through the technology in an easy to understand, step by step method, which I would encourage readers to watch for background on the game-changing feature in this display.
The idea of G-Sync is pretty easy to understand, though the implementation method can get a bit more hairy. G-Sync introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating to the PC what rate this refresh occurs at, in a properly configured G-Sync setup the graphics card now tells the monitor when to refresh. This allows a monitor to match its refresh rate to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.
Gamers today are likely to be very familiar with V-Sync, short for vertical sync, an option in your graphics card’s control panel and in your game options menu. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory this would work well, and the image would be presented to the gamer without artifacts. The problem is that games rendered in real time rarely hold a specific frame rate. With only a couple of exceptions, game frame rates will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or falling building. Instantaneous frame rates can vary drastically, from 30, to 60, to 90 FPS, yet V-Sync forces the image to be displayed only at set fractions of the monitor's refresh rate, which causes problems.
If a frame takes more than the standard refresh time of a monitor to draw (if the frame rate drops low), then you will see a stutter, or hitch, in the game play, caused by the display having to re-draw the same frame for a second consecutive interval. Any movement tracking across the screen suddenly appears to stop, then quickly “jumps” to the next location faster than your mind thought it should. V-Sync also inherently introduces input latency to games when these lags and stutters take place.
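The math of that quantization is simple enough to sketch. This is a toy model under an assumed 60 Hz panel, not anything from NVIDIA's driver: under V-Sync, a frame's on-screen time is its render time rounded up to the next refresh boundary.

```python
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh at an assumed 60 Hz

def vsync_display_time(render_ms):
    """On-screen time for a frame under V-Sync: render time
    rounded up to the next whole refresh interval."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# A 17 ms frame just misses the boundary and is held for two full
# refreshes (~33.3 ms), while a 16 ms frame gets a single ~16.7 ms slot.
for render in (10, 16, 17, 25):
    print(f"{render} ms render -> {vsync_display_time(render):.1f} ms on screen")
```

That jump from one refresh slot to two is exactly the stutter described above: a barely-late frame doubles its time on screen.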
The common alternative for gamers worried about latency and stutter was to disable V-Sync in the control panel or in game. This solves the stutter and latency issues (kind of) but causes another, much more noticeable issue called "tearing". With V-Sync disabled, the graphics card sends a new frame to the monitor at any point in the monitor's refresh cycle, even if the LCD is currently drawing. The result is that, at some point down the screen, the user sees the previous frame above and the current frame below, separated by a sharp line. You will literally be seeing an image where the geometry is no longer lined up, which, depending on the game and scene, can be incredibly distracting.
Monitors with refresh rates higher than 60 Hz reduce this tearing by having more frequent screen refreshes, and thus a tear is less likely to occur in any single refresh cycle, but the tearing is impossible to remove completely.
NVIDIA G-Sync switches things up by only having the monitor refresh its screen when a new frame is ready from the GPU. As soon as the next frame is drawn it can be passed to the display and drawn on the screen without tearing. If the next frame is ready in 16ms, it can be sent immediately. If it takes 25ms or only 10ms, it doesn’t matter; the monitor will wait for information from the GPU to draw the new frame. The result is an incredibly smooth and fluid animation that doesn’t stutter and doesn’t tear.
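As a hedged illustration (a toy simulation, not NVIDIA's implementation), the difference between the two modes boils down to whether a frame's on-screen time tracks its render time or gets rounded up to a fixed refresh grid:

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz grid, used only for the V-Sync case

def display_times(render_times_ms, gsync):
    """On-screen time per frame: equal to render time with variable
    refresh (G-Sync), rounded up to a refresh boundary with V-Sync."""
    if gsync:
        return list(render_times_ms)  # the panel simply waits for each frame
    return [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_times_ms]

frames = [16, 25, 10, 17]  # fluctuating render times, as in a real game
print(display_times(frames, gsync=True))   # tracks the game exactly
print(display_times(frames, gsync=False))  # snaps to ~16.7 / ~33.3 ms slots
```

With G-Sync the animation advances exactly as fast as the game renders; with V-Sync the same frame sequence alternates between one and two refresh slots, which is the judder gamers perceive.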
There are a couple of fringe cases that NVIDIA has needed to build for, including frame times above 33ms (under 30 FPS), where the image on the panel might visibly darken or decay if it isn’t refreshed automatically, even if a new frame isn’t ready. Also, some games have issues with G-Sync (Diablo III, for example, doesn’t have a true full screen mode), and for those G-Sync has to be disabled, either through a profile or manually, to avoid artifacts.
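The low-FPS fringe case can be sketched as well. This is a guess at the idea only; the G-Sync module's actual logic is not public, and the ~33 ms hold time comes from the 30 FPS figure above. If no new frame arrives before the panel's maximum hold time elapses, the module re-scans the previous frame so the pixels don't decay:

```python
MAX_HOLD_MS = 33.3  # assumed maximum time the panel can hold a frame (~30 FPS)

def refresh_events(frame_gaps_ms):
    """Yield (time_ms, is_repeat) refresh events given the gaps
    between new frames arriving from the GPU."""
    t = 0.0
    for gap in frame_gaps_ms:
        waited = 0.0
        # insert forced repeats of the stale frame while waiting too long
        while gap - waited > MAX_HOLD_MS:
            waited += MAX_HOLD_MS
            yield (round(t + waited, 1), True)   # re-scan the old frame
        t += gap
        yield (round(t, 1), False)               # scan out the new frame

# A fast frame passes straight through; a 50 ms gap forces one repeat.
print(list(refresh_events([16.0, 50.0])))
```

The viewer never notices the repeated scan-out, but the panel stays within its electrical limits even when the game dips well below 30 FPS.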
Ultra Low Motion Blur Technology
Another feature present on the ASUS PG278Q monitor is ULMB, or Ultra Low Motion Blur. Originally built as part of the NVIDIA 3D Vision infrastructure, ULMB is a technology used to decrease motion blur on the screen and remove or reduce ghosting of fast-moving images. It does this by turning on the backlight in time with the screen refresh and then quickly darkening the backlight after the pixel has been “strobed”. The effect is that, with ULMB enabled, images are sharper and appear to have less motion blur from frame to frame.
This sounds great! But the side effect is a much lower total brightness perceived by the gamer on screen. Just as we saw with 3D Vision throughout its development, enabling this mode effectively drops the light output of the screen by half. For some gamers and in some situations, this trade off will be worth it. Certain games, RTS titles in particular, with lots of small text and units that scroll across the scene very quickly, can see dramatic sharpness increases.
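As a back-of-envelope illustration of why the screen dims, perceived brightness scales roughly with the strobe duty cycle, the fraction of each refresh the backlight is lit. The 350-nit and 2 ms figures below are illustrative assumptions, not ASUS's firmware values:

```python
def perceived_brightness(full_nits, pulse_ms, refresh_hz):
    """Approximate perceived brightness of a strobed backlight:
    full brightness scaled by the strobe duty cycle."""
    period_ms = 1000 / refresh_hz
    return full_nits * (pulse_ms / period_ms)

# A ~2 ms pulse at 120 Hz lights the backlight only ~24% of the time,
# which is why a strobed screen looks noticeably dimmer.
print(perceived_brightness(350, 2.0, 120))
```

Shortening the pulse sharpens motion further but dims the image even more, which is exactly the trade-off the pulse width setting exposes.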
It's difficult to capture with stills, but animations are darker, yet sharper, with ULMB
It’s important to note that ULMB can only be used when G-Sync is not enabled, and it only works at 85 Hz, 100 Hz, and 120 Hz. Most games, at least in my experience thus far, will see much more benefit from the variable refresh rate technology of G-Sync than they will from ULMB. If brightness is a concern (like playing in a well-lit room), then ULMB could be a non-starter, as the halved light output will be very noticeable.
Enabling ULMB is as easy as navigating the monitor menu and selecting it, and you’ll be able to adjust the strobe pulse width. I tested the capability through the fantastic website testufo.com, which offers users a host of options to test the motion blur of their displays. It was easy to find instances in which the ULMB feature allowed for sharper animations, but the brightness variance was also very apparent.
Asus VG248QE + G-Sync kit is ~$480 if you do it yourself. A site by the name of Overlord Computer is selling it pre-installed for $500, for people who aren't feeling up to doing it themselves.
I bought my Asus VG248QE + G-Sync from Digital Storm for $499, installed.
Done~!
It’s also worth mentioning that that is a 24″ 1080p display.
Asus VG248QE is a 24″ 1080p display, and you are suggesting people spend $500 on it?
Question on resolution: how does the desktop look at, say, 1080p if you didn’t want to scale Windows settings to 125% because some apps don’t like that?
I see many comments that anything but native resolution gives blurry text.
I have an old Dell 2007FP at 1600×1200, run a custom res, and don’t really notice any degradation.
I’ll just quote Allyn Malventano
Now for people saying that it ‘feels smoother’, with no actual frame capture to back it up, makes it a bit of a bogus claim, IMO.
It's interesting to follow you around…
http://s3.amazonaws.com/pcper-pics/63543350199.png
Thanks for quoting me, but I was talking about people saying one card 'felt smoother' than another card, where frame pacing had been fixed on both cards. My point was, how could people 'feel' that particular difference when the actual pacing difference between both platforms is negligible?
…now in this context, pretty much anyone you sit down in front of a GSYNC display is definitely going to see the difference. It pretty much smacks you in the face, especially at low FPS.
Thank you for bringing the truth, and clearing up the usual misunderstanding. I agree G-Sync is hard to miss.
Hate the “gamer” look but love the specs.
https://www.youtube.com/watch?v=XdqTIfNv2DE
BAM! suck it “free”sync!
Why didn’t this get an “Editor’s Choice” award instead of a gold award? What’s the difference between those two awards anyway? lol.
My guess would be the price. Very expensive for a TN panel, albeit a nice one.
I checked my go-to stores here, they have it in stock or on order. But they ask more than 1000 USD (7500 SEK including taxes). I think I’ll keep my cheap Korean 1440p Samsung PLS that does 100Hz.
Editor's Choice would be higher than Gold, yes. And you had it right – the exceptionally high price is what led me to not dive 100% in with this product. And also we have a 4K G-Sync option coming from Acer and a 1080p AOC model coming soon as well.
please don’t tell me that 4k gsync monitor will be a 27-28″ TN. 🙁
When will we see a 32″ or larger gsync non-TN 4k monitor?
Heh, and I would love 24″. (25″+ wouldn’t fit space)
…
It’s a pity: in Eyefinity, the best cards to run those 3 monitors don’t support the one feature that makes this monitor special.
Does Ultra Low Motion Blur work with Intel/AMD graphics cards?
ULMB still has workarounds for the VG248QE that do work, but ULMB on something like the ROG Swift will be very difficult to crack because it’s a custom scaler designed by Nvidia to drive the monitor. Even Nvidia has been paying attention to how people are using ULMB, disabling it for GSync.
If you’d like any indication of how much Nvidia hates supporting ULMB, they appear to have gotten ASUS to disable it entirely for any Radeon GPUs, even though the option is in the monitor’s OSD menu and not part of GSync at all.
This really sucks; they are already making money with the G-Sync module inside the monitor, and it shouldn’t require any work from ForceWare or GeForce.
Personally, 120-144Hz and ULMB sound a lot more exciting than G-Sync.
I play games with no V-Sync and don’t care much about tearing, but LCD motion blur is horrible.
Also, being a vendor-specific feature just kills it for me.
You’re a total fucken retard…. please find out how variable refresh rate works…kook
also, can you disable the annoying red light thing?
yes, you can. 4th post in the thread.
http://www.guru3d.com/articles-summary/asus-rog-swift-pg278q-gsync-gaming-monitor-review.html
G-Sync out of range bug
http://rog.asus.com/forum/showthread.php?50213-Asus-PG278Q-Out-of-Range-bug
beta driver 340.43 is broken with Gsync.
So are the 340.52 WHQLs.
Beta or WHQL the bug is there. Not very comforting given how long G-Sync has been in development to still have a bug like this persist.
Odd, I used BOTH of those drivers in my testing.
You also said the 30fps dip is gone. Linus and others noted it's still there in their reviews.
I believe the dip Linus spoke of is not really a dip (or gap). When we made that second chart (showing the gap / judder is gone), I made sure I was watching it run. The thing is that when FPS drops down into the 20's, the delay between frames really becomes noticeable. It's really more a matter of your brain no longer seeing motion and more individual frames. The transitions into and out of such low FPS figures were smooth, even though watching video at such a low frame rate is jarring to the eye.
Negative,
I had no issue with the next driver, only that beta.
Also, I’m not using the Swift; I’m talking about the ASUS VG248QE with the G-Sync module.
This has been fixed and should appear in our next driver release.
Does G-Sync work smoothly with SLI setups?
Yes it does; Linus in his review was using SLI'ed Titans.
I've also tested the ASUS VG248QE G-Sync on my SLI rig and it works great!!
No, it doesn't.
I mean, it does for the majority of games. But even then, SLI by nature adds more input lag into the mix and will ALWAYS add SOME frame pacing issues (NVIDIA is WAY ahead of AMD on this one, but still).
If your goal is G-Sync and MAX possible smoothness, then more than 1 video card IS a mistake, since it's counterproductive to that goal.
You have no idea what you're talking about… almost every online review and video showcasing this monitor has used SLI, and all have said it runs smoothly. Which is the whole purpose of G-Sync… *sigh* people and their (F*ck Logic!) statements…
I’d be more interested in this if they offered a stand-less option. I use a triple monitor setup and, as long as it has VESA, I would happily forgo the fancy lights and adjustments for a price reduction.
It does have a VESA mount. The stand is optional; use it or not.
Read “as long as it has VESA” as a statement about monitors in general and “I would happily forgo the fancy lights and adjustments for a price reduction” as emphasis on price reduction. I have this prejudice against paying for things I know I’m not going to use.
Despite its over $1000 MSRP here in Norway, it sold out faster than free handjobs.
Can't even describe how happy I am with this monitor. Did not expect G-Sync to be this good while running 2x GTX Titan. It's giving my other two monitors an inferiority complex.
G-Sync is great but this monitor is a ripoff.
There is also this problem.
https://www.youtube.com/watch?v=-8S2KkJFFMY&feature=youtu.be
http://rog.asus.com/forum/showthread.php?50004-PG278Q-vertical-stripes-esp.-in-3D-mode.&country=&status=
Holy cow, get this reviewer a glass of water! Hearing him struggle to swallow was enough to make me register to make a comment! Guys, we’re totally OK with you having a drink of water while doing a review!
This review really makes me want to do the GSync mod on my VG248QE.
I think that once this sync tech has been out for quite awhile and prices drop a lot(for both versions of it), then I might consider upgrading my monitor and gpu. But in the mean time, my current setup provides a very smooth gaming experience.
yea yea
I'm using nVidia 2D Surround and “it’s going to be basically impossible to go back.“
Now there is G-Sync.
Does G-Sync support triple monitor setups?
Bad bad RED PCPer
ATM I believe G-Sync is limited to 1 monitor, likely due to the fact that most video cards only have 1 DisplayPort on them. That could change in the future.
At $600 this would be an instabuy. But that price tag is just too much. The question is really going to be once the supply has settled down, what discounts do we see, if any?
I’ve got the Asus VG278HE 144Hz 1080p strobing monitor and this is the sort of monitor I want to upgrade it to.
It’s not enough though. I’m gonna wait until we see a 2160p version of this Asus ROG monitor that still works at 144Hz and has G-Sync. I will then upgrade my graphics card to maintain a solid 120/144Hz framerate. It might take up to two years for the monitor and NVIDIA graphics cards to become available, but I’m in no hurry and money isn’t too much of a barrier, either.
2 years? I doubt it. Look how long it took for 1440p @ 144Hz to show up, and yes, the PG278Q is the first one.