The Waiting Game
The first retail-ready NVIDIA G-Sync monitor is finally reviewed: the ASUS ROG Swift PG278Q.
NVIDIA G-Sync was announced at a media event held in Montreal way back in October, and promised to revolutionize the way the display and graphics card worked together to present images on the screen. It was designed to remove hitching, stutter, and tearing — almost completely. Since that fateful day in October of 2013, we have been waiting. Patiently waiting. We were waiting for NVIDIA and its partners to actually release a monitor that utilizes the technology and that can, you know, be purchased.
In December of 2013 we took a look at the ASUS VG248QE monitor, the display for which NVIDIA released a mod kit that let existing owners upgrade to G-Sync compatibility. It worked, and I even came away impressed. I noted in my conclusion that, “there isn't a single doubt that I want a G-Sync monitor on my desk” and, “my short time with the NVIDIA G-Sync prototype display has been truly impressive…”. That was nearly 7 months ago, and I don’t think anyone at the time really believed it would be THIS LONG before real monitors began to show up in the hands of gamers around the world.
Since NVIDIA’s October announcement, AMD has been on a marketing path with a technology they call “FreeSync” that claims to be a cheaper, standards-based alternative to NVIDIA G-Sync. They first previewed the idea of FreeSync on a notebook device during CES in January and then showed off a prototype monitor in June during Computex. Even more recently, AMD has posted a public FAQ that gives more details on the FreeSync technology and how it differs from NVIDIA’s creation; it has raised something of a stir with its claims on performance and cost advantages.
That doesn’t change the product we are reviewing today, of course. The ASUS ROG Swift PG278Q, a 27-in WQHD display with a 144 Hz refresh rate, is truly an awesome monitor. What has changed is the landscape between NVIDIA's original announcement and now.
What is G-Sync? A Quick Refresher
Last year, I spent a lot of time learning about the technology behind NVIDIA G-Sync and even spoke with several game developers in the build-up to the announcement about its potential impact on PC gaming. I wrote an article that looked at the historical background of refresh rates and how they were tied to archaic standards that are no longer needed in the world of LCDs, entitled: NVIDIA G-Sync: Death of the Refresh Rate. We also have a very in-depth interview with NVIDIA’s Tom Petersen that goes through the technology in an easy to understand, step-by-step manner, and I would encourage readers to watch it for background on the game-changing feature in this display.
The idea of G-Sync is pretty easy to understand, though the implementation can get a bit more hairy. G-Sync introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating to the PC what rate this refresh occurs at, in a properly configured G-Sync setup the graphics card now tells the monitor when to refresh. This allows the monitor to match its refresh rate to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.
Gamers today are likely to be very familiar with V-Sync, short for vertical sync, which is an option in your graphics card’s control panel and in your game options menu. When enabled, it forces the monitor to draw a new image on the screen at a fixed interval. In theory this works well, and the image is presented to the gamer without artifacts. The problem is that games played and rendered in real time rarely hold a very specific frame rate. With only a couple of exceptions, a game's frame rate will fluctuate based on the activity happening on the screen: a rush of enemies, a changed camera angle, an explosion or a falling building. Instantaneous frame rates can vary drastically (from 30, to 60, to 90 FPS), yet V-Sync only allows a new image to be displayed at fixed fractions of the monitor's refresh rate, and that mismatch causes problems.
If a frame takes longer than the monitor's standard refresh time to draw (if the frame rate drops low), then you will see a stutter, or hitch, in the gameplay, caused by the display having to re-draw the same frame for a second consecutive interval. Any movement that was tracking across the screen suddenly appears to stop – and then quickly “jumps” to the next location faster than your mind thinks it should. V-Sync also inherently adds input latency to games when these lags and stutters take place.
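To make that arithmetic concrete, here is a minimal Python sketch of the behavior. The frame times are made up and the model is simplified (a 60 Hz panel, double buffering, no driver queueing), but it shows how a single slow frame forces the display to repeat the previous image for a full extra refresh:

    # Simplified model: with V-Sync on, frames can only appear at fixed 16.7 ms refresh boundaries.
    # The frame times below are hypothetical; the 22 ms frame misses its slot, so the previous
    # image stays on screen for a doubled ~33 ms interval (the stutter).
    REFRESH_MS = 1000 / 60                     # 60 Hz panel

    frame_render_ms = [14, 15, 22, 13, 15]     # made-up per-frame render times

    render_done = 0.0
    last_display = 0.0
    for i, t in enumerate(frame_render_ms):
        render_done += t
        next_slot = -(-render_done // REFRESH_MS) * REFRESH_MS   # round up to a refresh boundary
        display_time = max(next_slot, last_display + REFRESH_MS)
        print(f"frame {i}: ready {render_done:5.1f} ms, "
              f"shown {display_time:5.1f} ms, gap {display_time - last_display:4.1f} ms")
        last_display = display_time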
The common alternative for gamers worried about latency and stutter has been to disable V-Sync in the control panel or in game. This solves the stutter and latency issues (kind of) but causes another, much more noticeable issue called "tearing". With V-Sync disabled, the graphics card sends a new frame to the monitor at any point in the monitor's refresh cycle, even if the LCD is currently drawing. The result is that, at some point down the screen, the user sees the previous frame above and the current frame below, split at a sharp line. You will literally be seeing an image where the geometry is no longer lined up, which, depending on the game and scene, can be incredibly distracting.
Monitors with refresh rates higher than 60 Hz reduce this tearing by having more frequent screen refreshes, and thus a tear is less likely to occur in any single refresh cycle, but the tearing is impossible to remove completely.
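For a rough illustration of where that line shows up, the sketch below (again Python, with simplified assumptions: top-to-bottom scanout over the whole refresh period, no blanking time) estimates the row at which the new frame cuts in when a buffer flip arrives partway through a refresh:

    # Approximate row where a tear line lands when the frame buffer flips mid-scanout.
    # Assumes the panel scans top-to-bottom over the entire refresh period and ignores
    # blanking, so the numbers are only illustrative.
    REFRESH_MS = 1000 / 60
    ROWS = 1440                                # vertical resolution of this panel

    def tear_row(flip_ms_into_refresh):
        """Approximate row at which the new frame replaces the old one."""
        return int(flip_ms_into_refresh / REFRESH_MS * ROWS)

    for t in (4.0, 8.0, 12.0):
        print(f"flip {t:4.1f} ms into the refresh -> tear near row {tear_row(t)}")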
NVIDIA G-Sync switches things up by only having the monitor refresh its screen when a new frame is ready from the GPU. As soon as the next frame is drawn, it can be passed to the display and drawn on the screen without tearing. If the next frame is ready in 16 ms, it can be sent immediately. If it takes 25 ms or only 10 ms, it doesn’t matter; the monitor waits for information from the GPU before drawing the new frame. The result is an incredibly smooth and fluid animation that doesn’t stutter and doesn’t tear.
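Using the same made-up frame times as the V-Sync sketch above, the variable refresh version looks like this. The only constraint left in this simplified model is the panel's fastest refresh interval (about 6.9 ms at 144 Hz), so the on-screen gaps simply track the render times:

    # Same hypothetical frame times, but now the panel waits for the GPU.
    # Each frame is scanned out as soon as it is ready, limited only by the panel's
    # minimum refresh interval, so there are no repeated frames and no tears.
    MIN_INTERVAL_MS = 1000 / 144               # ~6.9 ms at 144 Hz

    frame_render_ms = [14, 15, 22, 13, 15]     # same made-up render times as before

    render_done = 0.0
    last_display = 0.0
    for i, t in enumerate(frame_render_ms):
        render_done += t
        display_time = max(render_done, last_display + MIN_INTERVAL_MS)
        print(f"frame {i}: ready {render_done:5.1f} ms, "
              f"shown {display_time:5.1f} ms, gap {display_time - last_display:4.1f} ms")
        last_display = display_time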
There are a couple of fringe cases that NVIDIA has needed to build for, including frame times above 33 ms (frame rates under 30 FPS), where the image on the panel might visibly darken or decay unless it is refreshed automatically, even though a new frame isn’t ready. Also, some games have issues with G-Sync (Diablo III, for example, doesn’t have a true full screen mode) and G-Sync has to be disabled for them, either through a profile or manually, to avoid artifacts.
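The exact behavior of the G-Sync module in that low-frame-rate case isn't detailed here, but the general idea can be sketched as follows, assuming (purely for illustration) that no frame is ever left on the panel for more than about 33 ms before being re-scanned:

    # Rough illustration of the low-frame-rate fringe case. The module's exact behavior
    # is not spelled out in this review; this only shows the idea that the previous frame
    # gets re-scanned on its own if the GPU keeps the panel waiting too long.
    MAX_HOLD_MS = 1000 / 30                    # assumed maximum time a frame may sit unrefreshed

    def automatic_rescans(gap_ms):
        """How many times the last frame would be redrawn while waiting gap_ms
        milliseconds for the next frame from the GPU."""
        rescans = 0
        waited = 0.0
        while gap_ms - waited > MAX_HOLD_MS:
            waited += MAX_HOLD_MS
            rescans += 1
        return rescans

    for gap in (16.7, 40.0, 100.0):
        print(f"{gap:5.1f} ms between frames -> {automatic_rescans(gap)} automatic re-scan(s)")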
Ultra Low Motion Blur Technology
Another feature present on the ASUS PG278Q monitor is ULMB, or Ultra Low Motion Blur. Originally built as part of the NVIDIA 3D Vision infrastructure, ULMB is a technology used to decrease motion blur on the screen and remove or reduce ghosting of fast-moving images. It does this by turning on the backlight in time with the screen refresh and then quickly darkening the backlight after the pixels have been “strobed”. The effect is that, with ULMB enabled, images are sharper and appear to have less motion blur from frame to frame.
This sounds great! But the side effect is a much lower total brightness perceived by the gamer on screen. Just as we saw with 3D Vision throughout its development, enabling this mode effectively drops the light output of the screen by half. For some gamers and in some situations, this trade-off will be worth it. Certain genres, like RTS games that include lots of small unit text scrolling across the scene very quickly, can see dramatic sharpness increases.
It's difficult to capture with stills, but animations are darker, yet sharper, with ULMB
It’s important to note that ULMB can only be used when G-Sync is not enabled, and it only works at 85 Hz, 100 Hz, and 120 Hz. Most games, at least in my experience thus far, will see much more benefit from the variable refresh rate technology of G-Sync than from ULMB. If brightness is a concern (like playing in a well-lit room) then ULMB could be a non-starter, as the halved light output will be very noticeable.
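The brightness penalty follows from simple duty-cycle arithmetic: the backlight is only lit for a fraction of each refresh period. The numbers below are illustrative rather than measured values for the PG278Q (the actual pulse widths, and any backlight boost applied during the pulse, would change the result):

    # Rough duty-cycle arithmetic for a strobed backlight. The pulse width and panel
    # brightness here are hypothetical, not measured PG278Q values.
    def perceived_brightness(full_nits, refresh_hz, pulse_ms):
        period_ms = 1000 / refresh_hz
        duty_cycle = pulse_ms / period_ms      # fraction of each refresh the backlight is on
        return full_nits * duty_cycle

    # e.g. a hypothetical 300-nit panel strobed at 120 Hz with a ~4 ms pulse
    print(perceived_brightness(300, 120, 4.0)) # ~144 nits, roughly half the full output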
Enabling ULMB is as easy as navigating the monitor menu and selecting it, and you’ll also be able to adjust the strobe pulse width. I tested the capability through the fantastic website testufo.com, which offers users a host of options to test the motion blur of their displays. It was easy to find instances in which the ULMB feature allowed for sharper animations, but the brightness variance was also very apparent.
Got one pre-ordered; the end of the month can't come soon enough!
500$ lol
Nice to see progress made with LCDs, but there are still too many disadvantages with this monitor. I’ll stick with my FW900s for a while longer; 15 years and it still hasn’t been surpassed.
I'm selling a kidney to get this monitor, anyone interested? Great job, ASUS! Would love to play Battlefield 3 & 4 on this beast some day in the not-too-distant future!!
Hey Ryan, excellent video showcase and review, the best one out there, very informative!!
ASUS ROG Swift PG278Q is not worth your money; wait!
* Missing ASUS EyeCare Technology (very bad)
* Poor Panel Choice, a TN monitor, colors fade as you move your head
* One connector, no legacy video connections, one video input, DisplayPort only
* Ultra Low Motion Blur (ULMB) does not work in conjunction with G-Sync
* Inconsistent Bezel Thickness: bottom is much thicker than the sides and top
* Missing ASUS SPLENDID Video Intelligence Technology
* 2560 x 1440, 16:9; should be 2560 x 1600, which is 16:10
Sincerely, Joseph C. Carbone III; 25 August 2014
Are you the kind of idiot that wants a brand new Ferrari for $20K and expects 100 MPG from it and the ability to ferry 5 kids around?
* Poor Panel Choice, a TN monitor, colors fade as you move your head
A TN is a TN. TN is used for this monitor because of its vastly superior refresh rates and response times, which are among the key selling points of this monitor. If they had used an IPS panel, then the monitor wouldn’t have this level of performance (speed).
If you want to edit photos etc., this is not a good monitor, I agree.
* Ultra Low Motion Blur (ULMB) does not work in conjunction with G-Sync
No one said it did; ASUS clearly states as much. It’s a shame it doesn’t – I assume there is a technical reason?
* One connector, no legacy video connections, one video input, DisplayPort only
Doesn’t bother me personally; if you wanted VGA (on a 2560×1440 panel? LOL) then fair enough, check the specs. I believe it needs a DisplayPort connection, as HDMI doesn’t have the bandwidth to run this resolution at 144 Hz.
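That bandwidth point checks out on a quick back-of-the-envelope basis (the Python snippet below ignores blanking overhead, so the real link requirement is a bit higher still):

    # Raw pixel data rate for 2560x1440 @ 144 Hz with 24-bit color, ignoring blanking.
    width, height, bpp, hz = 2560, 1440, 24, 144
    gbps = width * height * bpp * hz / 1e9
    print(f"raw pixel data: {gbps:.1f} Gbit/s")    # ~12.7 Gbit/s

    # For comparison: HDMI 1.4 carries roughly 8.16 Gbit/s of video data, while
    # DisplayPort 1.2 (HBR2, four lanes) carries about 17.28 Gbit/s, which is why
    # DisplayPort is the only workable input at this resolution and refresh rate.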
* Inconsistent Bezel Thickness: bottom is much thicker than the sides and top
Agree, this is annoying if you want to put monitors above/below or side by side in a vertical position.
This is your only valid point as far as I’m concerned, as I don’t believe they make this clear before purchase.
* 2560 x 1440, 16:9; should be 2560 x 1600, which is 16:10
No, they should use 4K, 8K, 1080p, 1920×1200… why SHOULD it be 16:10 rather than 16:9? Do you just WANT it to be 16:10, and therefore you shall decree it?
Maybe some people prefer 16:9… did you consider that?
I would hate to live in your utopia!
So far it seems like the new driver released today (344.11) solved the out-of-range bug. I installed it this AM and I’ve been running at 144 Hz all day, heavily gaming, with no problems.
I am more than happy with the monitor. Yeah, it’s overpriced (hopefully it’ll come down a bit in price later), but G-Sync is just a total game changer.
The panel, despite being TN, is stunning.
As a previous poster noted, it’s odd that they chose to make the bottom of the bezel wider than the other three sides. Not a real problem for me, or most people, but it would ruin the effect if you wanted to do a 4- or 6-display multi-monitor setup in a rectangle.
I've also bought an ASUS VG248QE, planning to get G-Sync for it later.
I live in Germany, in Europe, and I don't see any way to get a G-Sync module!
I'm a member of several PC hardware forums and I know quite a lot about building my own gaming PCs, picking out and buying every single part.
Every second month I order an upgrade for my PC…
How can I order a G-Sync module for the ASUS VG248QE?
Can I order it from California?
2nd question: if I have two MSI GTX 970 Gaming cards in SLI, is it possible to use a dual-GPU GTX 760 ROG Mars for PhysX?
If not, which are the best GPUs for PhysX alongside a GTX 970 SLI setup?
Would a Quadro or Tesla be a good choice?
3rd question: will there be a GTX 990 with two GTX 980 GPUs and 2x 4GB?
Will there be any GTX 980 with 8GB, like an EVGA Classified 8GB?