Ryan was not the only one to test the ASUS ROG Swift PG278Q G-Sync monitor; Overclockers Club also received a model to test out. Their impressions of the 27" 2560x1440 TN panel were very similar: once they saw this monitor in action, going back to their 30-inch 60Hz IPS monitor was not as enjoyable as it once was. The only bad thing they could say about the display was the MSRP; $800 is steep for any monitor and makes it difficult to even consider getting two or more of them for a multi-display system.
“When you get down to it, the facts are that even with a TN panel being used for the high refresh rate, the ASUS ROG Swift PG278Q G-Sync monitor delivers great picture quality and truly impressive gaming. I could go on all day long about how smooth each of the games played while testing this monitor, but ultimately not be able to show you without having you sit at the desk with me. No stuttering, no tearing, no lag; it's like getting that new car and having all the sales hype end up being right on the money. When I flip back and forth between my 60Hz monitor and the PG278Q, it's like a night and day experience.”
Here are some more Display articles from around the web:
- AOC G2460PG G-Sync 144Hz 1ms Gaming Monitor @ Kitguru
- Asus ROG Swift PG278Q 144Hz G-Sync Monitor @ Kitguru
- 6400×1080: Testing Mixed-Resolution AMD Eyefinity @ eTeknix
- Demystifying NTSC Color And Progressive Scan @ Hack a Day
nothing is free.
especially nothing awesome is free.
Nor is AMD’s so-called “free” solution.
WTF is your point?
Well, AMD’s Free-sync is better in that it doesn’t buffer frames after scanning them, which means it doesn’t add 1 or 2 frames of input lag to the display’s input lag.
Oh, and you don’t need to buy a $150-$200 higher-priced monitor with a special Nvidia kit to get dynamic refresh rates; you just need to buy a monitor with DP 1.2a or higher, which doesn’t cost any extra money to implement in a monitor, unlike Nvidia’s G-Sync kit.
That “lag,” as you call it, is PR put out by AMD, so you can’t believe everything AMD says given their history of making claims.
With AMD’s approach, if you don’t already have a supported card, you have to buy one that costs $150 or more to use theirs, so in the end the cost is about the same.
On a side note, people buying Nvidia expect to pay more; their stuff tends to have more R&D in it. If you don’t like it, don’t buy it. It’s your money.
It’s not PR. Nvidia admitted that their implementation will cause at least 1-frame lag on MaximumPC.
G-SYNC is just a cheap attempt by Nvidia to make a quick buck.
Between spending $150-200 on buying an overpriced G-SYNC monitor and buying a new Radeon, I’d choose the latter.
That $150 Radeon is only something like a 260X/265, so not really that great of a deal. Beyond those, you’re buying a 290-series card.
One frame of lag at 144 fps is only ~7 ms. Wow, that’s so much latency I’ll get totally owned in everything.
Cheap attempt, huh? AMD has yet to prove their stuff even works, except for controlled demos they’ve shown on video.
It was at least one frame of lag. That translates to 20-25 ms of extra input lag added on top of your monitor’s signal processing lag. It’s quite noticeable on those 28″ 60Hz 4K monitors.
Source? I’d love to see a real FreeSync vs. G-Sync comparison proving your theories.
I would love to know where you get that 1 frame is 20-25 ms; 1 frame at 60Hz is ~16.7 ms. The other question is whether that 1 frame of buffering is really a buffer, or if it’s just there to keep something for the display in case a massive frame drop happens because something in-game tanks the fps.
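The frame-time numbers being argued over in this exchange are easy to check: one buffered frame delays the image by exactly one refresh interval, which is 1000 divided by the refresh rate in milliseconds. A quick sketch (an editor's illustration; the function name is made up, not from any commenter):

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Duration of a single refresh interval, in milliseconds.

    One frame of buffering adds exactly this much latency
    on top of the display's own processing lag.
    """
    return 1000.0 / refresh_hz

# Compare the rates discussed in the thread.
for hz in (60, 120, 144):
    print(f"1 frame at {hz}Hz = {frame_time_ms(hz):.2f} ms")
```

At 60Hz one frame is about 16.67 ms and at 144Hz about 6.94 ms, so the 20-25 ms figure does not correspond to a single frame at either rate.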
Hmm… I think we’re not yet fully aware of the benefits of intelligent two-way communication vs. FreeSync’s one-way communication approach.
getting popcorn
….
yeah, that’s a pretty troll-tastic headline considering the discourse surrounding this topic.
AMD fanboys will come out just to attack the topic while ignoring that AMD’s solution isn’t much cheaper, if at all.
Watching brand loyalists go at it is like watching two four-year-olds have a fight. You have fun now.
I see it’s mostly AMD fans trashing Nvidia any chance they get: Nvidia releases new tech, and five seconds later there’s a post bashing them for something. Even when it’s an AMD article, they find a way to bash Nvidia, and that tends to start the war. Even with all the people trying to bash Nvidia, most people vote to BUY Nvidia with their money; case in point, look at Steam’s hardware survey numbers.
The majority of Steam users use Intel IGPs. So most people buy Intel rather than AMD or NVIDIA.
I thought G-Sync doesn’t work with multiple displays anyway? So Nvidia knows it will sell at most one? 🙂
Fanbois need to fuck off in general…
No one cares about your worthless opinion.
I’ll spend my money the way I feel like it.
G-Sync is here now and it’s awesome.
We don’t even have a prototype monitor for FreeSync; no clue if it even works as well as G-Sync. What companies SAY is meaningless.
Question – Is a multi-monitor setup supported with G-sync? Or is it 1 card, 1 monitor? In that case, would a tri-SLI system be able to (hypothetically) drive 3 G-sync monitors? Am I asking a dumb question when I ask if that also means that all 3 monitors need to be in sync with each other as well as the graphics cards driving them?
So, shouldn’t we add the cost of the card to G-Sync too? With this monitor you have to spend up to $1,000 to have G-Sync. Wooo.
It’s the same idea for FreeSync as well. There are plenty of unsupported AMD cards that won’t work with FreeSync, such as my 7970. If I wanted either, I’d have to buy a new card along with a new monitor.
I’m sure that when DisplayPort Adaptive-Sync enabled monitors are released they will carry a premium over regular monitors, but will it be as much as G-Sync’s was at release?
It seems obvious to me that variable refresh rate should be part of the display standards. I should not be locked into a specific manufacturer for both video card and display; this is why standards exist. If you want a proprietary solution, get an Apple computer with their Thunderbolt-only display. That display basically won’t work with anything except a Thunderbolt-enabled Mac: Thunderbolt is the only input port it has, and they seem to have deliberately blocked it from working with a plain DisplayPort input.
G-Sync will work on multiple monitors, BUT apparently it needs a card per monitor, so a surround setup will require three cards.