At recent AMD events, attendees were invited to try a blind sight test (an oxymoron if there ever was one) in which they had a chance to play on an AMD system with the new Vega GPU and FreeSync, as well as a machine powered by a GTX 1080 and G-Sync. The two machines and monitors were concealed so you could not tell which was which.
Since many of us did not have a chance to attend those events, nor to see the difference between the two, [H]ard|OCP decided to replicate the experiment, albeit with a GTX 1080 Ti in the G-Sync system. The two Windows 10 64-bit systems were powered by an AMD Ryzen 7 1800X CPU with 16GB of DDR4-2666; the only difference was the GPU and display. The two displays were capable of up to a 100Hz refresh rate, and the display settings were matched as well as humanly possible. The two monitors were a $720 ASUS MX34V with FreeSync and a $1300 ASUS PG348 G-Sync display, something worth noting for those with a shopping list.
Check out the video of the subjective experiences of the participants here, remembering that this is not exactly a rigorous scientific experiment.
"Totally unscientific and subjective testing is scorned by many, so if that gets your panties in a bunch, I suggest you stop reading. We had the opportunity to preview AMD's RX Vega this weekend, and we put it up against NVIDIA's GTX 1080 Ti, both using 100Hz FreeSync and G-Sync panels, with our testers representing 223 combined years of gaming experience."
Here are some more Display articles from around the web:
- Acer Predator XB271HU bmiprz 144-165 Hz @ techPowerUp
- Philips BDM4350UC 43in 4k IPS @ Kitguru
- The Frame TV by Samsung Revealed! @ TechARP
100Mhz refresh rate? I want a monitor like that (though there's no graphics card that could run that)
Dammit! Fingers are too used to MHz … fixing
If your output can do 1920x1080x50Hz, then it can do 100MHz updating… for a 1×1 display!
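For the curious, here is the arithmetic behind that quip as a quick Python sketch (blanking intervals are ignored; the figures are the ones from the comment above):

```python
# Pixel budget of a 1920x1080 @ 50Hz output (blanking ignored).
width, height, refresh_hz = 1920, 1080, 50
pixels_per_second = width * height * refresh_hz
print(f"{pixels_per_second / 1e6:.1f} million pixels per second")  # ~103.7

# Spend that entire budget on a 1x1 "display":
print(f"1x1 refresh rate: ~{pixels_per_second / 1e6:.0f} MHz")  # roughly 100MHz
```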
The part that makes it a bit more valid is that they all seem to be avid / pro gamers.
So the result is, in Doom at 4K… the 1080 Ti brings no noticeable benefit in minimum or average frame rate.
I guess if people bought GPUs like cars, after a test drive, AMD could make some money. But people buy GPUs based on online reviews.
“The part that makes it a bit more valid is that they all seem to be avid / pro gamers.”
That’s about as helpful as asking a group of avid readers of Dan Brown their opinion on the security of The Louvre or Vatican City.
We have the technology, there is no need to aid companies in hawking their products through obfuscation.
No… it's more like looking at a spec sheet for two cars and deciding based on that alone which one to buy.
This test is like having half a dozen race car drivers give their personal opinions on how two different cars handle after actual test drives.
Graphs don't tell you how minimum frame rates impact actual gameplay.
This test is very valid and presents valuable data. It doesn't replace a full review, it complements one.
Word to that. It’s the difference between a review on Top Gear (NOT scientific) and some specs from a dyno. The dyno tells you scientifically rigorous information about performance; the review tries to tell you how it feels. Nobody plays games via Microsoft Excel, so I know which I prefer.
“Experiential testing” is the sole domain of products that don’t succeed in real scientific testing.
‘I have no proof this tinfoil hat keeps the aliens out of my mind, but I sure feel safer wearing it!’
Shouldn't that quote be:
‘I have no proof this tinfoil hat keeps the aliens out of my mind, but you can't prove it doesn't!’
Seems like FreeSync is the only option with HDR at $599. For less than $1000 you can buy a Vega GPU and have an HDR experience. It seems that is the only advantage I can see from team red.
You have to be careful when doing this type of test.
Humans tend to favor the last thing they tested, so you have to randomize the order of the test so that not everyone tries system 1 first and then system 2.
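A minimal sketch of that kind of counterbalancing, assuming ten hypothetical testers and the two rigs from the article:

```python
import random

# Hypothetical tester IDs; the two rigs are the ones from the article.
testers = [f"tester_{i}" for i in range(1, 11)]
systems = ["Vega + FreeSync", "GTX 1080 Ti + G-Sync"]

# Counterbalance: half the testers try each system first,
# then shuffle which tester gets which order.
orders = [systems, systems[::-1]] * (len(testers) // 2)
random.shuffle(orders)

for tester, order in zip(testers, orders):
    print(tester, "->", " then ".join(order))
```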
Kind of borked test.
a) The GTX 1080 Ti runs Doom with Vulkan at over 120fps at that resolution (3440×1440), so it's out of G-Sync range. They would have been better off using a GTX 1080 or a more demanding game.
b) VA vs IPS, different panel types. Color response and lag will differ, no matter how you try to calibrate them.
Good point. It’s too bad they couldn’t get two monitors that were at least based on the same panel technology.
For a “blind taste test” to be anywhere near a legitimate way of testing, they'd need a much larger test pool and would literally need the same panel tech in both monitors for this to be any kind of test to be taken seriously. Might as well be comparing apples to oranges.
There you go! NVIDIA could add VESA DisplayPort Adaptive-Sync support to all of its GPU SKUs. That's a VESA standard that everybody should support!
I smell BS.
As said by someone above, the GTX1080Ti should be almost 100% out of GSYNC range, with possibly the occasional dip below 100FPS.
The AMD setup on the other hand would be almost completely in FREESYNC range based on the testing I’ve seen.
In fact, the GTX1080 should feel better than the GTX1080Ti system since it should mostly be below 100FPS on Ultra at 3440×1440 thus in the tear-free, less-laggy GSYNC range.
*The fact that people can’t tell the difference is also baffling. These SHOULD feel different. The only way I can imagine they’d feel THE SAME would be to drop the settings so both stay above 100FPS (or are locked to 100FPS with VSYNC ON).
Or they could have locked GSYNC and Freesync to 95FPS to stay within the adaptive sync range, but locking FPS, whether that way or with VSYNC, seems unlikely since that would be an unfair test too.
The only thing that makes sense is they are kissing AMD’s ass because I can not figure out how this was done without SEVERE BIAS.
(The monitors are also NOT comparable, as the Freesync monitor's range is 48Hz to 100Hz only, so you'll have a WORSE EXPERIENCE on the AMD setup any time you drop below 48FPS. Of course you don't do that in this test, but you certainly would in many other games, even when the FPS appears to be higher.)
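To make that range argument concrete, here is a rough sketch of how a given frame rate maps to on-screen behavior for a VRR panel without LFC (the 48 to 100Hz window is the one quoted above; the function and names are mine):

```python
def vrr_behavior(fps, vrr_min=48, vrr_max=100, vsync_on=False):
    """Roughly classify what a VRR panel without LFC shows at a given FPS."""
    if vrr_min <= fps <= vrr_max:
        return "adaptive sync active: no tearing, no added lag"
    if vsync_on:
        return "outside VRR range, VSYNC ON: no tearing, but added lag/judder"
    return "outside VRR range, VSYNC OFF: screen tearing"

# The scenario described above: a GTX 1080 Ti pushing Doom past the 100Hz ceiling.
for fps in (45, 75, 120):
    print(fps, "FPS ->", vrr_behavior(fps))
```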
While nobody said this was Apples to Apples, it’s still misleading if you don’t understand some of the details, and frankly it’s the job of “reporters” to give people the information they need rather than CHERRY-PICK details on behalf of a company that’s obviously paying them off. I believe HARDOCP is being paid off based on the article and the video.
“The only thing that makes sense is they are kissing AMD’s ass because I can not figure out how this was done without SEVERE BIAS.”
That’s really not the only thing that makes sense… I’m as surprised as everyone here but they saw what they saw. Your comment about the variable refresh rate range makes no sense – VSYNC latency is low enough at 100hz that you wouldn’t feel the burn flipping between that and back to VRR at lower frame-rates. Evidently frame variability, an old AMD problem, isn’t such a biggie now.
“the monitors are also NOT comparable either as the Freesync monitor is 48Hz to 100Hz only so you’ll have a WORSE EXPERIENCE on the AMD setup”
You’re MISSING that AMD’s low frame-rate compensation would be ACTIVE below 48hz, so your EFFECTIVE range is 24-100hz. (random caps is fun)
“it’s still misleading if you don’t understand some of the details”
Everything is misleading if you don’t understand it. You must be the target audience they refer to in the article as getting really, really upset about this kind of testing.
Spunjii,
LFC does not work unless the max-to-min ratio of the Freesync range is at least 2.5x, such as 40Hz to 100Hz or 30Hz to 75Hz.
With this Freesync monitor you default to either VSYNC OFF (screen tearing) or VSYNC ON (stutter/judder) any time you drop below 48FPS.
(You also don’t understand how Freesync works, because it just RESENDS the same frame if you drop below the MIN and have LFC support. For example, with a 30→75Hz range, if you drop to 29FPS the monitor refreshes at 58Hz to drive back into the adaptive sync range, but it's still just 2×29FPS. So 29 actual different frames per second, but no screen tear and no added VSYNC lag.)
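A little sketch of that LFC behavior, under the 2.5x-ratio rule described above (the function name and structure are mine):

```python
def lfc_refresh_hz(fps, vrr_min=30, vrr_max=75):
    """Pick a display refresh rate using low framerate compensation (LFC).

    LFC is only available when the max/min ratio of the VRR range is at
    least 2.5x; below the floor, the same frame is repeated at an integer
    multiple of the frame rate to stay inside the range.
    """
    if vrr_max / vrr_min < 2.5:
        return None  # no LFC: below vrr_min you fall back to VSYNC ON/OFF
    if fps >= vrr_min:
        return min(fps, vrr_max)
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier  # each frame is simply shown `multiplier` times

print(lfc_refresh_hz(29))                           # 58: the example above
print(lfc_refresh_hz(29, vrr_min=48, vrr_max=100))  # None: 100/48 is under 2.5x
```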
Not a huge issue if we have all the facts and can make a fully informed decision, but we did not get this information.
I did try to edit my “lag” comment out but I could no longer edit; however, there should still have been observable screen tearing when out of the adaptive sync range, so it's hard to understand how the systems appeared the same to, I guess, 6/10 people.
You also mention VSYNC, which probably wasn't enabled, and AMD's frame-time variability being solved? There's minimal frame-time variability when in Freesync mode; in fact, every frame is a different length, but since the monitor is slaved to the GPU it's not much of a problem. So that probably doesn't apply here.
And yes, I do get upset when any company does cherry-picked testing meant to mislead. I am NOT a fanboy though, and probably will be recommending VEGA if I feel it is the best overall value.
I just call BS when I smell it, and I smell it here. If I’m wrong then fine; I’m okay with that too.
Jeremy,
Please try to investigate the issues raised in my comment above. When watching the video, some people talked about it being “easier to track” and “snappier” on the AMD system; I've got a strong feeling that's because the AMD system was in the Freesync range and the NVIDIA system was not in the GSYNC range.
I've not tested it myself as I don't have a GTX1080Ti, but I've seen other tests based on ULTRA settings at 3440×1440 and made my own estimates, which keep coming back to this test being completely unfair, again due to one system being mostly in the adaptive sync range and the other not.
How HARDOCP could not know this baffles me, so are they misleading, incompetent, or did I fail to understand some aspect of the testing?
I wasn't part of their subjective test so I never laid eyes on what they describe.
I'd refer you back to the first sentence in the review … "Totally unscientific and subjective testing is scorned by many, so if that gets your panties in a bunch, I suggest you stop reading."