NVIDIA has published an article about GPU performance and its impact on gaming, specifically the ultra-popular battle royale variety. The emphasis is on latency, and how to reduce it by pairing high FPS numbers with a 144 Hz (or higher) refresh rate display. Many of these concepts may seem obvious (competitive gaming on CRTs and/or at lower resolutions for max performance comes to mind), but there are plenty of slides to look over – with many more over at NVIDIA’s post.
"For many years, esports pros have tuned their hardware for ultra-high frame rates — 144 or even 240 FPS — and they pair their hardware with high refresh rate monitors. In fact, ProSettings.net and ProSettings.com report that 99% of Battle Royale Pros (Fortnite,PUBG and Apex Legends) are using 144 Hz monitors or above, and 30% are using 240 Hz monitors. This is because when you run a game, an intricate process occurs in your PC from the time you press the keyboard or move the mouse, to the time you see an updated frame on the screen. They refer to this time period as ‘latency’, and the lower your latency the better your response times will be."
While a GTX 750 Ti to RTX 2080 comparison defies explanation, latency obviously drops with performance in this example
"Working with pros through NVIDIA’s research team and Esports Studio, we have seen the benefits of high FPS and refresh rates play out in directed aiming and perception tests. In blind A/B tests, pros in our labs have been able to consistently discern and see benefits from even a handful of milliseconds of reduced latency.
But what do higher frame rates and lower latency mean for your competitiveness in Battle Royale? A few things:
- Higher FPS means that you see the next frame more quickly and can respond to it
- Higher FPS on a high Hz monitor makes the image appear smoother, and moving targets easier to aim at. You are also less likely to see microstutters or “skip pixels” from one frame to the next as you pan your crosshair across the screen
- Higher FPS combined with G-SYNC technologies like Ultra Low Motion Blur (ULMB) makes objects and text sharper and easier to comprehend in fast moving scenes
This is why for Battle Royale games, which rely heavily on reaction times and your ability to spot an enemy quickly, you want to play at 144 FPS or more."
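To put rough numbers on the "higher FPS means you see the next frame more quickly" point, here is a minimal frame-time sketch (simple arithmetic, not figures from NVIDIA's article; real end-to-end latency also includes input sampling, game simulation, render queueing and display scan-out on top of this):

```python
# Illustrative frame-time arithmetic only; real end-to-end latency also
# includes input sampling, game simulation, render queueing and scan-out.

def frame_time_ms(fps: float) -> float:
    """Time between successive frames, in milliseconds, at a given FPS."""
    return 1000.0 / fps

for fps in (60, 144, 240):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):4.1f} ms per frame")

# Expected output:
#  60 FPS -> 16.7 ms per frame
# 144 FPS ->  6.9 ms per frame
# 240 FPS ->  4.2 ms per frame
```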
One of the more interesting aspects of this article relates to K/D ratios, with NVIDIA claiming an edge in this area based on GPU performance and monitor refresh rate:
"We were curious to understand how hardware and frame rates affect overall competitiveness in Battle Royale games for everyday gamers – while better hardware can’t replace practice and training, it should assist gamers in getting closer to their maximum potential."
"One of the common metrics of player performance in Battle Royales is kill-to-death (K/D) ratio — how many times you killed another player divided by how many times another player killed you. Using anonymized GeForce Experience Highlights data on K/D events for PUBG and Fortnite, we found some interesting insights on player performance and wanted to share this information with the community."
For more on this topic, and many more charts, check out the article over at NVIDIA.com.
TeeHee –
Want to be a better player? Get better hardware, and oh yeah, take a look at our better hardware and some graphs to show you how much better it is than your older hardware.
TeeHee
Aren’t they rather shooting themselves in the foot with this, what with the performance hit that comes with enabling DXR? They seem to basically be saying don’t enable DXR because it will affect your K/D ratio.
Between this and the Logitech article, it must be a slow news day. I suppose there’s not much else to do except sub in some corporate promotions for nVidia (Look at our charts and graphs! Buy our graphics cards!) and Logitech (We’re “charitable”! Please like us!).
Wow, if Nvidia wants to sell more RTX GPUs then Nvidia needs to lower something else besides latency. And high refresh rate G-Sync monitors are pricey as well, which just adds to the cost of Nvidia gaming.
That glut of GPUs clogging the channels is not going to be cleared up unless the Nvidia channel partners come through with the discounts. It’s more about the K/D ratios on folks’ wallets, and really, many are tired of the too-high GPU pricing. The miners are not around, once again, with that motherlode of coin madness played out for the time being.
It sure looks like the next few business quarters in GPU land are not going to be anything the shareholders will be pleased with. But the used GPU market is sure taking to some heavy discounting, with some of the first Google hits pointing to miners putting their GTX SKUs up for sale at half of MSRP or less.
When demand dries up, prices must fall, and fall even further, to entice more demand!
LOL okay, yep, IF i buy an expensive, more powerful video card, and IF i buy an expensive, higher refresh rate monitor… then my dollars spent will magically transform into skill?
well, i’ll be a monkey’s uncle, better head out to Walmart and pick me up one of them thar new gamin’ ‘puters so i can git gud, hyuck!
seriously?
they expect educated people to believe this?
LMAO!
Don’t forget that money is Bruce WAYNE’s superpower! L O L
It’s not that it directly translates into skill, it’s that it raises the ceiling on your potential, which is actually legitimate.
“while better hardware can’t replace practice and training, it should assist gamers in getting closer to their maximum potential.”
If you had actually read any of the information, you would realize that your characterization is a complete strawman.
What nVidia is claiming is completely reasonable, and everybody knows it. Over some range, having better hardware is going to offer a significant increase in a player’s ability to play the game. At the higher end there will be diminishing returns.
The biggest factor that they show isn’t from a better GPU, but from a monitor with increased refresh rate. If you don’t think a professional gamer would see an improvement moving from 60Hz to 144-240Hz then you are just wrong.
“If you don’t think a professional gamer would see an improvement moving from 60Hz to 144-240Hz then you are just wrong.”
It’s just funny, since LCD technology can barely display 40 images per second (i.e. black-to-black response), and whatever numbers screen manufacturers print in their commercial sh!ts won’t change the laws of physics!
Adaptive sync displays also add latency. On my old Asus VG27H at 120 Hz (1080p, no G-Sync, TN) with a GTX 1070, I was a better marksman than I am now playing the same games, like Crysis 3 multiplayer, on my current Asus PG348Q at 100 Hz (G-Sync, 3440x1440, IPS) with a 1080 Ti FTW3.
When you watch the killcam of your opponent you can clearly see that the opponent sees you way before you see them, even when I set my settings to get 150 fps. My connection is also a Gigabit FiOS connection, which rules out any network latency, at least on my end.
The 20 Hz difference, going from 120 Hz down to 100 Hz, is probably also a likely culprit.
If you want to be competitive in multiplayer shooters, I would strictly suggest the 1080p 240 Hz displays and the hardware to run them. You definitely have to consider all your hardware, not just the graphics card!
Update:
Don’t forget that even the type of panel can save time on latency and input lag (TN vs. IPS vs. OLED vs. VA makes a difference).
I’d love to see a study where they take a current high-performance e-sports participant and do a series of blind tests on different hardware to see what the actual performance impact is. Something never mentioned is that people’s internal hand-eye latency is measured in hundreds of milliseconds (typically 250-350 ms, see http://www.academia.edu/download/42108529/Reaction_time_latencies_of_eye_and_hand_20160204-30232-1rcu7rv.pdf), which completely swamps the latencies due to frame rate, USB sample rates and monitor image processing time.
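For scale, here is a quick sketch of that "swamping" argument (assumed numbers: a 250 ms reaction time and one frame of display delay at each refresh rate; this is illustrative arithmetic, not data from the linked paper):

```python
# Rough comparison of human reaction time to the frame-time difference
# between a 60 Hz and a 240 Hz pipeline (assumed, simplified numbers).

reaction_time_ms = 250.0                    # typical hand-eye reaction time
frame_60hz_ms = 1000.0 / 60                 # ~16.7 ms per frame
frame_240hz_ms = 1000.0 / 240               # ~4.2 ms per frame
saved_ms = frame_60hz_ms - frame_240hz_ms   # ~12.5 ms

print(f"Frame-time saving, 60 Hz -> 240 Hz: {saved_ms:.1f} ms")
print(f"Share of a 250 ms reaction: {saved_ms / reaction_time_ms:.0%}")
# ~12.5 ms, or about 5% of the total reaction time.
```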
Link broke, see: http://www.academia.edu/download/42108529/Reaction_time_latencies_of_eye_and_hand_20160204-30232-1rcu7rv.pdf
And I give up; just Google hand-eye latency.
Something for you to consider is that even if your hand-eye latency is that high, you will still notice additional latency. Consider using VSYNC ON vs VSYNC OFF @ 60Hz. VSYNC ON adds 1 to 2 extra frames of input delay, which is roughly 16.7ms (at 1 frame). Most people can immediately notice this added delay with mouse movement.
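In concrete numbers, a minimal sketch of that figure, assuming a 60 Hz display and the 1-2 queued frames the comment describes (actual VSYNC behavior varies by game and driver):

```python
# Extra input delay from VSYNC ON, modeled simply as 1-2 queued frames
# at a 60 Hz refresh rate (actual behavior varies by game and driver).

refresh_hz = 60
frame_ms = 1000.0 / refresh_hz  # ~16.7 ms per refresh

for extra_frames in (1, 2):
    added_ms = extra_frames * frame_ms
    print(f"VSYNC ON, +{extra_frames} frame(s): ~{added_ms:.1f} ms added delay")

# +1 frame  -> ~16.7 ms
# +2 frames -> ~33.3 ms
```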
Oh, FFS, graphic designers, you can’t arbitrarily capitalize things.
Milliseconds are denoted by ‘ms’. ‘MS’ means ‘megasiemens’.