NVIDIA RTX In Action – Ray Tracing Demos and FFXV DLSS
Given the RTX naming scheme of these new cards, it's evident that much of the focus of this launch rests on NVIDIA's new RTX software features.
For those of you who haven't yet heard of RTX, you can give our architecture article a read, but essentially these new software SDKs provide a way to utilize the deep learning capabilities of the Tensor cores found in Volta, as well as the RT cores capable of real-time ray tracing.
While no shipping games have either the deep learning-powered DLSS anti-aliasing technology or real-time ray tracing enabled, NVIDIA provided a few demo applications to test these technologies.
Final Fantasy XV Benchmark
For DLSS testing, NVIDIA and Square Enix provided an updated version of the Final Fantasy XV benchmark. We were unable to set any custom settings; both benchmark runs were at 4K, with the only image quality difference being TAA, the game's default anti-aliasing option, versus the new DLSS.
Using our capture equipment to test a 60-second portion of the beginning of the benchmark run, we recorded the average, 95th percentile, and 99th percentile frame rates with DLSS enabled and with TAA enabled.
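For those curious about how those figures are derived: the percentile numbers come from per-frame render times recorded by the capture hardware, not from an in-game counter. Below is a rough, hypothetical sketch of the math involved (not our actual capture pipeline); the frame time values are made up purely for illustration.

```cpp
// Hypothetical illustration of how average and percentile frame rates can be
// derived from per-frame render times (in milliseconds), the kind of data a
// frame-capture tool produces. Not our actual capture pipeline.
#include <algorithm>
#include <cstdio>
#include <vector>

// The Nth-percentile frame rate is the FPS equivalent of the frame time
// that N% of captured frames complete at or under.
double percentile_fps(std::vector<double> frame_ms, double pct)
{
    std::sort(frame_ms.begin(), frame_ms.end());
    size_t idx = static_cast<size_t>(pct / 100.0 * (frame_ms.size() - 1));
    return 1000.0 / frame_ms[idx];
}

int main()
{
    // Made-up frame times for a short capture window.
    std::vector<double> frame_ms = {16.2, 17.1, 15.8, 24.5, 16.6, 18.0, 33.1, 16.4};

    double total_ms = 0.0;
    for (double ms : frame_ms) total_ms += ms;
    double avg_fps = 1000.0 * frame_ms.size() / total_ms;

    std::printf("average:          %.1f FPS\n", avg_fps);
    std::printf("95th percentile:  %.1f FPS\n", percentile_fps(frame_ms, 95.0));
    std::printf("99th percentile:  %.1f FPS\n", percentile_fps(frame_ms, 99.0));
    return 0;
}
```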
With DLSS enabled, we see a 36% gain in average frame rate over TAA on the RTX 2080 Ti.
It's nice to see DLSS working in the real world, but we would really like to get our hands on it in a scenario that is actually gameplay-driven rather than a benchmark that never changes. Regardless, these performance results look promising for whenever DLSS-enabled games do ship.
Star Wars Reflection Demo
On the ray tracing side, NVIDIA and ILMxLAB provided us with a version of the Star Wars Reflections demo seen at GDC and the RTX launch event.
The video above was rendered in real time on our RTX 2080 Ti, with the default frame rate limit of 24 FPS disabled. Please note that the demo does keep V-Sync enabled, despite the uncapped rendering frame rate.
3DMark Ray Tracing Preview
Another ray tracing demo we got a look at came from UL, makers of 3DMark. This demo is meant to be a preview of their upcoming benchmark that will make use of Microsoft's DirectX Raytracing (DXR) API. It is still an early preview, but we are always delighted to see UL step up to the plate and develop benchmarks for new technologies.
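The benchmark itself isn't available yet, but for the curious, here is a minimal sketch (our own example, not UL's code) of how a Direct3D 12 application can ask whether the installed GPU and driver expose DXR at all. It assumes a Windows 10 SDK from the October 2018 update or newer and linking against d3d12.lib.

```cpp
// Minimal example: query Direct3D 12 for DXR (raytracing) support.
// Assumes a Windows 10 SDK (1809 or newer) and linking against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter; fails if D3D12 is unavailable.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No Direct3D 12 device available.\n");
        return 1;
    }
    std::printf("DXR supported: %s\n", SupportsDXR(device.Get()) ? "yes" : "no");
    return 0;
}
```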
So why did you not show us power consumption figures -AFTER- you overclocked the cards? From what I'm reading and understanding here… your displayed power consumption figures are only at stock with no overclocking applied? How can we know how much power it uses when overclocked if you won't show us? And we have to rely on you, pcper.com, to show us because literally -NO ONE ELSE ON THE ENTIRE INTERNET- is showing any power consumption figures at all for the RTX 2080 Ti. So please, do show us how much it uses after being overclocked.
I see that Gold Award but it sounds like a hard pass to me. Until prices drop, I don’t see why anyone would buy these cards.
This’s still confusing, so if
This’s still confusing, so if I’m coming from a GTX770 as a productivity not gaming user (architecture student), would this or the 1080ti make sense? like does anyone know if the ray tracing thing is gonna reflect into rendering software or does that seem like a far outcome and software developer dependant…I don’t really have the funds to drop on a buzzword tech and not actually be future proofing for anything.
You can check Puget Systems for some nice charts comparing rendering times across cards for a variety of software you likely use. If the 1080 Ti holds to be similar to the 2080 (ray tracing notwithstanding), you might expect "similar" performance. It seems some games do reasonably better on the 2080 so far, while several are improved only 5-7%. We have a lot left to learn through testing, including why some are not necessarily improved.
So, your $ might be better spent on the 1080 Ti, at least with initial tests just starting to roll out.
2080Ti (still waiting) ?/*
I think I should have kept my NVIDIA Titan Xp; it got 17,600 on PassMark, and even more without a RAID0 volume. But CPU and disk always come first for speed, no doubt.
The scores posted on that website look like what you get from 8 PCIe lanes, not 16 PCIe lanes. It is confusing; they always make the new devices look better, but are they?
Just 17,600 vs. 14,800 right now (14,800 on PassMark), and it looks even worse without the RAID0 volume: 18,000 vs. their 14,800.
What are you supposed to believe for the 2080 and 2080 Ti, exactly?
I understand that there is software that improves video viewing through an NVIDIA video card:
https://developer.nvidia.com/rtx/ngx
1. It isn't really explained on the site. How does it work?
Is it software for real-time movie viewing, or for film production?
2. Suppose I install the SDK, how do I run the software? Is it a driver that will give me high-quality video?
How good is that quality?
3. What video card do I need to get the best video quality?
4. Do I need a special movie player? Does it work with YouTube?
5. "DLSS: What Does It Mean for Game Developers?"
Is this software for creating games? It doesn't seem suitable for those who want to improve videos (in real time).
https://news.developer.nvidia.com/dlss-what-does-it-mean-for-game-developers/
I don't understand this software at all.
Would someone please test the 2080 Ti in Cinebench? I've seen so many tests, but not one Cinebench test.
What would be really nice is a comparison of the 2080 Ti's and the 1080 Ti's performance in Cinebench.
Please?
Why are the cards not sorted by performance?
Vega is always put at the bottom, regardless of whether it beats the 1080…
Is that supposed to be some kind of psychological trick?