Detailed Power Consumption and Overclocking
For power consumption measurements, we are still using the methodology that helped us discover the power draw issues of the RX 480 back in 2016.
Our testing uses a National Instruments data acquisition device to intercept the power delivered through the PCI Express bus (12V and 3.3V) and the ATX power connectors (12V) before it reaches the graphics card, directly measuring power draw.
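Conceptually, each monitored rail yields a stream of voltage and current samples, and total board power is the sum of the per-rail averages of the instantaneous V × I product. The following sketch illustrates that arithmetic; the rail names and sample values are invented for illustration and are not actual capture data from this review.

```python
# Hedged sketch: combining per-rail DAQ samples into a total board power figure.
# Rail names and sample values below are illustrative, not real measurements.

def rail_power(voltage_samples, current_samples):
    """Average power on one rail: mean of instantaneous V * I products."""
    assert len(voltage_samples) == len(current_samples)
    return sum(v * i for v, i in zip(voltage_samples, current_samples)) / len(voltage_samples)

def total_board_power(rails):
    """Sum the average power across every monitored rail."""
    return sum(rail_power(volts, amps) for volts, amps in rails.values())

# Illustrative samples: PCIe slot 12V/3.3V plus two 8-pin ATX connectors.
rails = {
    "pcie_slot_12v": ([12.0, 11.9, 12.1], [4.0, 4.2, 4.1]),
    "pcie_slot_3v3": ([3.3, 3.3, 3.3], [0.1, 0.1, 0.1]),
    "atx_8pin_a":    ([12.1, 12.0, 12.0], [8.0, 8.3, 8.1]),
    "atx_8pin_b":    ([12.0, 12.0, 11.9], [8.2, 8.0, 8.1]),
}
print(round(total_board_power(rails), 1))  # → 244.3 (watts across all rails)
```

Sampling at high frequency and averaging this way is what makes transient spikes visible, which is how short-duration power excursions (like the RX 480's slot-power issue) can be caught where a wall-socket meter would smooth them out.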
On the game side, we are continuing to use Metro: Last Light for power measurement, given its history of being particularly taxing on GPUs in 4K.
Taking a look at total power draw across the lineup, we notice a few interesting things. The RTX 2080 draws ever so slightly less power than the GTX 1080 Ti. Considering the minor performance gap between these two cards, this points to little gained in the way of efficiency moving from Pascal to Turing.
The RTX 2080 Ti hovers around 250W of power usage, about 25W more than the RTX 2080 and GTX 1080 Ti.
Taking a deeper look at the RTX 2080 Ti, we find that NVIDIA is not actually using the 3.3V supply from the PCI Express slot, relying exclusively on 12V from the slot and the additional power connectors.
Overclocking
One of the new software-level features of the RTX GPUs is NVIDIA Scanner. This NVIDIA-developed platform allows for automated overclocking and testing of overclocking settings. While NVIDIA Scanner seems promising and is enabled in the beta version of EVGA's new Precision X1 software, it is still a bit rough around the edges at the moment. Because of this, we decided to take the manual approach to overclocking.
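The manual approach boils down to a simple search: raise the core-clock offset in small steps, run a stress workload at each step, and settle on the highest offset that still passes. A minimal sketch of that loop is below; `run_stress_test` is a hypothetical stand-in for whatever stability workload is used, and the step sizes are arbitrary assumptions.

```python
# Hedged sketch of a manual overclocking search: step the clock offset up
# until the stress test fails, then keep the last passing offset.
# `run_stress_test` is a hypothetical callback, not a real NVIDIA API.

def find_stable_offset(run_stress_test, step_mhz=15, max_offset_mhz=300):
    """Return the highest clock offset (MHz) that still passes the stress test."""
    stable = 0
    offset = step_mhz
    while offset <= max_offset_mhz:
        if not run_stress_test(offset):
            break          # first failure: the previous offset was the limit
        stable = offset
        offset += step_mhz
    return stable

# Usage: pretend the card stays stable up to +160 MHz (roughly what the
# RTX 2080 in this review sustained).
print(find_stable_offset(lambda mhz: mhz <= 160))  # → 150 with 15 MHz steps
```

Tools like NVIDIA Scanner automate essentially this process, substituting an arithmetic-workload error check for the human judgment of "did it crash or artifact."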
RTX 2080 Founders Edition
For the RTX 2080, we were able to achieve an overclock of +160 MHz, which allowed us to hit a sustained clock speed of just over 2000 MHz.
As we can see, clock speeds remain quite stable over the duration of our test in both the stock and overclocked configurations. With our overclock applied, temperatures rise about 5 degrees Celsius from stock to around 75 degrees.
RTX 2080 Ti Founders Edition
Overclocking the RTX 2080 Ti produces a similar frequency result to the RTX 2080. With a +190 MHz clock offset, we were able to achieve stable frequencies in the low 2000 MHz range.
In an overclocked state, the Founders Edition RTX 2080 Ti was a bit hotter than the RTX 2080, at 80 degrees Celsius.
Considering the 90 MHz "stock overclock" of the Founders Edition cards, I am pleasantly surprised at the additional overclocking headroom that seems to be evident in Turing. It remains to be seen if third-party cards will be better at overclocking, or will manage to keep the card cooler while doing so, but we are eager to take a look at how those cards fare soon!
So why did you not show us power consumption figures -AFTER- you overclocked the cards? From what I'm reading and understanding here, your displayed power consumption figures are only at stock, with no overclocking applied? How can we know how much power it uses when overclocked if you won't show us? And we have to rely on you, pcper.com, to show us, because literally -NO ONE ELSE ON THE ENTIRE INTERNET- is showing any power consumption figures at all for the RTX 2080 Ti. So please, do show us how much it uses after being overclocked.
I see that Gold Award but it sounds like a hard pass to me. Until prices drop, I don’t see why anyone would buy these cards.
This is still confusing. If I'm coming from a GTX 770 as a productivity (not gaming) user (architecture student), would this or the 1080 Ti make sense? Does anyone know if the ray tracing thing is going to carry over into rendering software, or does that seem like a distant, developer-dependent outcome? I don't really have the funds to drop on buzzword tech that isn't actually future-proofing for anything.
You can check Puget Systems for some nice charts comparing rendering times across these cards with a variety of software you likely use. If the 1080 Ti holds up as similar to the 2080 (ray tracing notwithstanding), you might expect “similar” performance. It seems some games do reasonably better on the 2080 so far, while several are improved only 5-7%. We have a lot left to learn through testing, including why some are not necessarily improved.
So your money might be better spent on the 1080 Ti, at least with the initial tests just starting to roll out.
2080Ti (still waiting) ?/*
I think I should have kept my NVIDIA Titan Xp; it got 17,600 on Passmark, much more, and that's without a RAID0 volume. But CPU and disk always come first for speed, no doubt.
The scores posted on that website look like what you get from 8 PCIe lanes, not 16 PCIe lanes. It is confusing; they always make the new devices look better, but are they?
Just 17,600 vs. 14,800 right now (14,800 on Passmark), and even worse without the RAID0 volume (18,000 vs. their 14,800).
What are you supposed to believe for the 2080 and 2080 Ti, exactly?
I understand that there is software that improves video viewing through an NVIDIA video card:
https://developer.nvidia.com/rtx/ngx
1. The site doesn't exactly explain how it works. Is it software for real-time movie viewing, or for film production?
2. Suppose I install the SDK, how do I run the software? Is there a driver that will give me high-quality video? How good is this quality?
3. What video card do I need to get the best video quality?
4. Do you need a special movie player? Does it work with YouTube?
5. DLSS: What Does It Mean for Game Developers?
Is this software for creating games? It doesn't seem suitable for those who want to improve videos (in real time).
https://news.developer.nvidia.com/dlss-what-does-it-mean-for-game-developers/
I don't understand this software at all.
Would someone please test the 2080 Ti via Cinebench? I've seen so many tests, but not one Cinebench test.
What would be really nice is a comparison of the 2080 Ti's and the 1080 Ti's performance via Cinebench.
Please?
Why are the cards not sorted by performance?
Vega is always put at the bottom, regardless of whether it beats the 1080…
Is that supposed to be some kind of psychological trick?