New Generation, New Founders Edition
Performance for Turing Unveiled!
At this point, it seems that calling NVIDIA's 20-series GPUs highly anticipated would be a bit of an understatement. Between months and months of speculation about what these new GPUs would be called, what architecture they would be based off, and what features they would bring, the NVIDIA GeForce RTX 2080 and RTX 2080 Ti were officially unveiled in August, alongside the Turing architecture.
We've already posted our deep dive into the Turing architecture and the TU102 and TU104 GPUs powering these new graphics cards, but here's the short takeaway: Turing provides efficiency improvements in both memory and shader performance, and adds specialized hardware to accelerate deep learning (Tensor cores) and enable real-time ray tracing (RT cores).
| | RTX 2080 Ti | Quadro RTX 6000 | GTX 1080 Ti | RTX 2080 | Quadro RTX 5000 | GTX 1080 | TITAN V | RX Vega 64 (Air) |
|---|---|---|---|---|---|---|---|---|
| GPU | TU102 | TU102 | GP102 | TU104 | TU104 | GP104 | GV100 | Vega 64 |
| GPU Cores | 4352 | 4608 | 3584 | 2944 | 3072 | 2560 | 5120 | 4096 |
| Base Clock | 1350 MHz | 1455 MHz | 1480 MHz | 1515 MHz | 1620 MHz | 1607 MHz | 1200 MHz | 1247 MHz |
| Boost Clock | 1545 MHz (1635 MHz FE) | 1770 MHz | 1582 MHz | 1710 MHz (1800 MHz FE) | 1820 MHz | 1733 MHz | 1455 MHz | 1546 MHz |
| Texture Units | 272 | 288 | 224 | 184 | 192 | 160 | 320 | 256 |
| ROP Units | 88 | 96 | 88 | 64 | 64 | 64 | 96 | 64 |
| Tensor Cores | 544 | 576 | — | 368 | 384 | — | 640 | — |
| Ray Tracing Speed | 10 GRays/s | 10 GRays/s | — | 8 GRays/s | 8 GRays/s | — | — | — |
| Memory | 11GB | 24GB | 11GB | 8GB | 16GB | 8GB | 12GB | 8GB |
| Memory Clock | 14000 MHz | 14000 MHz | 11000 MHz | 14000 MHz | 14000 MHz | 10000 MHz | 1700 MHz | 1890 MHz |
| Memory Interface | 352-bit G6 | 384-bit G6 | 352-bit G5X | 256-bit G6 | 256-bit G6 | 256-bit G5X | 3072-bit HBM2 | 2048-bit HBM2 |
| Memory Bandwidth | 616 GB/s | 672 GB/s | 484 GB/s | 448 GB/s | 448 GB/s | 320 GB/s | 653 GB/s | 484 GB/s |
| TDP | 250 W (260 W FE) | 260 W | 250 W | 215 W (225 W FE) | 230 W | 180 W | 250 W | 292 W |
| Peak Compute (FP32) | 13.4 TFLOPS (14.2 TFLOPS FE) | 16.3 TFLOPS | 10.6 TFLOPS | 10 TFLOPS (10.6 TFLOPS FE) | 11.2 TFLOPS | 8.2 TFLOPS | 14.9 TFLOPS | 13.7 TFLOPS |
| Transistor Count | 18.6 B | 18.6 B | 12.0 B | 13.6 B | 13.6 B | 7.2 B | 21.0 B | 12.5 B |
| Process Tech | 12nm | 12nm | 16nm | 12nm | 12nm | 16nm | 12nm | 14nm |
| MSRP (current) | $1,000 ($1,200 FE) | $6,300 | $699 | $700 ($800 FE) | $2,300 | $549 | $2,999 | $499 |
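The memory bandwidth and peak compute figures in the table follow directly from the other rows: bandwidth is the effective data rate times the bus width, and peak FP32 throughput is cores × 2 FLOPs (one fused multiply-add) × boost clock. A quick sketch, using values from the table:

```python
def mem_bandwidth_gbs(data_rate_mts, bus_width_bits):
    # effective data rate (MT/s) * bus width (bytes), reported in GB/s
    return data_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

def peak_fp32_tflops(cores, boost_mhz):
    # 2 FLOPs per core per clock (one fused multiply-add)
    return cores * 2 * boost_mhz * 1e6 / 1e12

# RTX 2080 Ti: 14000 MT/s effective on a 352-bit bus
print(round(mem_bandwidth_gbs(14000, 352)))    # 616 GB/s
# RTX 2080 Ti FE: 4352 cores at a 1635 MHz boost clock
print(round(peak_fp32_tflops(4352, 1635), 1))  # 14.2 TFLOPS
```

The same arithmetic reproduces the rest of the table, e.g. the Quadro RTX 6000's 672 GB/s on its 384-bit bus.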
Unusually for NVIDIA, the company has decided to release both the RTX 2080 and the RTX 2080 Ti at the same time, as the first products in the Turing family.
The TU102-based RTX 2080 Ti features 4352 CUDA cores, while the TU104-based RTX 2080 features 2944, fewer than the GTX 1080 Ti. These new RTX GPUs have also moved to GDDR6 from the GDDR5X found on the GTX 10-series.
One of the most significant departures with the RTX 2080 and RTX 2080 Ti can be found in the newly redesigned NVIDIA Founders Edition products.
Finally moving away from the blower-style cooler, the Founders Edition cards now feature a dual axial fan design, a vapor chamber running the full length of the card, and a substantially redesigned look.
Taking advantage of this newfound cooling capacity, the Founders Edition cards will also ship factory overclocked for the first time, with a 90 MHz boost clock increase on both SKUs.
Since NVIDIA is generally the only vendor that ships graphics cards at reference clock speeds, it's difficult to predict whether we'll see any RTX cards clocked lower than the Founders Editions, but NVIDIA is claiming that these cards are in fact "overclocked."
Another new feature of the Founders Edition cards, and of all the third-party RTX cards we've seen so far, is a USB-C connector. This connector is VirtualLink compliant, a standard developed in coordination with NVIDIA, Oculus, Valve, Microsoft, and AMD to provide a one-cable solution for next-generation VR headsets.
The VirtualLink port is capable of providing four lanes of HBR3 DisplayPort, USB 3.1 Gen 2 connectivity, and 27 W of power delivery.
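For context on what those four lanes can carry (the figures here come from the DisplayPort spec, not from NVIDIA's announcement): HBR3 runs at 8.1 Gbit/s per lane, and DisplayPort 1.3/1.4 uses 8b/10b encoding, so the usable payload works out to roughly:

```python
HBR3_GBPS_PER_LANE = 8.1  # raw line rate per lane (DisplayPort 1.3/1.4)
LANES = 4

raw = HBR3_GBPS_PER_LANE * LANES  # raw link rate in Gbit/s
payload = raw * 8 / 10            # after 8b/10b encoding overhead

print(round(raw, 1), round(payload, 2))  # 32.4 25.92
```

That ~25.9 Gbit/s of video payload is what lets a single VirtualLink cable drive the high-resolution, high-refresh panels next-generation headsets are expected to use.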
The RTX 2080 features an 8-pin and a 6-pin power connector, while the RTX 2080 Ti moves to two 8-pin connectors.
The other funky new connector on the RTX cards is NVLink. This communication protocol, previously seen in NVIDIA's Tesla and Quadro products, provides a high-bandwidth replacement for SLI and will enable resolutions up to 8K (single link on the RTX 2080) and 8K Surround (dual link on the RTX 2080 Ti). However, NVLink supports only two GPUs, officially ending the days of 3- and 4-way SLI, even if Pascal only supported those configurations in select benchmarks.
Unlike the launch of the Pascal-based GTX 10-series cards, graphics card designs from third-party manufacturers such as MSI, EVGA, and ASUS will be ready and shipping on the same launch date as the Founders Edition (September 20th).
Here's just a small taste of what's in store for the coming weeks as we take a look at these new third-party designs.
Review Terms and Disclosure | All Information as of the Date of Publication |
---|---|
How product was obtained: | The product is on loan from NVIDIA for the purpose of this review. |
What happens to the product after review: | The product remains the property of NVIDIA but is on extended loan for future testing and product comparisons. |
Company involvement: | NVIDIA had no control over the content of the review and was not consulted prior to publication. |
PC Perspective Compensation: | Neither PC Perspective nor any of its staff were paid or compensated in any way by NVIDIA for this review. |
Advertising Disclosure: | NVIDIA has purchased advertising at PC Perspective during the past twelve months. |
Affiliate links: | This article contains affiliate links to online retailers. PC Perspective may receive compensation for purchases through those links. |
Consulting Disclosure: | NVIDIA is not a current client of Shrout Research for products or services related to this review. |
So basically, skip it and wait for 30-series?
If you’ve got a fairly modern card already (ie. 1080 or Vega64) then yeah pretty much.
Unless you’re loaded and don’t care about money at all then OK do whatever you want. For anyone else who works for a living a 2080 or 2080Ti is a real tough sell.
Insane pricing aside, it seems like ray-tracing is not worth the performance hit. I’m surprised to see the 2080 actually beaten by the 1080Ti in a few cases. I’m sure that some will chalk it up to drivers, etc., but it doesn’t inspire confidence. That 2080Ti price is just absurd though.
This card teases ray-tracing really. I believe the card has to run at 1080p resolution just to get average at best framerates. I’m personally going to wait till the next round where these features will start to become more mainstream, more efficient and run at higher resolutions.
You’re wrong on that, bud. Check other reviews for 4k Ray tracing.
I’d say if you are wanting ray tracing at 1440p or 4K at playable framerates, then most definitely skip this gen, and possibly even the next gen.
Honestly, if you don’t care about ray tracing but can get a higher-tier card, then get yourself a VRR monitor and a compatible GPU and call it a day. IMO, VRR makes much more of a difference anyway.
Does this card support VRR? I know it supports G-Sync, but VRR is a very specific HDMI 2.1 feature.
I’ll wait a lot longer. Since before the 1080 ti my last card purchase was a 460.
Is DLSS supposed to look similar to TAA?
DLSS is actually supposed to match the visual quality of 64x SSAA. We haven't had a whole lot of time (or software) to dig into this with, but it's something we plan on focusing on soon!
Okay good, that’s what I originally thought. If a big game somebody loves gets this AA and it works as advertised, then that alone could be enough to warrant upgrading. But that remains to be seen.
Have we stopped doing video reviews for launches now? Or do we just get the roundup in the podcast? I kinda liked those.
I’m really waiting for The Verge’s review, they seem to really know what they are talking about.
Isn’t “the Verge” the same media outlet that did that insanely bad pc build guide? And then proceeded to block people for criticizing their so-called guide?
Yes
Yep, that’s why he sarcastically said that.
It’s a terrible tech blog that has always been superficial and often facetious, the epitome of which was that video. Their owner, Vox Media, owns multiple equally superficial and generally terrible blogs like Polygon.
Best comment of the day!
Running 970 and will wait for 2060 series of cards to be released and finally upgrading out of 900 series of cards.
No 1440p results?
It’s interesting to see the reviews for this popping up. I think Linus summed it up with “It seems rushed.”
At the end of the day, they are betting the farm on real-time ray tracing, and now I wonder how much seed money they will/already have spread to every corner of the game world to keep said farm.
Question: this ray tracing, is it proprietary or is it an open standard? I’ve read plenty on how it works and why it’s good, but I’m unaware if it’s a possible future industry standard or a possible future monopoly.
“Question, this ray tracing, is it proprietary or is it open standard?” Accessible via DXR (DirectX) or Vulkan RT (Vulkan).
I’m going to skip this series of cards. Maybe the 3000 series will offer something worth the cost of the upgrade.
Have fun with 60% of the speed of the fastest graphics card on the market for the next two years.
I can’t speak for him, but I’m having plenty of fun with my 970. I can wait for the 30-series and Navi (both within the next year) before making my decision.
Does SLI scale better with NVLink than with the old bridge? Are AIB cards or the thinner Founders Edition less hot in SLI? Useless to me with no SLI data.
GTX 980 Ti: MSRP $649
Card released next generation with similar performance:
GTX 1070: MSRP $379, FE card $449

Now:
GTX 1080 Ti: MSRP $699 (only price listed on wiki)
Card released next generation with similar performance:
RTX 2080: MSRP $699, FE card $799

That’s what I am seeing here. That’s quite the early adopter tax for ray tracing “coming soon”.
Prices taken from wiki.
This was like a pin in my hype balloon…
I guess I’ll be skipping this gen.
Come on Nvidia, this feels like a pre-order for a product for which most of the good features are “Coming Soon to an RTX card near you”.
Does the Titan V support DLSS?
The price of the 2080 fell to $749 at Micro Center for the MSI FE.
And $649 for a non-reference 1080 Ti by Gigabyte.
Newegg has a Vega 64 selling for $499.
FYI, don’t overpay!
Nvidia’s got some great new technology, just like AMD’s Vega has that Explicit Primitive Shader IP/etc. The difference being that Nvidia has the billions of dollars extra to provide direct games/gaming engine development support so that Nvidia’s new Ray Tracing/AI-Tensor-Core IP will get adopted and used in games.
AMD bit off more than they could financially chew in trying to create that Implicit Primitive Shader legacy gaming code conversion process in order to, on the fly, allow legacy games to make use of Vega’s in-GPU hardware Explicit Primitive Shaders IP.
Nvidia is not even trying to do any sorts of Implicit conversion of any legacy games on the fly so that legacy games can make use of any of Turing’s RTX Ray Tracing/Tensor core AI IP. And that’s because even Nvidia would not succeed with that complex of a software task. So Nvidia is just helping the games developers explicitly target Nvidia’s New Ray Tracing and AI/Tensor Core IP for new games.
AMD focused too much time and resources on that Implicit Primitive Shaders coding/conversion task’s software engineering for legacy games, and AMD should have instead spent the money to assist the games developers in explicitly targeting Vega’s Explicit Primitive Shader hardware IP, and forgotten about legacy games.
So now things Primitive Shader related will not be ready until Navi is to market, as far as targeting the Explicit Primitive Shaders IP that will also be in Navi’s GPU micro-arch. Hopefully once AMD gets the Explicit Primitive Shader IP support into the graphics APIs/its GPU drivers, then that support should be back-portable to Vega. AMD is still going to have to support the games/gaming engine developers directly if AMD wants its GPU features utilized by games.
The best GPU hardware features are useless, no matter the GPU maker involved, unless there is active support from that GPU’s maker to assist the games/gaming engine makers and the graphics API makers in adopting any new GPU hardware based IP. Nvidia will spend the billions necessary because Nvidia has the billions to get the job done. AMD has great technology inside Vega but AMD lacks the billions at the moment to get Vega’s IP fully utilized as quickly as Nvidia can. No one’s software/firmware engineers work for free, and a lot of those folks can command six-figure salaries or they can easily go and work some place that will pay six figures+.
I can see the Console Games makers being the first to make use of AMD’s Explicit Primitive Shader IP but that will be done Via Microsoft’s and Sony’s/Others’ dime more than AMD’s. Nvidia’s got some new Shader IP also in Turing so that’s kind of similar to what AMD’s Primitive Shader IP does. But software engineers don’t work for free so expect those costs to be passed on to the consumer.
And still to this day Joe Gamer does not know the difference between the Implicit kind of support, which is done via a software/middleware conversion process on the fly and via graphics APIs, and the Explicit kind of directly-in-the-hardware support that is supported by the drivers and the graphics APIs.
Joe Gamer is even whining about Nvidia’s Ray Tracing/AI IP as much as Joe Gamer whines about AMD’s Explicit primitive Shaders. And there is only one solution and That’s for the GPU makers to make the gamers pay more for all the extra costs involved for the GPU Makers in supporting the games/gaming engine makers to tweak their games.
Nvidia increasing its GPUs MSRP/ASPs will help AMD in the long run also. And there are still plenty of older generation GPUs available from both Nvidia and AMD. So if gamers just want to Increase their FPS metrics only and do not care about any new IP, well gamers can get Dual GPUs and game that way with both Vega and Pascal based GPUs and get the extra FPS metrics to brag to Vern about!
Really Nvidia is advancing Gaming for everyone with that new RT Cores/AI Cores IP the very same way that AMD is for Explicit Primitive Shaders that will be of use in games. It always takes time and some folks will always have the funds while others will not. So some will have to wait for the price to come down and game a generation behind, and there really is nothing wrong with that.
Is there any info on which USB controller the card is using?
It's internal on the GPU, I think using the same USB logic they integrated into Tegra.
Thanks!
Curious about the behaviour with VR headsets since they are known to be picky. Of course future headsets with this connector will be tested on NV cards, so not likely to be a problem.
Also, is it possible to plug standard USB storage there?
How is RT any different from modified T&L? I did not see a difference in the demo.
Summary:
2080 = 1080 Ti with ray tracing.
2080 Ti is incredibly fast, but not priced for mere mortals.
7nm stuff next year will be really fast.
Awesome review, PCPer.
Happy with my 1080 Ti at 2139/5800 under water.
Superposition benchmarks:
1080p Extreme: 6,400
4K Optimized: 10,400
There is no frickin way this deserves a Gold Award. This literally is the most forced, lame Gold Award I have seen in the history of this site.
It is a Gold Award because we probably should... or repercussions.
Sure, but does anyone take what award something gets into consideration, when making a purchasing decision? I don’t.
But people who dole out repercussions at Nvidia probably only care about the conclusions page and a shiny badge puts them at ease.
Definitely a Silver for the 2080Ti, possibly Bronze.
Pro:
2080Ti is the fastest gaming card. Pure power. In a vacuum, that would be worth a platinum award. However…
Cons:
Exorbitant price – laughable even
Louder
Hotter
More power hungry
Limited OC demonstrated
No DLSS support at launch
No Ray-tracing support at launch
The 2080 barely deserves an honorable mention. Sub-$500 1080Ti are showing up in the used market already. Once the 30-series comes out, there’ll be a firesale on any remaining 10-series in the wild and the 20-series will be like a fart on the wind.
Out of curiosity, what GPU do you feel is the fastest available consumer graphics card? That is the verbatim reason the award was given.
Your personal "Gold Award" may have different criteria; we've specified why it gets ours, just like various versions of Titans have in the past.
Agreed. Just because it’s still technically the “fastest” doesn’t mean you automatically have to give the product the highest rating. I think we’ve gotten so accustomed to Nvidia leading the charts with minimal effort that it’s just become automatic: yawn, slap a sticker on it.
The price-to-performance ratio is really bad for a next-gen card. It could also be showing that GPUs are finally starting to hit the CPU/transistor brick wall, which was bound to happen sooner or later.
@Cellar Door, agreed.
“For being the fastest available consumer graphics card, the NVIDIA GeForce RTX 2080 Ti receives the PC Perspective Gold Award. While this GPU is quite pricey at it’s $1200 price tag, the performance benefits over the RTX 2080, and previous GTX 10-series GPUs should provide an excellent PC gaming experience for years to come.”
For years to come? With <60fps @ 1080p for RTX, that's only the case with current raster games. Let's not forget DLSS uses low res textures better called a blurfest, quincunx anyone? DLSSx2 has no perf gains over TAA. It's definitely useful to get into dev hands, though. 7nm Ampere will perform where Turing is constrained (even with the large die sizes). Prices won't fall with 7nm, unfortunately...
The £400 I recently paid for my 1080Ti is looking really good value right now, for 1440p/165Hz monitor it’s absolutely perfect. I have no need for HDR or Ray Tracing so the pricing of the new cards is just over the top for the technology that I’d likely never switch on in the first place.