Formidable Mid-Range
Overview and initial testing of the latest GeForce GPU
We have to go all the way back to 2015 for NVIDIA's previous graphics card announcement at CES, when the GeForce GTX 960 was revealed during the show four years ago. Today, on the heels of that announcement, we have the latest “mid-range” offering in the tradition of the GeForce x60 (or x060) cards: the RTX 2060. The launch comes as no surprise to those of us following the PC industry, as various rumors and leaks preceded the announcement by weeks and even months, but such is the reality of the modern supply chain (sadly, few things are ever really a surprise anymore).
But there is still plenty of new information available with the official launch, not the least of which is the opportunity to look at independent benchmark results and find out what to expect from this new GPU relative to the market. To this end we had the opportunity to get our hands on the card before the official launch, testing the RTX 2060 in several games as well as a couple of synthetic benchmarks. The story is just beginning: as time permits, a "part two" of the RTX 2060 review will supplement this initial look, addressing omissions and adding further analysis of the data collected thus far.
Before getting into the design and our initial performance impressions of the card, let's look at the specifications of the new RTX 2060 and see how it relates to the rest of the RTX family from NVIDIA. We are taking a high-level look at specs here, so for a deep dive into the RTX series you can check out our previous exploration of the Turing architecture here.
"Based on a modified version of the Turing TU106 GPU used in the GeForce RTX 2070, the GeForce RTX 2060 brings the GeForce RTX architecture, including DLSS and ray-tracing, to the midrange GPU segment. It delivers excellent gaming performance on all modern games with the graphics settings cranked up. Priced at $349, the GeForce RTX 2060 is designed for 1080p gamers, and delivers an excellent gaming experience at 1440p."
| | RTX 2080 Ti | RTX 2080 | RTX 2070 | RTX 2060 | GTX 1080 | GTX 1070 |
|---|---|---|---|---|---|---|
| GPU | TU102 | TU104 | TU106 | TU106 | GP104 | GP104 |
| GPU Cores | 4352 | 2944 | 2304 | 1920 | 2560 | 1920 |
| Base Clock | 1350 MHz | 1515 MHz | 1410 MHz | 1365 MHz | 1607 MHz | 1506 MHz |
| Boost Clock | 1545 MHz / 1635 MHz (FE) | 1710 MHz / 1800 MHz (FE) | 1620 MHz / 1710 MHz (FE) | 1680 MHz | 1733 MHz | 1683 MHz |
| Texture Units | 272 | 184 | 144 | 120 | 160 | 120 |
| ROP Units | 88 | 64 | 64 | 48 | 64 | 64 |
| Tensor Cores | 544 | 368 | 288 | 240 | — | — |
| Ray Tracing Speed | 10 Giga Rays/s | 8 Giga Rays/s | 6 Giga Rays/s | 5 Giga Rays/s | — | — |
| Memory | 11 GB | 8 GB | 8 GB | 6 GB | 8 GB | 8 GB |
| Memory Clock (effective) | 14000 MHz | 14000 MHz | 14000 MHz | 14000 MHz | 10000 MHz | 8000 MHz |
| Memory Interface | 352-bit GDDR6 | 256-bit GDDR6 | 256-bit GDDR6 | 192-bit GDDR6 | 256-bit GDDR5X | 256-bit GDDR5 |
| Memory Bandwidth | 616 GB/s | 448 GB/s | 448 GB/s | 336 GB/s | 320 GB/s | 256 GB/s |
| TDP | 250 W / 260 W (FE) | 215 W / 225 W (FE) | 175 W / 185 W (FE) | 160 W | 180 W | 150 W |
| MSRP (current) | $1200 (FE) / $1000 | $800 (FE) / $700 | $599 (FE) / $499 | $349 | $549 | $379 |
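As a quick sanity check on the bandwidth figures above, peak memory bandwidth follows directly from the effective memory clock and the bus width. Here is a minimal sketch of that arithmetic (the numbers are simply the table values restated):

```python
def peak_bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s: transfers per second
    multiplied by the number of bytes moved per transfer."""
    bytes_per_transfer = bus_width_bits / 8
    return effective_clock_mhz * 1e6 * bytes_per_transfer / 1e9

# RTX 2060: 14000 MHz effective GDDR6 on a 192-bit bus
print(peak_bandwidth_gbs(14000, 192))  # 336.0
# GTX 1080: 10000 MHz effective GDDR5X on a 256-bit bus
print(peak_bandwidth_gbs(10000, 256))  # 320.0
```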
The Founders Edition Card
The design of the RTX 2060 Founders Edition is essentially a scaled-down version of the RTX 2080/2080 Ti we reviewed back in September, with dual fans and a very attractive wrap-around metal construction that makes this look – and feel – like a premium product. Aesthetics are a personal thing, but I really like the design of NVIDIA’s Founders Edition cards this generation, and in addition to their sleek appearance they boast some very effective cooling with surprisingly quiet fans.
One area of interest is the power connector, not only because this is an 8-pin rather than the 6-pin PCIe connector found on last generation's reference GTX 1060, but also because of the position of the connector itself:
Placing it along the rear edge brings back memories of the old Radeon HD 5770 reference card design, and while possibly serving a practical purpose (allowing a slightly wider path for warm air to move past the heatsink fins along the top?) it also helps keep things looking tidy in that trendy tempered glass enclosure you have your eye on (you know the one).
Now that we have looked over the new card and made debatable observations about aesthetics, let's move on to page two to begin exploring some benchmark results using both synthetic and game benchmarks.
This actually looks interesting. Will have to see what the prices end up at once it reaches me here in Sweden, but provided it stays on the lower side of 4000 SEK it might actually be the card I buy to replace my 970. That card is over 3 years old now, but still mostly good for 1920×1080 gaming.
Given the numbers in this review it seems the 2060 has roughly 2X performance for only slightly more money.
It looks good at the moment, but 6GB will limit the RTX 2060 to 1080p in just a year or so. For 1080p gaming, $350 seems a bit too much.
Citation needed. There are no indications that 6GB of RAM will limit this card at 2560×1440 any time soon.
If the next-gen consoles are released, it could start being an issue in games sooner rather than later.
Hammering requested, so… Highlight the 2060 in the charts.
(pls)
Yes! Highlighting or a different color to make it easier to spot, and better colors for the charts in general. It will be done.
That is good to hear. It’s literally why I registered at this site.
BTW, congratulations on taking over the site. I wish you all the best!
“it also helps keep things looking tidy in that trendy tempered glass enclosure you have your eye on (you know the one).”
Hit the nail on the head. I’m shopping right now between 2-3 ITX white tempered glass chassis to hold a new VR-on-the-go setup.
I heard NVIDIA has unlocked adaptive sync on this card; I’d be very curious to see some tests on FreeSync monitors if that would be possible.
There will be a driver in a week or so that will let GTX 10-series and RTX 20-series cards support certain monitors for adaptive sync. They have a list: out of 400 tested screens, 12 qualified. There will also be an option to turn it on for any screen, but no promises about how that will work out.
https://www.nvidia.com/en-us/geforce/news/g-sync-ces-2019-announcements/
I like this review format so far, maybe you won’t ruin the site after all… (kidding…(not kidding…))
So… we need to wait for an eventual RTX 2050 for a real mid-range card at a real mid-range price.
That's basically my thinking: 1050/1050 Ti successors will be very interesting if we see similar gains. This feels like a step above "midrange" for sure.
Price, yes. Performance, maybe.
It’s just a clear sign that there isn’t enough competition for Nvidia. Not good for consumers.
I was hoping for a 2060 GTX. I would MUCH prefer a $250 2060 GTX over a $350 2060 RTX.
I have an MSI GTX 1070 Gaming X that I bought for around $290 when the mining craze ended, after selling my Aorus Extreme GTX 1060 for the exact same amount.
Would there be any advantages for me to upgrade? I’m gaming on a full HD 60 Hz monitor.
Will you be performing testing using Ryan/Ken’s frame rating technique, or will you rely on in-game benchmarks and other API tools? I always liked the frametime and variance graphs since they helped visualize screen stutter.
I have been using OCAT, which will allow for frame rating testing similar to the FCAT results of the past. I will continue to work on a solution that is easy to follow visually when I get back from CES.
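For anyone curious what that post-processing looks like, here is a minimal sketch, assuming a PresentMon-style CSV from OCAT with a MsBetweenPresents column (the file name here is hypothetical):

```python
import csv
import statistics

# Load per-frame present intervals from an OCAT capture
# (OCAT writes PresentMon-style CSVs; MsBetweenPresents is the frametime in ms)
with open("ocat_capture.csv", newline="") as f:
    frametimes_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

avg_fps = 1000.0 / statistics.mean(frametimes_ms)
variance = statistics.pvariance(frametimes_ms)  # spread ~ perceived stutter
p99_ms = sorted(frametimes_ms)[min(len(frametimes_ms) - 1,
                                   int(len(frametimes_ms) * 0.99))]

print(f"Average FPS: {avg_fps:.1f}")
print(f"99th percentile frametime: {p99_ms:.2f} ms")
print(f"Frametime variance: {variance:.2f} ms^2")
```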
At $350 it’s NOT a mainstream card. A mainstream card is at $200 – $250. This is an expensive card, out of the price range of the mainstream category. The fact that it offers performance at GTX 1070/Ti level is an excuse for its price, NOT for putting it in the mainstream category.
Anyway, I see many sites and bloggers playing Nvidia’s song. Nvidia is trying to move all prices up and unfortunately tech sites are helping. Soon we will be reading about a 2030 at $150 being a low-end card and the excuse will be “look, it scores better than the 1050.”
PS: The GTX 970 was a high-end card at $330.
A gallon of gas used to cost 25 cents. Cell phones used to cost $300.
Trolling aside, point well made and well taken. My 7970, the king of cards at the time, the most expensive one could purchase… $500. Woof.
Hey, at least SSDs and DDR4 are coming down. MyDigital BPX Pro 480 GB FTW. Look at them speeds and endurance ratings, Mr. Samsung 970 Pro hehe 😉
It’s cheaper than other cards.
Sebastian, isn’t making sure you’re not CPU bound the first step in testing a GPU accurately? It seems like you’re severely CPU bound in Ashes of the Singularity @ 1080p and Final Fantasy XV @ 1080p. How is a Vega 64 faster than a 2080? Maybe it’s time to start testing with an overclocked 9700K/9900K. RAM speed has a great impact on certain engines as well, so using DDR4-3400 or DDR4-3600 should help in those situations too.
Sure, 1080p presents some issues when testing CPU-bound games, and I could just drop lower-res testing for them. The alternative is to use higher detail settings for games that present those issues and verify that GPUs scale as expected. I'm planning to re-test because of that; it's hard to re-test from the airport, which is where I made those charts.
I don't really think a Core i7-8700K with 3000 MHz DDR4 is going to be a bottleneck for most gamers, and this setup is intended to be more realistic, even though we could use a super high-end platform instead.
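One rough way to automate that sanity check, sketched below with purely illustrative numbers: if a card's average FPS barely drops when the resolution goes up, the 1080p result is probably CPU limited and worth re-testing at higher settings.

```python
# Flag results that look CPU bound: FPS should fall noticeably as resolution
# rises, and a flat result suggests the CPU (not the GPU) is the limiter.
# All figures below are made up for illustration, not measured results.
results = {
    "RTX 2060": {"1080p": 142.0, "1440p": 98.0},
    "RTX 2080": {"1080p": 145.0, "1440p": 140.0},  # suspiciously flat
}

CPU_BOUND_RATIO = 0.90  # 1440p within 90% of 1080p hints at a CPU limit

for gpu, fps in results.items():
    if fps["1440p"] >= fps["1080p"] * CPU_BOUND_RATIO:
        print(f"{gpu}: likely CPU bound at 1080p; re-test with higher settings")
    else:
        print(f"{gpu}: scaling looks GPU limited")
```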
This is exciting. Was really holding off from buying another 10XX card and did not want an expensive 2070 or 2080. $350 doesn’t seem so bad. Cannot wait to order. Wish they allowed pre-orders or something.
I totally agree. People are so funny in here. Seems like a very solid card. Enjoy!
This is far too expensive. Approaching three years since the release of the GTX 1070, Nvidia have released a follow-up that offers… the same price and performance as a GTX 1070. This represents a complete halt of any kind of measurable progress, like the entire Turing product line.
This is incorrect; it is not the same card as a 1070, since GTX cards can’t do ray tracing, DLSS, VRS, etc., and these will all improve either quality or speed, or in some cases both.
You don’t seem to understand the difference between GTX cards and RTX.
https://www.youtube.com/watch?v=9mxjV3cuB-c
Watch that vid from Digital Foundry and you’ll see massive speed improvements for VRS, DLSS, etc.
https://www.youtube.com/watch?v=E2pMxJV4xr4
Can’t do this with a 1070, right? Another 2060 vid, this one testing DXR; note he says another patch is coming with potentially more perf for new RTX features.
“I totally agree with anyone who suggests that we need to run tests as 2560×1440 as well, and this is something that will be addressed.” -Sebastian
I totally agree with you totally agreeing with me.