3DMark Time Spy, Unigine Superposition, Final Fantasy XV
Before getting to the first synthetic benchmarks, I will stress that the results presented in the coming pages of this review are by no means complete, and are offered to give an idea of performance so far. The story will not be complete until overclocking results and ray tracing performance are tested in depth. It can also be argued that additional cards should be added to the mix, especially on the AMD side, and I wholeheartedly agree. The same goes for 2560×1440 testing, which was omitted in the interest of time; I wanted to at least cover 1920×1080 and 3840×2160 to give a low/high comparison.
I began this review with a totally clean slate (no benchmarks from the site's history were reused), and the unfortunate side effect was that time simply did not permit enough testing before CES put a hard stop to my efforts. Still, valuable data was captured and will be put to good use in the coming weeks and months.
| GPU Test Platform | |
|---|---|
| Processor | Intel Core i7-8700K |
| Motherboard | ASUS ROG Strix Z370-H Gaming |
| Memory | Corsair Vengeance LED 16GB (2x8GB) DDR4-3000 |
| Storage | Samsung 850 EVO 1TB |
| Power Supply | Corsair RM1000x (1000 W) |
| OS | Windows 10 x64 Version 1803 (RS4) |
| Drivers | AMD 18.50; NVIDIA 417.35 and 417.54 (RTX 2060 tests) |
Now we'll take a look at synthetic benchmarks, beginning with 3DMark Time Spy.
If testing ended here, it would appear that the RTX 2060 bests not only the GTX 1070 and 1070 Ti but the GTX 1080 as well. This is not the case in every test, as you will see, but the RTX 2060 is a formidable card nonetheless.
Next we have Unigine Superposition, run here in DirectX mode using the high preset at 1920×1080 and the medium preset at 3840×2160.
While the RTX 2060 fits neatly between the GTX 1070 and GTX 1080 in the 1920×1080 benchmarks, the story changes somewhat at 3840×2160, where the RTX 2060 shows similar performance to the Vega 64.
Next we shift gears to a benchmark that is not synthetic, but still comes prepackaged as a standalone benchmarking application: Final Fantasy XV. For this game, as well as all games to follow, standard presets were used (no custom detail settings). We begin with a look at FPS numbers from each of the cards:
Expectations were easily met, with the game showcasing the potential of the RTX 2060 compared to previous-generation GTX cards, and offering massive gains over the Pascal-based GTX 1060 6GB.
Will the frame times tell the same story? Here's a look at both average and 99th percentile frame times with FFXV at 1920×1080:
And now the benchmark again, this time at UHD resolution and the standard quality preset:
It is interesting to see FPS averages and frame times side by side, as the overall "smoothness" of a game depends on how closely the 99th percentile frame times track the average (lower and more consistent results make for a better experience).
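To make that concrete, here is a minimal sketch in Python (with invented frame time values, not data from these charts) showing how average and 99th percentile frame times are computed and how each maps back to an FPS figure:

```python
# Minimal sketch: how average and 99th percentile frame times relate.
# These frame time values are invented for illustration; real data would
# come from a capture tool such as OCAT.
import statistics

frame_times_ms = [16.2, 16.5, 16.1, 17.0, 16.4, 24.9, 16.3, 16.6, 16.2, 33.1]

avg_ms = statistics.mean(frame_times_ms)
# Crude 99th percentile: the frame time that 99% of frames come in under.
p99_ms = sorted(frame_times_ms)[min(len(frame_times_ms) - 1,
                                    int(len(frame_times_ms) * 0.99))]

print(f"average: {avg_ms:.1f} ms ({1000 / avg_ms:.0f} FPS)")
print(f"99th percentile: {p99_ms:.1f} ms ({1000 / p99_ms:.0f} FPS)")
# A large gap between the two numbers shows up as stutter in practice,
# even when the average FPS looks healthy.
```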
On the next page we continue our game benchmarks with the RTX 2060.
This actually looks interesting. I will have to see where prices end up once it reaches me here in Sweden, but provided it stays on the lower side of 4,000 SEK it might actually be the card I buy to replace my 970. That card is over 3 years old now, but still mostly good for 1920×1080 gaming.
Given the numbers in this review, it seems the 2060 offers roughly 2x the performance for only slightly more money.
It looks good at the moment, but 6GB will limit the RTX 2060 to 1080p in just a year or so. For 1080p gaming, $350 seems a bit too much.
Citation needed. There are no indications that 6GB of VRAM will limit this card at 2.5K resolutions any time soon.
Once the next-gen consoles are released, it could start being an issue in games sooner rather than later.
Hammering requested, so… Highlight the 2060 in the charts.
(pls)
Yes! Highlighting or a different color to make it easier to spot, and better colors for the charts in general. It will be done.
That is good to hear. It’s literally why I registered at this site.
BTW, congratulations on taking over the site. I wish you all the best!
“it also helps keep things looking tidy in that trendy tempered glass enclosure you have your eye on (you know the one).”
Hit the nail on the head. I'm shopping right now between two or three white ITX tempered glass chassis to hold a new VR-on-the-go setup.
I heard NVIDIA has unlocked adaptive sync on this card. I'd be very curious to see some tests on FreeSync monitors, if that would be possible.
There will be a driver in a week or so that will let GTX 10-series and RTX 20-series cards support certain monitors for adaptive sync. They have a list; out of 400 tested screens, 12 qualified. There will also be an option to turn it on for any screen, but no promises how that will work out.
https://www.nvidia.com/en-us/geforce/news/g-sync-ces-2019-announcements/
I like this review format so far, maybe you won’t ruin the site after all… (kidding…(not kidding…))
So… we need to wait for an eventual RTX 2050 for a real mid-range card at a real mid-range price.
That's basically my thinking: 1050/1050 Ti successors will be very interesting if we see similar gains. This feels like a step above "midrange" for sure.
Price, yes. Performance, maybe.
It’s just a clear sign that there isn’t enough competition for Nvidia. Not good for consumers.
I was hoping for a GTX 2060. I would MUCH prefer a $250 GTX 2060 over a $350 RTX 2060.
I have an MSI GTX 1070 Gaming X that I bought for around $290 when the mining craze ended, after selling my Aorus Extreme GTX 1060 for the exact same amount.
Would there be any advantage for me in upgrading? I'm gaming on a full HD 60Hz monitor.
Will you be performing testing using Ryan/Ken's frame rating technique, or will you rely on in-game benchmarks and other API tools? I always liked the frame time and variance graphs, since they helped visualize screen stutter.
I have been using OCAT, which will allow for frame rating testing similar to the FCAT results of the past. I will continue to work on a solution that is easy to follow visually when I get back from CES.
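For anyone curious, here is a rough sketch (not my actual processing script) of the kind of summary an OCAT capture makes possible, assuming its PresentMon-style CSV output with a per-frame MsBetweenPresents column; the file name is hypothetical:

```python
# Sketch of post-processing an OCAT capture. Assumes the PresentMon-style
# CSV output with a per-frame "MsBetweenPresents" column; the file name
# below is hypothetical.
import csv

def frame_time_summary(path):
    with open(path, newline="") as f:
        times = sorted(float(row["MsBetweenPresents"])
                       for row in csv.DictReader(f))
    avg = sum(times) / len(times)
    p99 = times[min(len(times) - 1, int(len(times) * 0.99))]
    return avg, p99

avg_ms, p99_ms = frame_time_summary("ffxv_1080p_rtx2060.csv")
print(f"average: {avg_ms:.2f} ms, 99th percentile: {p99_ms:.2f} ms")
```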
At $350 it's NOT a mainstream card. A mainstream card is at $200 to $250. This is an expensive card, out of the price range of the mainstream category. The fact that it offers performance at the GTX 1070/Ti level is an excuse for its price, NOT for putting it in the mainstream category.
Anyway, I see many sites and bloggers playing Nvidia's song. Nvidia is trying to move all prices up, and unfortunately tech sites are helping. Soon we will be reading about a 2030 at $150 being a low-end card, and the excuse will be "look, it scores better than the 1050."
PS: The GTX 970 was a high-end card at $330.
A gallon of gas used to cost 25 cents. Cell phones used to cost $300.
Trolling aside, point well made and well taken. My 7970, the king of cards at the time and the most expensive one could purchase… $500. Woof.
Hey, at least SSDs and DDR4 are coming down. MyDigitalSSD BPX Pro 480GB FTW; look at those speeds and endurance ratings, Mr. Samsung 970 Pro hehe 😉
It's cheaper than other cards.
Sebastian, isn't making sure you're not CPU bound the first step in testing a GPU accurately? It seems like you're severely CPU bound in Ashes of the Singularity at 1080p and Final Fantasy XV at 1080p. How is a Vega 64 faster than a 2080? Maybe it's time to start testing with an overclocked 9700K/9900K. RAM speed has a big impact on certain engines as well, so using DDR4-3400 or DDR4-3600 should help in those situations.
Sure, 1080p presents some issues when testing CPU-bound games, and I could just drop lower-res testing for them. The alternative is to use higher detail settings for games that present those issues and verify that GPUs scale as expected. I'm planning to re-test because of that; it's hard to re-test from the airport, which is where I made those charts.
I don't really think a Core i7-8700K with 3000 MHz DDR4 is going to be a bottleneck for most gamers, and this setup is intended to be more realistic, even though we could use a super high-end platform instead.
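For readers wondering what "verify that GPUs scale as expected" could look like in practice, here is a rough sketch of the idea (the FPS numbers are invented, not from this review): if a faster card gains comparatively little when dropping from 3840×2160 to 1920×1080, the lower resolution is likely CPU-limited.

```python
# Rough heuristic sketch with invented FPS numbers (not from this review):
# if dropping the resolution barely raises FPS, the test is likely
# CPU-bound rather than GPU-bound.
results = {
    # card: (fps_1080p, fps_4k)
    "RTX 2080": (150.0, 110.0),
    "RTX 2060": (148.0, 58.0),
}

for card, (fps_1080p, fps_4k) in results.items():
    scaling = fps_1080p / fps_4k
    # A GPU-bound title should speed up dramatically at the lower
    # resolution; a ratio near 1.0 suggests a CPU ceiling instead.
    flag = "likely CPU-bound at 1080p" if scaling < 1.5 else "scaling normally"
    print(f"{card}: {scaling:.2f}x 1080p/4K scaling -> {flag}")
```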
This is exciting. I was really holding off from buying another 10XX card and did not want an expensive 2070 or 2080. $350 doesn't seem so bad. Cannot wait to order. Wish they allowed for pre-orders or something.
I totally agree. People are so funny in here. Seems like a very solid card. Enjoy!
This is far too expensive. Approaching three years after the release of the GTX 1070, Nvidia have released a follow-up that offers… the same price and performance as a GTX 1070. This represents a complete halt of any kind of measurable progress, as with the entire Turing product line.
This is incorrect; it is not the same card as the 1070, which can't do ray tracing, DLSS, VRS, etc. These will all improve quality, speed, or in some cases both.
You don't seem to understand the difference between GTX cards and RTX.
https://www.youtube.com/watch?v=9mxjV3cuB-c
Watch that vid from Digital Foundry and you'll see massive speed improvements for VRS, DLSS, etc.
https://www.youtube.com/watch?v=E2pMxJV4xr4
You can't do this with a 1070, right? Another 2060 vid covering DXR testing; note he says another patch is coming with potentially more performance for new RTX features.
“I totally agree with anyone who suggests that we need to run tests as 2560×1440 as well, and this is something that will be addressed.” -Sebastian
I totally agree with you totally agreeing with me.