NVIDIA GeForce RTX 4090 Founders Edition Review
An Insanely Fast Graphics Card
It’s here, and after several days of benchmarking I can confirm that NVIDIA’s Founders Edition of the GeForce RTX 4090 lives up to the hype. Well, the performance hype, anyway. There has been talk that the card itself is larger than a game console (ok, it’s bigger than a Switch), or that it might draw enough power to put a strain on your average household circuit. That was all nonsense, but this Founders Edition version is a less power hungry card than what we’ll see from some partner designs.
Yes, there will be two basic types of RTX 4090, with this Founders Edition falling into the 450-watt TGP category, and partners able to configure up to a 600-watt TGP. Still, 450 watts is no joke, though we have already seen this total board power figure from NVIDIA with the previous gaming flagship, the RTX 3090 Ti. However, transient power draw (momentary spikes) is more demanding on a PSU with the RTX 4090, so perhaps not all 450W TGPs are created equal.
You can compare the GeForce RTX 4090 Founders Edition’s specifications to a few other recent NVIDIA graphics cards in the table below:
| | RTX 4090 | RTX 3090 Ti | RTX 3090 | RTX 3080 |
|---|---|---|---|---|
| Base Clock | 2235 MHz | 1670 MHz | 1395 MHz | 1440 MHz |
| Boost Clock | 2520 MHz | 1860 MHz | 1695 MHz | 1710 MHz |
| Memory | 24GB GDDR6X | 24GB GDDR6X | 24GB GDDR6X | 10GB GDDR6X |
| Memory Data Rate | 21 Gbps | 21 Gbps | 19.5 Gbps | 19 Gbps |
| Memory Bandwidth | 1 TB/s | 1 TB/s | 936 GB/s | 760 GB/s |
| Die Size | 608 mm^2 | 628 mm^2 | 628 mm^2 | 628 mm^2 |
| Process Tech | TSMC 4nm NV Custom | Samsung 8nm NV Custom | Samsung 8nm NV Custom | Samsung 8nm NV Custom |
You will probably notice the massive increase in CUDA Core count from the previous flagship, with the RTX 4090 at 16384 cores, up from 10752 with the RTX 3090 Ti. The SM count moves up from 84 to 128 with the RTX 4090, and the new Ada Lovelace flagship offers 512 4th Gen Tensor Cores (up from 336 / 3rd Gen) and 128 3rd Gen RT Cores (up from 84 / 2nd Gen).
It may seem as though memory specs are identical to the RTX 3090 Ti, as both sport 24GB of 21 Gbps effective GDDR6X on a 384-bit bus, but an important distinction is a data point from NVIDIA’s new Ada Lovelace architecture not on this legacy format table: L2 cache size. This moves from 6144 KB (6 MB) on the RTX 3090 Ti to a whopping 73728 KB (72 MB) on the RTX 4090. Cache obviously makes a difference, as we saw from AMD with the RX 6000 Series.
The GeForce RTX 4090 is being produced on a TSMC 4nm NVIDIA custom process, down from the custom 8nm Samsung process of the RTX 30 Series. And as massive as this monolithic RTX 4090 GPU is, with a truly astonishing transistor count of 76.3 billion (up from 28.3 billion with the RTX 3090 Ti), the die size is actually down slightly from the RTX 3090 Ti (from 628 mm^2 to 608 mm^2).
As to pricing, assuming availability at MSRP for all cards, if we consider the $1599 RTX 4090 in relation to an RTX 3090 Ti and RTX 3090, it doesn’t look like a bad value. Consider this: NVIDIA has not released a TITAN card since the $2499 RTX TITAN in the 20 Series era, with the RTX 3090 taking its place at the top of the RTX 30 Series stack – until the RTX 3090 Ti later on, of course. If the RTX 4090 is this generation’s “TITAN” level product, its $1599 MSRP is $900 less than the last official TITAN, and $400 less than the $1999 RTX 3090 Ti (if you could ever find it at that price).
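The price gaps cited above are simple subtraction; a trivial sketch using only the MSRP figures from this article:

```python
# MSRPs cited in this review (USD)
titan_rtx_msrp = 2499    # RTX TITAN, 20 Series era
rtx_3090_ti_msrp = 1999  # RTX 3090 Ti launch price
rtx_4090_msrp = 1599     # RTX 4090 Founders Edition

# how much cheaper the RTX 4090 is than each previous flagship
print(titan_rtx_msrp - rtx_4090_msrp)    # 900
print(rtx_3090_ti_msrp - rtx_4090_msrp)  # 400
```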
Wrapping up this intro, we don’t have a deep dive into Ada Lovelace today, and there are certainly more aspects to this new architecture to cover besides gaming performance, not the least of which concerns advancements in the content creation category. We will have to revisit the GPU with a look at performance with the Studio Driver, including video production. It seems that the RTX 40 Series offers dual encoders which, “together with the new algorithm to automatically split frames combined with architectural performance improvements allow HEVC and AV1 encoding to be up to 2x faster on 40-Series GPUs than on 30-Series GPUs”, according to NVIDIA.
The RTX 4090 Founders Edition Card
Without a doubt, this is a big graphics card. It tips the scales at nearly five pounds and occupies three full slots (and wants one more for breathing room), and while its dimensions are close to those of the RTX 3090 Ti, the RTX 4090 is equipped with slightly larger fans (115 mm) than its predecessor.
Yes, it requires a lot of power. No, you may not need a new PSU.
The most talked about aspect of this GPU is power draw, and while 450 W is on the low end for RTX 4090 TGP configurations, it’s still novel to observe this gargantuan card somehow managing its 450W input via a single 12VHPWR connector. Of course, many of us (myself included) will rely, at least for now, on the included 4x PCIe 8-pin adapter, dramatically photographed below:
We were supplied with a very powerful current-gen PSU from be quiet! for our review, the Dark Power Pro 12 1500W, which was overkill but certainly instilled confidence going into the review. And speaking of be quiet!, they have a post about the RTX 40 Series in which they explain that no, you don’t need a new power supply if your current PSU provides the recommended power (they go into a lot more detail in the post than I just did).
This review marks yet another new beginning in the history of PC Perspective GPU benchmarking, as I have decided to discard all legacy tests and move to a new platform. We were fortunate enough to receive the latest and greatest from AMD, and with a Ryzen 9 7950X we shouldn’t be CPU bound for a long time.
Some might argue in favor of the Ryzen 7 5800X3D, but we don’t have one (our 5800X3D-equipped Tiki test system went back to Falcon Northwest a long time ago), and while we have a Core i9 12900KS on hand, 13th Gen Core processors are right around the corner but not quite out yet. Thus, we proceed with the X670E platform.
| PC Perspective GPU Test Platform | |
|---|---|
| Processor | AMD Ryzen 9 7950X (Stock) |
| Motherboard | MSI MEG X670E ACE (BIOS v1.25 Beta, AGESA ComboPI 220.127.116.11 Patch A, Resizable BAR Enabled) |
| Memory | 32GB (16GBx2) G.Skill Trident Z NEO DDR5-6000 CL30 |
| Storage | SK Hynix Platinum P41 2TB NVMe SSD |
| Power Supply | be quiet! Dark Power Pro 12 1500W |
| Operating System | Windows 11 Pro, 21H2 |
| Drivers | GeForce Game Ready Driver 521.90 |
The move to a brand new platform also helps explain the minimal complement of graphics cards in the charts below, as everything had to be re-tested on this system. And there are no AMD cards here, partly because I didn’t have time, and partly because the highest-end SKU we have is a Radeon RX 6800 XT. It would have ended up in the middle of the pack in each chart, and I probably would have been asked “where’s the RX 6900 XT?” or “why no RX 6950 XT?”, to which I would reply “we were never sampled beyond the RX 6800 XT”.
Ideally, ALL relevant high-end graphics cards would be tested for a GPU launch like this, but I can’t do that this time. I don’t like it, and it makes the review less compelling, but here we are.
It may be of interest to see how performance has evolved over the past four years of RTX cards, and to this end I have freshly re-tested the GeForce RTX 2080 Ti, RTX 3080 Ti, RTX 3090, and the new RTX 4090. (We don’t have an RTX 3090 Ti to test). I think you’ll find the RTX 4090 to have just a bit of an advantage here, and it’s only going to get bigger as the drivers mature and more games gain support for DLSS 3.0.
Some Benchmark Results
First, a look at 3DMark Time Spy Extreme:
At first I thought I had messed up and clicked on “Time Spy” instead of “Time Spy Extreme”. Nope. It just looks like a regular Time Spy score. With a GPU score averaging over 19,000 (all results are the average of three runs), this RTX 4090 offers some serious muscle. Let’s see if this is any indication of performance with a real game engine.
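As noted, every charted result here is the average of three runs; a minimal sketch of that methodology (the run scores below are hypothetical, chosen only to land near the ~19,000 figure cited above):

```python
def average_score(runs):
    """Average a list of benchmark run scores, as done for every chart in this review."""
    return sum(runs) / len(runs)

# hypothetical Time Spy Extreme GPU scores from three consecutive runs
runs = [19120, 19040, 19080]
print(round(average_score(runs)))  # 19080
```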
For our first look at Cyberpunk 2077 performance, I set the quality to the “Ultra” preset, and there is no DLSS or other resolution scaling in effect.
(Update 10/12/22: I should say, I believe there is no resolution scaling in effect, because I set it to “off”. But Cyberpunk seems to have a mind of its own, and these numbers look high (we aren’t the only ones to notice, at least). Regardless, all four cards were tested with identical settings on this system, so the comparison is still valid. Also, Cyberpunk 2077 v1.60 enables AMD FSR by default, even when we turned it “off”. So maybe these numbers are just the new normal.)
As in the 3DMark test above, performance is about double that of the RTX 3090. More than double in this case, actually. How about another title?
Not quite double the average FPS of the RTX 3090 this time, but still really, really impressive. I’m already convinced; the RTX 4090 is a beast, and can provide unheard-of frame rates even without DLSS 3.0.
How about one more title, again without any DLSS?
In DiRT 5 at 3440×1440 / ultra we are seeing over 200 FPS from the RTX 4090, with the RTX 3090 all the way down around 120 FPS. It’s a 70% increase this time, but the bigger key for me is how smooth this is, with even the 1% lows well above my display’s 144 Hz refresh rate.
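The percentage uplifts quoted throughout this review come from comparing average FPS; a quick sketch, using approximate DiRT 5 figures from the chart above (the exact averages are in the chart, so treat these inputs as illustrative):

```python
def uplift_pct(new_fps, old_fps):
    """Percentage increase of new_fps over old_fps."""
    return (new_fps - old_fps) / old_fps * 100

# approximate DiRT 5 3440x1440 / ultra averages: RTX 4090 vs. RTX 3090
print(round(uplift_pct(204, 120)))  # 70 (% faster)
```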
Benchmarks with DLSS
Next we have to look at some DLSS results, with older benchmarks as well as something that can take advantage of the DLSS 3.0 / frame generation feature of the RTX 40 Series. I’ll start with the trusty Bright Memory Infinite benchmark, which is built for these RTX cards.
Not quite 2x the performance of the RTX 3090, but damned impressive nonetheless. It was a big deal (to me, anyhow) when the RTX 3090 SUPRIM could offer over 60 FPS in this demanding test, after the RTX 2080 Ti only managed about 35 FPS. Here the RTX 4090 is averaging well over 100 FPS – though we see a slight regression in 1% lows. I suspect the press driver was not optimized for this benchmark.
Let’s revisit a couple of games from the above results, beginning with Metro Exodus – this time with DLSS enabled, and set to “Balanced”.
As impressive as raw performance is, the first DLSS result is almost disappointing. We are still looking at an increase of 46%, but after seeing both raster and ray traced increases in the 70% and higher range, I feel like there’s some performance left on the table here.
What about Cyberpunk 2077 again?
That’s better – an increase of 69% over the RTX 3090 in Cyberpunk at the Raytraced Ultra preset with DLSS set to “Balanced”. But this isn’t all the RTX 4090 is capable of. We haven’t seen the insanity that is DLSS 3.0 with frame generation enabled. Here’s an example:
The above chart is a comparison of performance with the RT Ultra preset again, but this time without DLSS, with DLSS set to Balanced, and then with DLSS 3.0 Performance + Frame Generation enabled. The numbers speak for themselves, but a chart isn’t going to tell the story with this sort of technology, as visual fidelity obviously demands first-hand experience. Without watching slow motion / zoomed footage or looking at screenshots, I can only offer my subjective impressions.
I am not a DLSS fanboy, and I was pretty critical of it when it was a first-gen product. DLSS 2.0 reached the point where it was hard for me to tell that it was enabled, particularly in the “Quality” mode, where I absolutely cannot tell it apart from native in real time. To me, DLSS 3.0 Performance looks as good as DLSS 2.0 Quality, but maybe I’m just bad at picking up telltale artifacts in real time. But that isn’t even the biggest part of this test result. Frame generation was enabled, and even though I stared at the screen through multiple benchmark runs, I never saw any sign of interpolation – no “soap opera” effect, no blurriness with objects in motion, etc. It’s scary how effective the frame generation is.
Moving on to Flight Simulator, there seems to be something amiss with DLSS performance in the press build, as there was virtually no improvement with DLSS Balanced over TAA with all other settings identical. I re-tested the game with both DX11 and DX12 (beta) in the standard build, and again with the press build that enables DLSS 3.0, but the DLSS results without frame generation remained within a couple of frames of the TAA (DLSS disabled) results. I suspect this has something to do with the build itself, which was set up for the full DLSS 3.0 experience, including frame generation.
Anyway, if I set DLSS to Performance and turned on Frame Generation the results were fantastic, taking this demanding title up to an impressive ~157 FPS average, which is more than double the performance I could get at 3440×1440 / Ultra without the feature enabled. And, as with Cyberpunk 2077, I could not discern any difference in quality, nor any evidence of frame interpolation, in real time. It’s a remarkably good technology. I’m curious to read what others think of it after they’ve tried it out.
Power, Thermals, Noise
Believe it or not, this RTX 4090 Founders Edition card is actually very well balanced. For a gaming load test I ran my usual 10x iterations of the Metro Exodus benchmark, this time at 3440×1440 / Extreme. We already know that this card is limited to 450 watts, and it did not exceed that (in this test it topped out at 436 W board power), and the entire system pulled 648 W at the wall.
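A quick sanity check on these power figures, using the 436 W board power and 648 W wall draw above. The PSU efficiency value is an assumption (not something I measured), so the non-GPU estimate is a rough sketch:

```python
board_power_w = 436   # peak GPU board power observed in this test
wall_power_w = 648    # whole-system draw measured at the wall
psu_efficiency = 0.92 # assumed PSU efficiency at this load (not measured)

# estimated DC load delivered by the PSU, and the non-GPU share of it
dc_load_w = wall_power_w * psu_efficiency
print(round(dc_load_w))                  # ~596 W delivered by the PSU
print(round(dc_load_w - board_power_w))  # ~160 W for CPU, board, drives, fans
```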
Thermals were better than expected given the low noise, with max recorded temps during the 10x iterations of 67 C GPU and 76.8 C hot spot in a ~22 C room. These temps came with a maximum fan speed of 40% (~1320 RPM), which resulted in a reading of 37.7 dBA from my SPL meter, positioned exactly 12 inches from the front of the card. The fans do get quite a bit more noticeable if they need to spin up, reaching 46.4 dBA at 60% and 52.6 dBA at 80%, all the way up to 57.3 dBA should you choose to manually run your card at 100% (2660 RPM).
I will add that the fan curve clearly favors low noise, as I never saw above 40% during testing, and had to force the issue using Afterburner for the above SPL readings. At idle the fans don’t spin, and at the lowest setting I could manually set, 30% (1100 RPM), the card only registered 33.9 dBA. Considering the controlled thermals even at lower fan speeds, I think this is one of the better behaved reference cards I’ve tested.
Final Thoughts (for now)
If you consider this product in context, I think it’s fair to say that NVIDIA has hit a home run with the RTX 4090. They are bringing a massive performance uplift to the ultra high-end segment of their product stack, while asking $400 less than the list price of an RTX 3090 Ti. Will this fact make enthusiast gamers happy about a $1599 price tag? Well, what do you think?
Personally, I don’t have a problem with these halo products being expensive, as long as we reap the technology benefits in the lower-cost segment down the line. I think the real test is going to be how fast the RTX 4080 16GB is, but at $1199 that card will probably be a tough sell even if performance relative to the RTX 4090 matches the 25% lower asking price. Time will tell.
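For reference, the “25% lower asking price” above is just the MSRP gap as a fraction of the RTX 4090’s price:

```python
rtx_4090_msrp = 1599
rtx_4080_16gb_msrp = 1199

# how much cheaper the RTX 4080 16GB is, as a percentage of the RTX 4090 price
discount_pct = (rtx_4090_msrp - rtx_4080_16gb_msrp) / rtx_4090_msrp * 100
print(round(discount_pct))  # 25
```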
The GeForce RTX 4090 is an absolute beast. It looks and feels ultra high end. It performs like a supercar. Not all graphics cards can be a Honda Civic or Toyota Corolla. The RTX 4090 Founders Edition is like a McLaren. Beautiful, insanely fast, and out of most people’s price range. Ok, it’s not that expensive. Remember, it was just a couple of years ago that NVIDIA sold an RTX TITAN for $2499, and this card would mop the floor with it (I’m assuming, since I don’t have one here to test).
I can’t wait to test out the RTX 4090 with the Studio driver and see how fast I can render video and accelerate other tasks, given the raw horsepower of this GPU. It’s been a while since we’ve had new architecture to play with, and while a lot of the conversation leading up to this launch has been about power draw and the size of partner cards, the performance potential of this card cannot be overstated. It’s a titan of a GPU, with the size and power draw to match. But I think people will find a way to integrate it into their systems anyway.
Bottom line, NVIDIA’s GeForce RTX 4090 Founders Edition is in a class by itself. It doesn’t matter which team you root for, or what games you play. It’s ridiculously fast, and we are only scratching the surface of its performance potential in this short review. For more on this product I highly recommend watching der8auer’s video on the subject (YT link), which includes a study in performance after lowering the card’s power limit (with surprising results).
This is what we consider the responsible disclosure of our review policies and procedures.
How Product Was Obtained
The product is on loan from NVIDIA for the purpose of this review.
What Happens To Product After Review
The product remains the property of NVIDIA but is on extended loan for future testing and product comparisons.
NVIDIA had no control over the content of the review and was not consulted prior to publication.
PC Perspective Compensation
Neither PC Perspective nor any of its staff were paid or compensated in any way by NVIDIA for this review.
NVIDIA has not purchased advertising at PC Perspective during the past twelve months.
If this article contains affiliate links to online retailers, PC Perspective may receive compensation for purchases through those links.
Great review, thanks!
Thanks for the nice review. Lovely to see 3440×1440 being tested.
Is there really that much of a difference between the founders edition and the 3rd party cards that will cost $2k?
Seems like a lot of marketing BS with no real world advantages over the FE.
looking like a couple of percent better performance, for at least a 6% price hike
Man, that truly is a beast. Want. Be interesting to see how the rest of the stack performs.
It will be great to buy one when they’re available in 2 years!
Ever since the Nvidia GTX 10xx and AMD Radeon RX series cards, the availability and pricing of entry-level cards has disappeared. Cards like the Nvidia 1060 ($250-300) and the AMD RX 590 (similar price range) have completely disappeared. While those cards had to run at lower detail levels (1080p was the high end of cards on Steam), they were great for entry-level systems to get people gaming. With the RTX 3050 at $399 and the RX 6600 XT both in the $300-500 range, there isn’t an entry level anymore: anything below the 6600 XT doesn’t give you the performance you need, and the 3050 costs more than the entry level can fit. Nvidia and AMD are making the Budget Gaming PC no longer within budget.