Founders Edition and Overclocking

Though the name is the same, the GTX 1060 Founders Edition plays a slightly different role in this launch. With the GTX 1080 and GTX 1070, FE cards were used to get the cards to market ASAP, and will continue to exist throughout the life of the product line as an option for builders and enthusiasts who want an NVIDIA-created and NVIDIA-backed option. With the GTX 1060, however, custom cards from board partners are going to be available on launch day, so the FE is more equivalent to a “reference” card than ever before. NVIDIA has even called this Founders Edition a “limited” card, so I expect it to be around for a short while and then eventually fade away to the world of system integrators. It will be interesting to see how many consumers actually prefer the reference design style and are willing to pay the premium for it.

The GTX 1060 Founders Edition has some unique traits. While the display output configuration is the now all-too-familiar combination of three DisplayPort, one HDMI and one DL DVI (which the RX 480 omitted), the PCB and cooler take an interesting form. The PCB is basically identical in size to that of the RX 480 at 6.75-in (171.5mm) long. The blower style cooler extends past the PCB by another 3-in (76mm), making the Founders Edition 9.75-in (247.6mm) long.

The 6-pin power connection on the GTX 1060 Founders Edition seems oddly placed – rather than being attached to the PCB directly, the 6-pin sits at the end of the cooler, meaning a cable runs inside the cooler to carry power to the PCB itself. The reason for this is appearance: the card looks more balanced, and better in a windowed case, with the 6-pin connection at the far end of the card rather than in the middle of it. It’s an interesting trade-off though, one that will make aftermarket cooler installations a bit more complex.

One interesting detail is an obviously missing or removed item on the back of the cooler. Three screw holes and a recess in the extruded metal suggest that something was planned for this spot, or was once mounted there and removed after production for some reason.

NVIDIA changed up the shroud on the cooler to help lower costs. The “window” area on the classic GeForce design is now just a black painted area and the design is tweaked with slightly fewer, shallower polygonal angles. I’m still a fan of the design though and I think NVIDIA’s construction and build quality just “feel” better in the hand than the RX 480. Whether that matters to anyone installing this card into a gaming PC rather than putting it on a shelf is up for debate.

One thing that is missing from the GeForce GTX 1060 card? An SLI bridge connection. There is none, and the reason is simple: NVIDIA tells us that SLI is not going to be supported on the GeForce GTX 1060. Rumors have swirled since pictures first leaked that this meant NVIDIA was moving to a PCI Express-based data transfer technology for the GTX 1060, similar to what AMD does with CrossFire across its entire lineup. That’s not the case, and it would be a strange move after the big push for a new SLI Bridge that NVIDIA made with the GTX 1080 launch. The GTX 1060, and we assume any future cards in this class, is simply not going to support multi-GPU technology.

The decision is kind of astounding to me, really. NVIDIA launched Pascal pushing 2-GPU SLI strongly, eventually cutting out all higher-count SLI configurations completely in order to preserve the consumer experience of 2-Way SLI. Cutting GTX 1060 owners off from SLI because “that market doesn’t really utilize SLI” is an excuse, not a reason. There is no substantial cost benefit to cutting validation testing for the GTX 1060 if you are continuing to run it for the GTX 1080 and GTX 1070. There are plenty of consumers who love the idea of buying a ~$250 graphics card today and adding another down the line, potentially scaling to the performance of one of NVIDIA’s larger, more expensive graphics cards. Even worse, you can actually see indentations and spacing on the PCB where SLI connections would have been inserted!

Consumers interested in the GTX 1060 Founders Edition will only find it for sale from NVIDIA directly, on its website.

Clock speeds and overclocking with the GeForce GTX 1060

One of the first things I like to do with a new graphics card is to find its baseline clock speed. Since the adoption of GPU Boost, and with the inclusion of a similar technology from AMD with Polaris, clock speeds are going to vary based on system properties, the content being played and cooling. For consistent testing I set up the new GTX 1060 FE in my GPU testbed and ran through Unigine Heaven v4.0 for 10 minutes, recording the clock speeds with GPU-Z 1.9.0.

Temperatures with the blower style design crept up to 73C after an extended session, with clock speeds averaging just under 1840 MHz. That’s well above the 1708 MHz boost clock rating from NVIDIA and is 22% higher than the rated base clock of 1506 MHz!
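Summarizing a clock log like the one described above is simple to script. This is a minimal sketch; the file name and the `"GPU Clock [MHz]"` column header are assumptions for illustration, not the actual GPU-Z log format:

```python
import csv
import statistics

def summarize_clocks(path, clock_column="GPU Clock [MHz]"):
    """Read a CSV clock log and report average/min/max core clock in MHz."""
    clocks = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                clocks.append(float(row[clock_column]))
            except (KeyError, ValueError):
                continue  # skip malformed or incomplete rows
    return {
        "avg": statistics.mean(clocks),
        "min": min(clocks),
        "max": max(clocks),
    }
```

Running a log through a helper like this makes it easy to compare the observed average against the rated base and boost clocks, e.g. `summary["avg"] / 1506` for the percentage over base.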

Using a new version of Precision X, I was able to overclock the GTX 1060 Founders Edition with a +200 MHz offset. That’s an impressive overclock for our first attempt, and it took just 10 minutes to find using the standard process of stepping up the offset and checking stability.
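The step-and-test routine mentioned above can be sketched in outline. The `apply_offset` and `is_stable` callbacks here are hypothetical stand-ins for a tuning tool and a stress test, not a real API:

```python
def find_max_offset(apply_offset, is_stable, start=0, limit=300, step=25):
    """Raise the core-clock offset (MHz) in fixed steps until the stress
    test fails, then settle on the last known-stable offset -- the usual
    manual overclocking loop."""
    best = start
    offset = start + step
    while offset <= limit:
        apply_offset(offset)
        if not is_stable():
            break  # instability found; stop stepping upward
        best = offset
        offset += step
    apply_offset(best)  # revert to the last offset that passed
    return best
```

In practice `is_stable()` would be a run of something like Unigine Heaven while watching for crashes or artifacts, and many overclockers follow the coarse sweep with smaller steps around the failure point.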

The result is a GPU frequency that exceeded my expectations! Once overclocked, the GPU temperature reached 79C while the GPU was running at nearly 2050 MHz. Even better is the lack of variability in these clock speeds, which helps the system maintain a more stable and fluid frame rate in games. Compare that to the result I found in my RX 480 review – there is a dramatic difference in implementation between the two architectures.

Though overclocking on the GeForce GTX 1060 has just begun, the results are incredibly compelling. Out of the box, our FE sample was hitting a stable 1838 MHz in Unigine Heaven, and with an offset of 200 MHz it was able to breach the 2.0 GHz barrier without much work on my part.

With three retail / custom cards in our hands, I am going to be very interested to see what kind of changes in overclocking, power draw and stability I am able to see from unit to unit. My guess is that the changes will be minimal, and the real variance will be in noise and temperature levels.
