Real World Clock Speed and Temperature Comparison

Base clocks, boost clocks, real time clocks, grandfather clocks… what does it all mean?  Looking at the specifications it seems pretty straightforward: the PNY card will run the fastest at its 1281 MHz Boost clock, followed by the EVGA at 1268 MHz and then the Galaxy at 1189 MHz.  But as we have found over the past year or so, not all boost is the same, not all coolers are the same, and as a result not all performance metrics are the same.

Because of the way NVIDIA GPU Boost works, the "typical" clock rate a card reaches in its boost state can vary based on the game being played, temperatures in the user's case, etc.  NVIDIA has done a great job keeping all of these things in check, and 100% of the time in my experience the Boost clock was reached or exceeded in every game we use for testing and benchmarking.
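GPU Boost's actual algorithm is proprietary, but its observable behavior can be approximated as a feedback loop: hold the top boost bin while the GPU is cool, then step the clock down as temperature climbs past a limit. Here is a minimal toy model in Python; every threshold and step size in it is an illustrative assumption, not NVIDIA's real parameters:

```python
# Toy model of temperature-driven boost behavior. All numbers here are
# illustrative assumptions, NOT NVIDIA's actual GPU Boost parameters.

def boost_clock(base_mhz, max_boost_mhz, temp_c, temp_limit_c=80, step_mhz=13):
    """Return the clock a card might settle at for a given GPU temperature.

    Below the limit the GPU holds its maximum boost bin; above it, the
    clock drops one ~13 MHz bin per degree of excess heat, but never
    falls below the base clock.
    """
    if temp_c <= temp_limit_c:
        return max_boost_mhz
    throttled = max_boost_mhz - (temp_c - temp_limit_c) * step_mhz
    return max(base_mhz, throttled)

# A well-cooled card holds its top bin; a hot one backs off.
print(boost_clock(1020, 1320, temp_c=45))   # cool: full boost, 1320
print(boost_clock(1020, 1320, temp_c=85))   # hot: throttled to 1255
```

This is why two cards with identical paper specs can log very different clocks: the cooler decides where on that curve the GPU actually sits.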

But how did EVGA, Galaxy and PNY decide to change things up with their custom cards?  Let's take a look at a graph of actual recorded clock speeds over time.

You are looking at about 500 seconds of recorded clock speeds (using GPU-Z) after having looped Metro: Last Light at 1080p and Very High settings for about 10 minutes.  Four cards are represented here: the three retail models and the reference card.  What jumps out at me first is the solid yellow and green lines from the EVGA and Galaxy cards.  The yellow line is running at 1320 MHz but, more importantly, it is fixed, without any kind of variance.  This tells me that the GPU is being cooled exceptionally well and also that EVGA was likely a bit more reserved in the clock rates it set on the FTW model of the 750 Ti.

The same can be said for Galaxy’s card running at a static 1267 MHz. 

PNY’s GTX 750 Ti XLR8 OC card is a slightly different story as the blue line moves around quite a bit, going from its peak of 1332 MHz to the 1250 MHz range in some instances.  Obviously this means that the GPU is much closer to its maximum temperature levels and the Boost technology is scaling frequency and voltage to keep things in line.  You can see that even NVIDIA’s reference card was doing similar clock scaling, though to a lesser degree.

What does it mean for performance?  If we look at the average frame rate over the entire 500+ seconds of time listed above, here is how the cards stack up.

As it turns out, even though the PNY card has base and boost clocks that are higher than the EVGA FTW, both options are actually running at the same average clock rate of 1320 MHz.  The Galaxy card rests at 1261 MHz or so but all three retail cards run much faster than the reference option at 1156 MHz.
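Average clock figures like these fall out of a GPU-Z sensor log directly. A short Python sketch of the calculation, noting that the column header and CSV layout are assumptions about the log format and may need adjusting to match what GPU-Z writes on a given system:

```python
import csv

def average_clock(log_path, column="GPU Core Clock [MHz]"):
    """Average a clock-speed column from a GPU-Z style CSV sensor log.

    The column header above is an assumed name for the clock column;
    adjust it to match the actual header in your log file.
    """
    with open(log_path, newline="") as f:
        reader = csv.DictReader(f)
        samples = [float(row[column]) for row in reader if row.get(column)]
    return sum(samples) / len(samples)
```

Run over the full 500-second window, a flat line like the EVGA card's averages exactly its plateau value, while a card that bounces between bins, like the PNY, averages somewhere below its peak.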

This is just another example of stated specifications not telling the whole story for graphics cards.  This variance is nothing close to the issues we saw with the Radeon R9 290X and R9 290 launch, but it is still worth noting and helps define how much cooling quality can affect performance for GPUs.

Do you want to see why these clocks vary?  Take a look at the temperature graph above and you’ll see that the EVGA GTX 750 Ti FTW was far and away the coolest of the bunch!  In our testing the Maxwell GPU didn’t cross above 45C while the other cards were hitting the 60C mark after extended testing.  Clearly the higher fan speeds of the EVGA card are keeping the GPU out of any kind of thermal danger, but at the cost of noise levels.
