Introduction, Driver Interface
Which inexpensive IGP or discrete GPU offers the best value?
There exists a particular group of gamers consumed by dreams of gigantic dual-SLI laptops that replace towering desktops. And who can blame them? Walking into a LAN party with a $5,000 laptop under your arm is the geek equivalent of arriving at a party in a $2,500 jacket or driving through your neighborhood in a $250,000 car. We can dream, right?
On the other hand, those super-powerful laptops are a bit…boring from a critic’s standpoint. Why? Because they are almost always excellent machines (you’d hope so, at that price) and because most readers of any review I pen about an expensive gaming laptop will never buy one – again, because of the price.
Most folks – even many geeks – lust after a beefy gaming rig but end up buying a $600 to $1,000 multimedia laptop. This is the laptop the average person can actually afford, regardless of his or her enthusiasm for computer hardware.
In the past, this market segment was a gaming wasteland, but that began to change about five years ago. The change was due in part to the fact that many game developers started to veer away from jaw-dropping graphics in favor of expanding their potential markets by targeting customers with average, mid-range hardware.
About two and a half years ago, Intel (again) committed to raising the bar on integrated graphics with the release of Intel HD Graphics, and it has since consistently improved its IGP offering with each new generation. AMD has done the same with its Fusion products, and Nvidia (already in the game with its numerous x10/x20/x30M products) has just recommitted to power-efficient GPUs with its Kepler architecture.
These changes mean that “serious” gaming is now possible on an inexpensive laptop. But how possible? What sacrifices do you make and how do low-end IGPs and GPUs stack up against each other?
To find out we are going to compare several different products in this space. We have a pair of Intel HD 4000 IGPs. One of the IGPs is from the ultrabook reference platform and the other is from the quad-core Ivy Bridge reference platform. There is also the Radeon HD 7660G from the AMD Trinity reference platform and the Nvidia GT 630M from the Ivy Bridge reference platform.
Finally, we’re going to throw in the Nvidia GT 640M from the Acer Aspire Timeline Ultra M3 (which we recently reviewed) as a representative of Nvidia’s new Kepler architecture and what you receive when you pay a bit more for a (discrete) mid-range mobile GPU.
We’re not just going to dive into the performance testing, however. Features like the driver utility, switchable graphics and multi-monitor support are important as well. Let’s address those aspects first.
The Driver Utility
AMD, Intel and Nvidia offer a driver utility that controls the hardware in each respective system. The basic purpose of each utility is the same, but the way that each company approaches the problem of simplifying somewhat complex driver controls is different.
AMD goes for an approach that groups features into sub-menus organized on a sidebar. This approach seems simple at first, and it is certainly the most attractive. However, as you begin to use it, you’ll likely find that AMD’s driver utility is a bit of a pain to navigate. A lot of space is wasted on slick-looking interface elements, which leaves room for only a limited number of options per section.
In addition, it is not always obvious where a specific feature might be found in a menu, partially because the sidebars collapse by default, and partially because some of AMD’s terms for its features are confusing (I’m looking at you, HydraVision). I have a lot of experience with the Catalyst Control Center because I’ve used AMD cards in my desktop for the last three years. Despite this, I still find myself feeling a little lost from time to time.
Nvidia’s solution looks less elegant. The control panel is a large window with simple text navigation that would look at home in Windows XP. All of the basic settings are lumped together under a few categories, and most of them are found in either “image settings” or “3D settings.” There are just a few lines of text here and there to explain what the controls do, and many are not explained at all.
Despite this, Nvidia’s control panel is much easier to use over the long run. Nvidia has wisely realized that no amount of explanation or software design is going to make it easy for the layman to understand the difference between turning Trilinear Filtering on or off. Most people will simply ignore such settings or, if they are interested, turn to Google. Once you become acquainted with the features of Nvidia’s driver software, you can return to and adjust those features quickly.
And then we have Intel. Because there are fewer features to worry about, Intel’s drivers should be fairly simple. Yet the company has made its software needlessly complex. Opening the Intel driver software gives you an absurd choice – do you want to use basic mode, advanced mode or wizard mode? I’m serious. Intel has designed three different interfaces for controlling its graphics drivers.
Wizard mode doesn’t seem to do anything that Windows’ own display properties dialog doesn’t, so I’m not sure why Intel bothered to include it. Basic mode, on the other hand, is just like advanced mode but with a few of the more complex features taken out. Most users will want to select advanced mode, check the “don’t show this dialog again” box and be done with it.
The good news is that advanced mode is pretty good. In fact, it’s a nice balance between AMD’s attempt to be user-friendly and Nvidia’s focus on functionality. The interface is attractive, navigation makes sense and the options are easy to use.
Overall, Nvidia’s software is the most functional for enthusiasts, while Intel’s is the easiest for the average user to understand. AMD is way behind. Although initially attractive, the Catalyst Control Center is confusing for all users.