Introduction, Driver Interface
Which inexpensive IGP or discrete GPU offers the best value?
There exists a particular group of gamers consumed by dreams of gigantic dual-SLI laptops that replace towering desktops. And who can blame them? Walking into a LAN party with a $5,000 laptop under your arm is the geek equivalent of arriving at a party in a $2,500 jacket or driving through your neighborhood in a $250,000 car. We can dream, right?
On the other hand, those super-powerful laptops are a bit…boring from a critic’s standpoint. Why? Because they are almost always excellent machines (at that price, they should be) and because most readers who gander at a review I pen of an expensive gaming laptop will never buy one – again, due to the price.
Most folks – even many geeks – lust over a beefy gaming rig, but end up buying a $600 to $1,000 multimedia laptop. This is the laptop that the average person can actually afford, regardless of his or her enthusiasm for computer hardware.
In the past, this market segment was a gaming wasteland, but that began to change about five years ago, due in part to the fact that many game developers started to veer away from a singular focus on jaw-dropping graphics in favor of expanding their potential markets by courting players with average, mid-range hardware.
About two and a half years ago, Intel recommitted to raising the bar on integrated graphics with the release of Intel HD, and it has since improved its IGP offering with each new generation. AMD has done the same with its Fusion products, and Nvidia (already in the game with its numerous x10/x20/x30M products) just recommitted to power-efficient GPUs with its Kepler architecture.
These changes mean that “serious” gaming is now possible on an inexpensive laptop. But how possible? What sacrifices do you make and how do low-end IGPs and GPUs stack up against each other?
To find out, we are going to compare several different products in this space. We have a pair of Intel HD 4000 IGPs: one from the ultrabook reference platform and the other from the quad-core Ivy Bridge reference platform. There is also the Radeon HD 7660G from the AMD Trinity reference platform and the Nvidia GT 630M from the Ivy Bridge reference platform.
Finally, we’re going to throw in the Nvidia GT 640M from the Acer Aspire Timeline Ultra M3 (which we recently reviewed) as a representative of Nvidia’s new Kepler architecture and what you receive when you pay a bit more for a (discrete) mid-range mobile GPU.
We’re not going to dive straight into the performance testing, however. Features like the driver utility, switchable graphics and multi-monitor support are important as well, so let’s address those aspects first.
The Driver Utility
AMD, Intel and Nvidia each offer a driver utility that controls the hardware in their respective systems. The basic purpose of each utility is the same, but each company approaches the problem of simplifying somewhat complex driver controls differently.
AMD groups features into various sub-menus organized on a sidebar. This approach seems simple at first, and it is certainly the most attractive. However, as you begin to use it, you’ll likely find that AMD’s driver utility is a bit of a pain to navigate. A lot of space is wasted on slick-looking interface elements, which results in a limited number of options per section.
In addition, it is not always obvious where a specific feature might be found in a menu, partially because the sidebars collapse by default, and partially because some of AMD’s terms for its features are confusing (I’m looking at you, HydraVision). I have a lot of experience with the Catalyst Control Center because I’ve used AMD cards in my desktop for the last three years. Despite this, I still find myself feeling a little lost from time to time.
Nvidia’s solution looks less elegant. The control panel is a large window with simple text navigation that would look at home in Windows XP. All of the basic settings are lumped together under a few categories, most of them found in either “image settings” or “3D settings.” There are just a few lines of text here and there to explain what the controls do, and many are not explained at all.
Despite this, Nvidia’s control panel is much easier to use over the long run. Nvidia has wisely realized that no amount of explanation or software design is going to make it easy for the layman to understand the difference between turning trilinear filtering on or off. Most people will simply ignore such settings or, if they are interested, turn to Google. Once you become acquainted with Nvidia’s driver software, you can return to its features and adjust them quickly.
And then we have Intel. Because there are fewer features to worry about, Intel’s drivers should be fairly simple. Yet the company has made its software needlessly complex. Opening the Intel driver software gives you an absurd choice – do you want to use basic mode, advanced mode or wizard mode? I’m serious. Intel has designed three different interfaces for controlling its graphics drivers.
Wizard mode doesn’t seem to do anything that Windows’ own display properties don’t, so I’m not sure why Intel bothered to include it. Basic mode, on the other hand, is just like advanced mode but with a few of the more complex features taken out. Most users will want to select advanced mode, check the “don’t show this dialog again” box and be done with it.
The good news is that advanced mode is pretty good. In fact, it’s a nice balance between AMD’s attempt to be user-friendly and Nvidia’s focus on functionality. The interface is attractive, navigation makes sense and the options are easy to use.
Overall, Nvidia’s software is the most functional for enthusiasts, while Intel’s is the easiest for the average user to understand. AMD is way behind. Although initially attractive, the Catalyst Control Center is confusing for all users.
Nice work, I was glad to see you go with higher LODs than we’ve been seeing in reviews. I hope this is something that you’ll update with more games and chips (like lesser Trinity models) when time and availability permit.
How much slower is a SB HD 3000 compared to an IB HD 4000?
Well, I didn’t do testing that would provide a truly accurate figure, but I’d say 30 to 50% slower depending on the game. That’s compared to the Intel HD 4000 in the Core i7-3720QM, not the ultrabook version.
Not that much.
I have a Core i5 laptop with HD 3000 graphics and it plays Diablo 3 at 24+ fps on low settings at a resolution of 1200×720. That makes it very playable.
I have yet to try BF3 and Skyrim, but they should perform similarly to (though slightly worse than) the HD 4000.
Interesting article. I’m in the market for a new laptop and want it to be able to handle SC2 reasonably well, but I still lean toward the thin and portable side, though not necessarily an ultrabook. Glad to hear that Optimus is doing well and is seamless, as I hadn’t really stayed on top of it. Will any laptop that has an Ivy Bridge/Sandy Bridge CPU and an Nvidia GPU have Optimus, or only if it specifically indicates so? That is, are there other components that need to be in place that might not be there if it isn’t indicated?
I believe all of the 600-series GPUs and most of the 500-series GPUs have Optimus. You should still check the laptop description to be sure, as it will almost always be listed as a feature.
3D Vision enabled laptops are the exception (there is apparently a difficulty with offering 3D Vision and Optimus together) but you’re probably not looking for that anyway.
Why test these budget-oriented notebooks at high/medium settings? They should’ve been tested at low settings, so that we’d at least know whether they can handle the games.
As it is, all I learned is that the HD 4000 can’t handle medium settings, ever.
I don’t find it interesting to discover if a laptop can run a game at settings that make the game look like ass.
I am not unsympathetic if you disagree. After all, money is not unlimited and maybe you want to know if a laptop can just barely play that game you have your eye on.
But that’s not the perspective I decided on for this article.
I applaud your decision to try gaming at higher settings. For benchmarks at the lowest settings, try every other review on the web.
Now, seeing the ASUS ultrabook with the GT 620M in it, I wish this comparison had also included that chip, as I’m wondering if the model that has it is worth getting. Oh well.
We have not reviewed a GT 620M-equipped laptop, but we did review the Dell XPS 15z, which had the similar GT 525M. It could play a lot of games at 1366×768 and low/medium settings, but that’s the limit for new titles. For example, it achieved about 25 FPS in BF3 at 1366×768, but dropped to an unplayable 15 FPS at 1080p.
Thanks Matt.
I was looking at getting an Acer laptop with a 630M. Can anyone confirm whether it can play Diablo 3 on Inferno at at least low settings with no lag?
I have a stock Acer Aspire X3990 with Windows 7. Some people say that the Intel HD Graphics won’t run Dawn of War 2 even on low, while others say it will. Do you believe it will run, even at the lowest quality settings?