Switchable Graphics, Multi-Monitor Support
Switchable Graphics
The idea of switchable graphics was just barely "a thing" two years ago. Now it is found in nearly all consumer laptops with discrete graphics. It’s one of those rare features that actually was a “game changer.” Battery life among discrete graphics laptops has risen from abysmal to excellent in just a couple of years thanks in part to this technology.
AMD has its own solution for switchable graphics, as does NVIDIA, but Intel, which makes only integrated graphics, does not. AMD simply calls its solution AMD Switchable Graphics Technology, while NVIDIA calls its solution Optimus.
The basics of each solution are the same when a discrete GPU is paired with Intel HD Graphics: unless its services are required, the discrete GPU remains off, or at least effectively so. My battery life testing of laptops with switchable graphics indicates that the latest implementations from AMD and NVIDIA impose no detectable endurance penalty when the discrete GPU is not in use.
Where the two solutions diverge is driver support. I’ve encountered far more laptops with Optimus than with AMD Switchable Graphics, and my experience has led me to think that is for the best. Optimus, as it is implemented today, is wonderfully seamless. You can muck around with graphics settings if you want, but it’s not required. Everything just works without user input. You start a game and the GPU turns on. You leave the game and it turns off. Instantly, with no flicker and no prompts.
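For the curious, this per-application switching is not purely a matter of driver profiles. NVIDIA documents a way for a Windows program to opt in on its own: exporting a specially named global from the executable tells the Optimus driver to prefer the discrete GPU. A minimal sketch, based on NVIDIA's published Optimus guidance (the `#ifdef` is only so the snippet compiles on any platform; outside Windows the symbol has no effect):

```c
/* Sketch: the documented Optimus opt-in. A Windows .exe that exports
 * this symbol with value 0x00000001 asks the NVIDIA driver to render
 * it on the discrete GPU, even if no driver profile exists for it. */
#ifdef _WIN32
__declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
#else
/* Plain global elsewhere; only the Windows NVIDIA driver reads it. */
unsigned long NvOptimusEnablement = 0x00000001;
#endif
```

Game developers use this so their titles "just work" on Optimus laptops; everything else, including powering the GPU back down afterward, is handled by the driver.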
AMD’s drivers are more confusing because they often prompt the user for input. This is simply a re-hash of AMD’s general user interface issue (as mentioned previously). All of it seems as if it should be easy. The prompts are self-explanatory and there’s even a red/yellow/green color-code scheme that indicates what software does and doesn’t use the discrete GPU. But the constant nagging grows old and navigating the driver software requires more effort than it should.
The new Fusion products are an exception because they do not switch between a discrete GPU and an IGP at all. AMD Fusion laptops use the IGP all of the time, unless they have a separate AMD discrete GPU connected via CrossFire. With no switch to make, there’s no transition, and the result is just as seamless as what NVIDIA offers with Optimus.
This is a non-trivial point. AMD’s new APUs make a decent case for themselves for the very budget conscious and brush aside some of AMD’s low-end discrete GPUs as a result. Buyers looking at an inexpensive multimedia laptop will actually receive a better user experience than those looking at a more expensive laptop with one of AMD’s discrete GPUs.
Multi-Monitor Support
AMD, Intel and Nvidia all support multiple monitors with their latest generation of graphics hardware.
Eyefinity can technically support up to six displays on the desktop but has always been more limited on laptops. With Trinity, the maximum has been raised to four displays. That makes it the current king of the hill, but you will likely have trouble finding a laptop with the proper outputs. (The upcoming MSI GX60 will support up to three displays, but that is not common.)
NVIDIA gave me an interesting response when I asked them about display count. They stated that because the IGP is the default for almost everything besides games in a system using Optimus switchable graphics, laptops with NVIDIA Optimus chips will default to supporting whatever monitors the IGP can handle.
Which brings us to Intel HD 4000. It’s capable of handling up to three displays, which is an upgrade from Intel HD 3000’s two displays. One of the displays can be 2560×1600 but the others are restricted to 1920×1200.
Nice work, I was glad to see you go with higher LODs than we’ve been seeing in reviews. I hope this is something that you’ll update with more games and chips (like lesser Trinity models) when time and availability permit.
how much slower is a sb hd3000 compared to ib hd4000?
Well, I didn’t do testing that would provide a truly accurate figure, but I’d say 30 to 50% slower depending on the game. That’s compared to the Intel HD 4000 in the Core i7-3720QM, not the ultrabook version.
not that much.
I have a core i5 laptop with HD3000 and it plays Diablo 3 at 24+ fps at low settings and a resolution of 1200×720. Makes it very playable.
Have yet to try BF3 and Skyrim as I have those, but they should perform similarly to (though slightly worse than) the HD4000.
Interesting article. I’m in the market for a new laptop and want it to be able to handle SC2 reasonably well, but still lean to the thin and portable side, though not necessarily an ultrabook. Glad to hear that Optimus is doing well and seamless as I hadn’t really stayed on top of it. Will any laptop that has an ivy/sandybridge CPU and a nvidia gpu have optimus or only if it specifically indicates so? That is, are there other components that need to be in place that might not be in place if not indicated?
I believe all of the 600-series GPUs and most of the 500-series GPUs have Optimus. You should still check the laptop description to be sure, as it will almost always be listed as a feature.
3D Vision enabled laptops are the exception (there is apparently a difficulty with offering 3D Vision and Optimus together) but you’re probably not looking for that anyway.
Why test these budget oriented notebooks on high/medium settings? They should’ve been tested on Low settings, so that we’d at least know if they can handle the games.
As it is, all I learned is that HD4000 can’t handle medium, ever.
I don’t find it interesting to discover if a laptop can run a game at settings that make the game look like ass.
I am not unsympathetic if you disagree. After all, money is not unlimited and maybe you want to know if a laptop can just barely play that game you have your eye on.
But that’s not the perspective I decided on for this article.
I applaud your decision to try gaming at higher settings. For benchmarks at the lowest settings, try every other review on the web.
Now seeing the ASUS ultrabook with the GT 620M in it, I wished this also had that chip in its comparison, as I’m wondering if it is worth getting the model that has it. Oh well.
We have not reviewed a GT 620M equipped laptop but we did review the Dell XPS 15z, which had the similar GT 525M. It could play a lot of games at 1366×768 and low/medium settings but that’s the limit for new titles. For example, it achieved about 25 FPS in BF3 at 1366×768, but went down to an unplayable 15 FPS at 1080p.
Thanks Matt.
I was looking at getting an Acer laptop with a 630M. Can anyone confirm if it can play Diablo 3 on Inferno with at least low settings and no lag?
I have a default Acer Aspire x3990 with Windows 7. Some people say that the Intel HD Graphics won’t run Dawn of War 2 even on low, others say it can. Do you believe it will run, even at the lowest quality settings?