Light – Left 4 Dead 2
Settings for Left 4 Dead 2 were 1920×1080 + Very High Settings, No AA
The first thing to note is that there does not appear to be a big difference between the FRAPS-based results in our IGP testing and the results found in our Frame Rating system – the frame times are consistent and pretty reliable across the systems, as you would expect with only single GPUs at work. The performance of these platforms in L4D2 is interesting though, starting with Richland: it takes a sizeable performance lead over the rest of the pack, hitting around 65 FPS on average while also producing the most consistent frame times.
The Trinity system came in second here with an average frame rate around 40 FPS, followed by the MacBook Air with HD 5000 graphics at 38 FPS and trailed by the MSI GE40 with HD 4600 at 31 FPS. But take a look at the Frame Times graph and compare the green, black, and pink lines. The pink line, representing the Core i7-4702MQ + HD 4600 combination, definitely has the tightest frame times, resulting in the second most reliable frame rates (after the Richland system). Both the green and black lines, Trinity and the MBA (HD 5000), had some spikes in frame times that are indicative of stutters in the gameplay. Above the 90th percentile, both of those systems were exhibiting more than 4 ms of frame variance, meaning you will likely see the animation smoothness affected.
This tells me that even though the average frame rate of the MBA is nearly as high as that of the AMD A10-4600M, both systems share a lot of frame time issues that are probably the result of the slower CPUs in each configuration. And though the HD 4600 configuration is slower than the MBA with the HD 5000 IGP, it delivers a smoother experience, just at a lower average frame rate.
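For readers curious how numbers like these are derived, here is a minimal sketch of the idea in Python, assuming a simple FRAPS-style list of per-frame render times. The values are placeholders rather than data from this review, and the variance calculation is a rough frame-to-frame proxy, not the exact Frame Rating math.

    # Minimal sketch: deriving average FPS and a frame variance figure
    # from a list of per-frame render times. Placeholder data only.
    frame_times_ms = [15.2, 16.1, 15.8, 24.9, 15.5, 16.3, 15.9, 31.0, 15.7, 16.0]

    # Average frame rate: total frames over total elapsed time
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

    # Frame-to-frame deltas as a rough proxy for perceived stutter
    deltas = sorted(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

    # 90th percentile of those deltas; spikes above ~4 ms are the kind
    # of variance called out for the Trinity and HD 5000 systems above
    p90_variance = deltas[int(0.9 * (len(deltas) - 1))]

    print(f"Average FPS: {avg_fps:.1f}")            # ~54.8 with this data
    print(f"90th percentile variance: {p90_variance:.1f} ms")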
Good reviews Ryan, and as you say in the Performance per Watt and Final Thoughts section: "Clearly we need to get in a Richland based notebook to see where it stands for the mobile market."
I couldn't agree more; it will be interesting to see those results.
Good work.
Nice article Ryan, quick question though: on the Methodology page, should the following sentence have the word "NOT" in it where I inserted it (capitalized for ease of finding)?
“It might not stand out initially, but testing on the integrated panels was NOT possible with this testing; in order to get a capture of the graphical output from the system we need to intercept the signal from the GPU to the display and record it.”
Yup!
2 out of the 3 laptops that I own never get Intel HD Graphics driver updates from the OEMs, who customized the Intel HD Graphics drivers and then never update them! Intel lets the OEMs customize their Intel HD Graphics drivers, and once these drivers are customized by the OEM, Intel cannot update them; it becomes the OEM's responsibility to update the OEM-modded drivers! 2 of the 3 laptops that I own will never get updated graphics drivers. The big question here is: what good is Intel graphics without updates? Intel's record here, through letting laptop OEMs customize the Intel HD Graphics drivers and the OEMs never updating them, is piss poor! Intel needs to make the laptop OEMs use Intel's generic HD Graphics drivers, which can be updated by Intel, or Intel needs to require laptop OEMs to keep their customized Intel HD Graphics drivers updated on a regular basis! Without proper graphics drivers and graphics driver update support, Intel graphics cannot be trusted for gaming, no matter how good Intel's graphics hardware is!
"And to be completely fair to Intel, I think that it was able to hold its own in terms of performance per watt of TDP." – HAHAHA, as if ANYONE needs to be fair to INTEL!!!
So Ryan, what's better for a budget game box?
An A10-6800K vs. a low-end dedicated GPU such as a GTX 750 for 1920×1080 gaming?
Well the 750 will be faster, but it's a bigger configuration and will use more power / create more heat.
For gaming I would do hybrid CrossFire by adding a low- to mid-range card.
I think if you are going to put up a graph on performance/watt then you should put up a performance/price graph as well. These two graphs are the main points of any hardware comparison and go hand in hand with each other; you can't have one without the other. All your hardware reviews should have performance/watt and performance/price graphs included in them. Those are the best scales to use when comparing hardware. I usually have to make my own graphs because most sites don't do that, and when they do it's only with a couple pieces of hardware.

Another suggestion would be to include older hardware in benchmark comparisons, because most people buying new components are upgrading from something older. There is really no reason not to include older hardware, since without it the end user can't tell how much faster the new card is than theirs. I think the best benchmark comparisons should include a high-end, mainstream, and low-cost part from each team, from the current components back at least 2 generations. There are plenty of benchmarks of same-generation hardware and plenty of places to find them; what is hard to find is how a new component is going to perform compared to someone's old component. Just a thought.
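For what it's worth, once a review publishes average frame rates, TDPs, and street prices, both ratios are trivial to compute. Here is a tiny Python sketch of the kind of table the comment asks for; every figure is a made-up placeholder, not a number from this article.

    # Hypothetical parts mapped to (avg_fps, tdp_watts, price_usd) --
    # placeholder values for illustration only, not measured results
    parts = {
        "Part A": (65.0, 35, 140),
        "Part B": (40.0, 35, 120),
        "Part C": (38.0, 15, 300),
    }

    print(f"{'Part':<8}{'FPS/W':>8}{'FPS/$':>8}")
    for name, (fps, watts, price) in parts.items():
        # Performance per watt and per dollar are simple ratios
        print(f"{name:<8}{fps / watts:>8.2f}{fps / price:>8.3f}")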
@ryan: I did not follow the whole series of articles, hence my question.
I believe one angle wasn't covered that might affect results: pre-MSI, post-MSI, and the latest MSI-X (message signaled interrupts).
From what I read, MSI-X or the newest possible mode is recommended, but I can't find one mono that doesn't hybridize (use pre-MSI alongside MSI-X or MSI). Could you suggest a mono that does 100% MSI-X? If you're bored it would be a nice article to do, since it should definitely alter most results you had so far (if you didn't think about interrupts!).
Not mono but motherboard
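On the interrupt question above: for anyone who wants to see what their own machine is doing, one rough way on Linux is to scan lspci -v output for MSI and MSI-X capability lines. This is a sketch under the assumption that lspci is installed, not a definitive tool; run it as root so the Enable+/- flags are visible for all devices.

    # Sketch: list which PCI devices advertise MSI / MSI-X and whether the
    # capability is enabled, by parsing `lspci -v`. Linux only; needs lspci.
    import subprocess

    out = subprocess.run(["lspci", "-v"], capture_output=True, text=True).stdout

    device = None
    for line in out.splitlines():
        if line and not line[0].isspace():
            device = line.strip()   # device header, e.g. "00:02.0 VGA compatible controller: ..."
        elif "MSI-X:" in line or "MSI:" in line:
            # Capability lines look like "Capabilities: [a0] MSI-X: Enable+ Count=16 ..."
            print(device)
            print("   ", line.strip())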