Heavy – Battlefield 3
Battlefield 3 (DirectX 11)
Battlefield 3™ leaps ahead of its time with the power of Frostbite 2, DICE's new cutting-edge game engine. This state-of-the-art technology is the foundation on which Battlefield 3 is built, delivering enhanced visual quality, a grand sense of scale, massive destruction, dynamic audio and character animation utilizing ANT technology as seen in the latest EA SPORTS™ games.
Frostbite 2 now enables deferred shading, dynamic global illumination and new streaming architecture. Sounds like tech talk? Play the game and experience the difference!
Actual settings for our testing today are 1680×1050, Low Preset
Here is our testing run through the game, for your reference.
BF3, even at the Low preset, is definitely a stretch for these mobile platforms, but it is worth looking at the results to judge the performance of these integrated solutions. The Richland desktop APU was able to average 27 FPS in our run-through while all of the other systems ran under the 20 FPS mark. Frame times were consistent throughout our testing, with a heavy load placed on the GPU, though all of the notebooks crossed the 4 ms mark at the 90th percentile.
Though the numbers are low, the Trinity APU has the best notebook performance, hitting 17 FPS and staying just ahead of the Core i7-4702MQ + HD 4600 and Core i5-4250U + HD 5000 combinations.
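For readers who want to sanity-check numbers like these against their own frame-time logs, here is a minimal sketch of the arithmetic behind an average FPS figure and a 90th-percentile frame time. It is not the capture setup described on our Methodology page, and the sample frame times below are invented for illustration.

    # Minimal sketch: average FPS and 90th-percentile frame time.
    # The sample frame times are invented, not our captured data.
    frame_times_ms = [33.1, 35.0, 34.2, 52.7, 36.8, 33.9, 61.3, 34.5, 35.6, 38.0]

    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

    # Nearest-rank 90th percentile: the frame time that 90% of the
    # frames come in at or under.
    ranked = sorted(frame_times_ms)
    rank = max(1, int(round(0.9 * len(ranked))))
    p90_ms = ranked[rank - 1]

    print(f"Average FPS: {avg_fps:.1f}")
    print(f"90th-percentile frame time: {p90_ms:.1f} ms")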
Good reviews, Ryan, and as you say in the Performance per Watt and Final Thoughts section: “Clearly we need to get in a Richland based notebook to see where it stands for the mobile market.”
I couldn’t agree more; it will be interesting to see those results.
Good work.
Nice article Ryan, quick question though: on the Methodology page, should the following sentence have the word “NOT” in it where I inserted it (capitalized for ease in finding it)?
“It might not stand out initially, but testing on the integrated panels was NOT possible with this testing; in order to get a capture of the graphical output from the system we need to intercept the signal from the GPU to the display and record it.”
Yup!
Two out of the three laptops that I own never get Intel HD Graphics driver updates from the OEMs, who customized the Intel HD Graphics drivers and then never update them. Intel lets the OEMs customize their Intel HD Graphics drivers, and once these drivers are customized by the OEM, Intel cannot update them; it becomes the OEM’s responsibility to update the OEM-modded drivers. Two of the three laptops that I own will never get updated graphics drivers. The big question here is: what good is Intel graphics without updates? Intel’s record here, through letting laptop OEMs customize the Intel HD Graphics drivers and then never update them, is piss poor! Intel needs to make the laptop OEMs use Intel’s generic HD Graphics drivers, which can be updated by Intel, or Intel needs to require laptop OEMs to keep their customized drivers updated on a regular basis! Without proper graphics drivers, and graphics driver update support, Intel graphics cannot be trusted for gaming, no matter how good Intel’s graphics hardware is!
“And to be completely fair to Intel, I think that it was able to hold its own in terms of performance per watt of TDP.” –HAHAHA as if ANYONE needs to be fair to INTEL!!!
So Ryan, what’s better for a budget game box?
An A10-6800K, or a low-end dedicated GPU such as a GTX 750, for 1920×1080 gaming?
Well, the 750 will be faster, but it's a bigger configuration and will use more power / create more heat.
For gaming I would do Hybrid CrossFire by adding a low- to mid-range card.
I think if you are going to put up a graph of performance per watt then you should put up a performance-per-price graph as well. These two graphs are the main points of any hardware comparison and go hand in hand; you can’t have one without the other. All your hardware reviews should include performance-per-watt and performance-per-price graphs. Those are the best scales to use when comparing hardware. I usually have to make my own graphs because most sites don’t do that, and when they do it’s only with a couple of pieces of hardware.

Another suggestion would be to include older hardware in benchmark comparisons, because most people buying new components are upgrading from something older. There is really no reason not to include older hardware, because otherwise the end user can’t tell how much faster the new card is over theirs. I think the best benchmark comparisons should include a high-end, a mainstream, and a low-cost part from each team, from the current components back at least two generations. There are plenty of benchmarks of hardware within the same generation, and plenty of places to find them, but what is hard to find is how a new component will perform compared to someone’s old component. Just a thought.
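For what it’s worth, the two scales this comment asks for are simple ratios once you have an average FPS. Here is a minimal sketch that reuses the 27 FPS and 17 FPS averages from this page, but note that the TDP and price figures are rough placeholders, not measured or quoted values:

    # Performance per watt and per dollar as simple ratios.
    # FPS averages are from this page; TDP and price are placeholders.
    systems = {
        "Desktop APU":      {"fps": 27.0, "tdp_w": 100.0, "price_usd": 140.0},
        "Notebook CPU+iGP": {"fps": 17.0, "tdp_w": 35.0,  "price_usd": 380.0},
    }

    for name, s in systems.items():
        print(f"{name}: {s['fps'] / s['tdp_w']:.3f} FPS/W, "
              f"{s['fps'] / s['price_usd']:.3f} FPS/$")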
@Ryan: I did not follow the whole series of articles, so here is my question.
I believe one angle wasn’t covered that might affect the results: pre-MSI (legacy line-based interrupts), MSI, and the latest MSI-X (message signaled interrupts).
From what I read, MSI-X, or the newest mode possible, is recommended, but I can’t find one motherboard that does not hybridize (use pre-MSI alongside MSI or MSI-X). Could you suggest a motherboard that does 100% MSI-X? If you’re bored it would be a nice article to do, since it could definitely alter most of the results you have had so far (if you didn’t think about interrupts!).
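For anyone curious which interrupt mode their own hardware is actually running, here is a hedged sketch for Linux. It assumes the standard sysfs layout, where a PCI device gains an msi_irqs directory once MSI or MSI-X is enabled, containing one file per vector whose contents name the mode:

    # Hedged sketch (Linux, standard sysfs layout assumed): list which
    # PCI devices currently have MSI or MSI-X vectors allocated.
    from pathlib import Path

    for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
        msi_dir = dev / "msi_irqs"
        if not msi_dir.is_dir():
            # No MSI/MSI-X vectors: legacy INTx, or no interrupts at all.
            print(f"{dev.name}: no MSI/MSI-X (legacy INTx or none)")
            continue
        # Each file is named after an IRQ number; its contents are the
        # mode string, "msi" or "msix".
        modes = {f.read_text().strip() for f in msi_dir.iterdir()}
        print(f"{dev.name}: {', '.join(sorted(modes))}")

On Windows, the rough equivalent is the MSISupported value under a device’s Interrupt Management\MessageSignaledInterruptProperties registry key.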