Medium – Skyrim
The Elder Scrolls V: Skyrim (DirectX 9)
The Empire of Tamriel is on the edge. The High King of Skyrim has been murdered.
Alliances form as claims to the throne are made. In the midst of this conflict, a far more dangerous, ancient evil is awakened. Dragons, long lost to the passages of the Elder Scrolls, have returned to Tamriel.
The future of Skyrim, even the Empire itself, hangs in the balance as they wait for the prophesized Dragonborn to come; a hero born with the power of The Voice, and the only one who can stand amongst the dragons.
Actual settings for our testing today were 1920×1080 with the Low preset.
Here is a video of our testing run-through, for your reference:
Skyrim is known to be heavily CPU dependent, but at 1920×1080 with this preset the results still scale very well with different GPU configurations. Once again we see that none of the IGP solutions are showing runts or dropped frames, so the FRAPS results and Frame Rating results are fairly close. Richland dominates the performance results here with an average frame rate of about 57 FPS, followed by the Trinity notebook at 38 FPS, the MBA / HD 5000 at 32 FPS and then the GE40 / HD 4600 at 28 FPS.
All four configurations have decent frame times in our testing, with variance that stays below the 4 ms mark until the 95th percentile or so (on Trinity). Trinity definitely has the hardest time keeping a consistent frame rate in Skyrim, as you can see in the Frame Times graph with the black bar – it is consistently wider than the other competitors in my testing. This is one case, though, where even the MacBook Air’s dual-core processor with the HD 5000 is able to reach a faster average frame rate than the quad-core Core i7 + HD 4600 while also maintaining lower frame time variance (for the most part). This means the MBA can play Skyrim faster than, and just as smoothly as, the Core i7-4702MQ + HD 4600.
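For readers who want to run a similar sanity check on their own logs, here is a minimal sketch – not our Frame Rating capture pipeline – that assumes a one-column CSV of per-frame render times in milliseconds (the file name "frametimes.csv" is a hypothetical placeholder). It reports the average frame rate and the 95th percentile of frame-to-frame variance, the two numbers discussed above.

# Minimal sketch, assuming a FRAPS-style one-column CSV of frame times in ms.
# "frametimes.csv" is a hypothetical file name, not part of this review's tooling.
import csv

def load_frame_times(path):
    """Read frame times (ms), one value per row."""
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f) if row]

def summarize(times_ms):
    """Return (average FPS, 95th percentile of |t[i+1] - t[i]| in ms)."""
    avg_ms = sum(times_ms) / len(times_ms)
    diffs = sorted(abs(b - a) for a, b in zip(times_ms, times_ms[1:]))
    p95_variance = diffs[int(0.95 * (len(diffs) - 1))]
    return 1000.0 / avg_ms, p95_variance

if __name__ == "__main__":
    fps, p95 = summarize(load_frame_times("frametimes.csv"))
    print(f"Average FPS: {fps:.1f}  |  95th percentile frame variance: {p95:.2f} ms")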
Good review, Ryan, and as you say in the “Performance per Watt and Final Thoughts” section: “Clearly we need to get in a Richland based notebook to see where it stands for the mobile market.”
I couldn’t agree more; it will be interesting to see those results.
Good work.
Nice article Ryan, quick question though: on the Methodology page, should the following sentence have the word “NOT” in it where I inserted it (capitalized for ease of finding it)?
“It might not stand out initially, but testing on the integrated panels was NOT possible with this testing; in order to get a capture of the graphical output from the system we need to intercept the signal from the GPU to the display and record it.”
Yup!
2 out of the 3 laptops that I own never get Intel HD Graphics driver updates from the OEMs [who customized the Intel HD graphics drivers, then never update them!]. Intel lets the OEMs customize their Intel HD graphics drivers, and once these drivers are customized by the OEM, Intel cannot update them; it becomes the OEM’s responsibility to update the OEM-modded Intel HD graphics drivers! 2 of the 3 laptops that I own will never get updated graphics drivers. The big question here is: what good is Intel graphics without updates? Intel’s record here, through letting laptop OEMs customize the Intel HD graphics drivers and OEMs never updating their customized drivers, is piss poor! Intel needs to make the laptop OEMs use Intel generic HD graphics drivers, which can be updated by Intel, or Intel needs to require laptop OEMs to keep the OEM-customized Intel HD graphics drivers updated on a regular basis! Without proper graphics drivers and graphics driver update support, Intel graphics cannot be trusted for gaming, no matter how good Intel’s graphics hardware is!
“And to be completely fair to Intel, I think that it was able to hold its own in terms of performance per watt of TDP.” –HAHAHA as if ANYONE needs to be fair to INTEL!!!
So Ryan, what’s better for a budget game box?
An A10-6800K vs. a low-end dedicated GPU such as a GTX 750 for 1920×1080 gaming?
Well the 750 will be faster, but it's a bigger configuration and will use more power / create more heat.
For gaming I would do hybrid crossfire by adding a low to mid range card.
I think if you are going to put up a graph on performance/watt then you should put up a performance/price graph as well. These two graphs are the main points of any hardware comparison and go hand in hand with each other; you can’t have one without the other. All your hardware reviews should have performance/watt and performance/price graphs included in them – those are the best scales to use when comparing hardware. I usually have to make my own graphs because most sites don’t do that, and when they do it’s only with a couple pieces of hardware. Another suggestion would be to include older hardware in benchmark comparisons, because most people buying new components are upgrading from something older. There is really no point in leaving older hardware out, because then the end user can’t tell how much faster the new card is than theirs. I think the best benchmark comparisons should include a high-end, mainstream and low-cost part from each team, from the current components back at least 2 generations. There are plenty of benchmarks comparing hardware of the same generation and plenty of places to find them, but what is hard to find is how a new component will perform compared to someone’s old component. Just a thought.
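As a quick illustration of what such a chart boils down to, here is a minimal sketch of the two ratios the comment asks for; the part names, frame rates, TDPs and prices below are hypothetical placeholders, not figures from this review.

# Minimal sketch of performance/watt and performance/price ratios.
# All names and numbers are hypothetical placeholders for illustration only.
parts = {
    # name: (average FPS, TDP in watts, street price in USD)
    "Example APU": (38.0, 35, 140),
    "Example CPU + IGP": (28.0, 37, 300),
}

for name, (fps, watts, price) in parts.items():
    print(f"{name}: {fps / watts:.2f} FPS per watt, {100 * fps / price:.1f} FPS per $100")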
@Ryan: I did not follow the whole series of articles, so here is my question.
I believe one angle wasn’t covered that might affect the results: pre-MSI, post-MSI and the latest MSI-X (Message Signaled Interrupts).
From what I read, MSI-X, or the newest mode possible, is recommended, but I can’t find one mono that does not hybridize (use pre-MSI alongside MSI-X or MSI). Could you suggest a mono that does 100% MSI-X? If you’re bored, it would be a nice article to do, since it should definitely alter most of the results you had so far (if you didn’t think about interrupts!).
Not mono but motherboard
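For anyone wanting to check which interrupt modes their own hardware advertises, here is a minimal sketch (Linux only, assuming lspci is installed; reading the Capabilities sections may require root). It simply parses lspci -vv output and lists the MSI / MSI-X capability lines per PCI device.

# Minimal sketch, Linux only: list which PCI devices advertise MSI or MSI-X
# capabilities by parsing `lspci -vv`. May need root to see Capabilities.
import subprocess

output = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

device = None
for line in output.splitlines():
    if line and not line[0].isspace():
        device = line.strip()           # device header, e.g. "00:02.0 VGA compatible controller: ..."
    elif "MSI:" in line or "MSI-X:" in line:
        print(device)
        print("   ", line.strip())      # e.g. "Capabilities: [90] MSI: Enable+ Count=1/1 ..."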