To keep results as consistent as possible (I am looking for potentially minor differences between system memory speeds), I selected games with dedicated benchmarks for the sake of exact repeatability. Each benchmark was run three times, with the system restarted between runs, and the results were averaged.
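The averaging step above is simple arithmetic, but as a minimal sketch of how per-run results might be combined (the FPS values below are hypothetical placeholders, not measured data):

```python
# Average the FPS reported by repeated benchmark runs, as described above.
# The run values are hypothetical examples, not actual test results.

def average_fps(runs):
    """Return the mean FPS across repeated benchmark runs."""
    if not runs:
        raise ValueError("need at least one run")
    return sum(runs) / len(runs)

# Example: three hypothetical runs at identical settings
runs = [59.8, 60.1, 60.0]
print(f"Average over {len(runs)} runs: {average_fps(runs):.1f} FPS")
```

Restarting the system between runs helps ensure caches and background state do not skew any single run before averaging.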
This new release from Square-Enix is built on the Unreal Engine, with the PC version developed by Nixxes Software. I used the game’s built-in benchmark with image quality maxed out via the “Very High” preset and 8x AA.
At 1920×1080 there was very minor scaling with memory speed, but not enough to be a real factor. The results at 2560×1440 were completely non-linear; I seemed to be hitting the limit of my GTX 770’s 2GB frame buffer during these higher-resolution runs, which caused some noticeably choppy gameplay.
Batman: Arkham Origins
This is another game based on the Unreal Engine, developed by Warner Montreal. The PC version appears to be a console port with some additional detail settings available. For these runs I set all detail options to their highest DX11 values, enabled PhysX on “high”, and set TXAA to “high”.
Arkham Origins was remarkably consistent, averaging almost exactly 60 FPS at 1920×1080 regardless of the memory used (as if showing the roots of this console port on PC). Beyond very slightly raising the maximum frame rate, memory speed had little effect; the remaining results were scattered with no clear pattern.
At 2560×1440 the results were unpredictable again, with the slowest memory speed actually producing the highest frame rates. Needless to say, memory speed did not meaningfully help this game’s performance on this system.