Using the GPU embedded in the vast majority of modern processors is a good way to reduce the price of an entry-level system, as indeed is choosing Linux for your OS. Your performance is not going to match that of a system with a discrete GPU, but with the newer GPU cores available you will be doing much better than in the old days of the IGP. The first portion of Phoronix's review of the Skylake GPU covers the various driver versions you can choose from, while the rest compares Kaveri, Godavari, Haswell, and Broadwell to the new HD Graphics 530 on Skylake CPUs. Currently the Iris Pro 6200 present on Broadwell is still the best for gaming, though the A10-7870K Godavari's performance is also decent. Consider one of those two chips now, or await Iris Pro's possible arrival on a newer socketed processor if you are in no hurry.
"Intel's Core i5 6600K and i7 6700K processors released earlier this month feature HD Graphics 530 as the first Skylake graphics processor. Given that Intel's Open-Source Technology Center has been working on open-source Linux graphics driver support for over a year for Skylake, I've been quite excited to see how the Linux performance compares for Haswell and Broadwell as well as AMD's APUs on Linux."
Here are some more Processor articles from around the web:
- Intel Core i5 6600K Skylake Linux CPU Benchmarks @ Phoronix
- Intel Core i7-5775C Review @ Modders-Inc
- Intel Core i7-6700K Review: Inching Toward Extreme @ Modders-Inc
- Intel’s ‘Skylake’ Core i7-6700K: A Performance Look @ Techgage
- Intel Core i7 6700K "Skylake" Processor Review @ HiTech Legion
- Intel Core i7-6700K Review @ Neoseeker
The DX12 and Vulkan graphics APIs are going to be able to leverage a lot more of both Nvidia's and AMD's GPUs for more than just gaming workloads. Intel has always been known for providing GPU resources for gaming performance at the expense of other types of graphics performance, and with GPGPU/HSA-style usage starting to become commonplace even among open-source software packages, where are the Intel GPGPU tests?
The YouTube video from Nvidia demonstrating Vulkan's performance improvements with CAD software was impressive for Vulkan running under Windows, so hopefully there will be more demonstrations of Vulkan running under full Linux (Mint, etc.) beyond the same video's demonstration of Vulkan on Nvidia's Tegra X1 tablet SKU (Android), which was also impressive. AMD needs to get some Vulkan performance runs demonstrated for its new Carrizo APU based SKUs, and I'm sure their APUs will show the same types of improvements, with Vulkan shown to greatly improve all types of graphics application workloads.
Intel is still behind both AMD and Nvidia for overall graphics workload usage on the GPU, and any Carrizo APU that AMD may decide to release at 14nm will definitely have the ability to cram more GPU ACEs onto the die. I would expect that AMD may want to try any new process shrink out on a Carrizo APU just to obtain some engineering data in advance of Zen's release. The use of GPU-style design libraries on Carrizo's CPU core layout saved some extra die space at the 28nm process node, and AMD could see how well those GPU design libraries would work for an APU's CPU cores at 14nm in future low-power, mobile-only designs, with even more space savings for more GPU cores.
Maybe a Carrizo refresh at 14nm with some extra GPU ACE units could test some scaling before the full Zen based product line is released to market. Hopefully AMD's K12 custom ARM SKUs will not be delayed too much past 2016-2017, as that Nvidia Tegra X1 was impressive even running the ARM reference cores rather than the Denver cores. I'd like to have a Linux (Steam OS) tablet running AMD's custom K12 ARM based APU.
Intel's graphics performance comes at a high price, and its top graphics are only included on the most expensive SKUs. So the price-to-performance wins go to AMD (#1) and Nvidia (#2) if the Tegra X1 is included, with Intel being more costly per dollar for graphics than even Nvidia.