Hybrid CrossFire that actually works
We got a hold of a new driver that enables dual graphics support (hybrid CrossFire) for the new Kaveri APU. Has frame pacing been fixed?
The road to redemption for AMD and its driver team has been a tough one. Since we first started to reveal the significant issues with AMD's CrossFire technology back in January of 2013, the Catalyst driver team has been hard at work on a fix, though I will freely admit it took longer to convince them the issue was real than I would have liked. We saw the first step of that fix in August of 2013 with the Catalyst 13.8 beta driver. It supported DX11 and DX10 games at resolutions of 2560×1600 and under (no Eyefinity support) but was obviously still less than perfect.
In October, with the release of AMD's latest Hawaii GPU, the company took another step by reorganizing the internal architecture of CrossFire at the chip level with XDMA. The result was frame pacing that worked on the R9 290X and R9 290 at all resolutions, including Eyefinity, though it still left out older DX9 titles.
One thing that had not been addressed, at least not until today, was the set of issues surrounding AMD's Hybrid CrossFire technology, now known as Dual Graphics. This is the ability of an AMD APU with integrated Radeon graphics to pair with a low-cost discrete GPU to improve graphics performance and the gaming experience. Recently, Tom's Hardware discovered that Dual Graphics suffered from the exact same scaling issues as standard CrossFire: frame rates in FRAPS looked good, but the perceived frame rate was actually much lower.
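To make that gap concrete, here is a minimal sketch, assuming a FRAPS-style frametimes log (a header row, then one cumulative timestamp in milliseconds per frame): it reports the headline average FPS alongside the 99th-percentile frame time, the kind of number that exposes uneven frame delivery even when the average looks healthy. This is just an illustration of the idea, not the capture-based analysis used for the results in this review.

```python
# Minimal sketch, assuming a FRAPS-style frametimes CSV: a header row, then one
# line per frame with a frame index and a cumulative timestamp in milliseconds.
# It contrasts the headline average FPS with the 99th-percentile frame time,
# which is the kind of metric that exposes poor frame pacing.
import csv

def frame_time_stats(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                                   # skip the header row
        stamps = [float(row[1]) for row in reader if len(row) > 1]

    intervals = sorted(b - a for a, b in zip(stamps, stamps[1:]))
    avg_fps = 1000.0 * len(intervals) / (stamps[-1] - stamps[0])
    p99 = intervals[int(0.99 * (len(intervals) - 1))]
    return avg_fps, p99

if __name__ == "__main__":
    fps, p99 = frame_time_stats("frametimes.csv")      # hypothetical log file name
    print(f"average FPS: {fps:.1f}   99th percentile frame time: {p99:.1f} ms")
```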
A little while ago a new driver made its way into my hands under the name of Catalyst 13.35 Beta X, a driver that promised to enable Dual Graphics frame pacing with Kaveri and R7 graphics cards. As you'll see in the coming pages, the fix is definitely working. And, as I learned after doing some more probing, the 13.35 driver is actually a much more important release than it first seemed. Not only is Kaveri-based Dual Graphics frame pacing enabled, but Richland and Trinity are included as well. Even better, this driver will apparently extend the fix to resolutions higher than 2560×1600 on desktop graphics as well – something you can be sure we are checking on this week!
Just as we saw with the first implementation of Frame Pacing in the Catalyst Control Center, with the 13.35 Beta we are using today you'll find a new set of options in the Gaming section to enable or disable Frame Pacing. The default setting is On, which makes me smile inside every time I see it.
The hardware we are using is the same basic setup from my initial review of the AMD Kaveri A8-7600 APU. That includes the A8-7600 APU, an ASRock A88X mini-ITX motherboard, 16GB of DDR3-2133 memory and a Samsung 840 Pro SSD. Of course, for our testing this time we needed a discrete card to enable Dual Graphics, and we chose the MSI R7 250 OC Edition with 2GB of DDR3 memory. This card will run you an additional $89 or so on Amazon.com. You could use either the DDR3 or GDDR5 version of the R7 250, as well as the R7 240, but in our talks with AMD they seemed to think the R7 250 DDR3 was the sweet spot for this CrossFire implementation.
Both the R7 250 and the A8-7600 actually share the same number of SIMD units at 384, otherwise known as 384 shader processors or 6 Compute Units in the new nomenclature AMD is creating. However, the MSI card is clocked at 1100 MHz while the GPU portion of the A8-7600 APU runs at only 720 MHz.
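For a rough sense of the on-paper gap those clocks create, here is a back-of-envelope calculation of peak shader throughput (shader count × 2 FLOPs per clock × clock speed). It deliberately ignores memory bandwidth, which is often the real limiter for parts in this class.

```python
# Back-of-envelope peak shader throughput: GFLOPS = shaders x 2 FLOPs x clock (GHz).
# This deliberately ignores memory bandwidth, which often matters more here.
def peak_gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

print(f"MSI R7 250 (384 SPs @ 1.10 GHz):  {peak_gflops(384, 1.10):.0f} GFLOPS")   # ~845
print(f"A8-7600 GPU (384 SPs @ 0.72 GHz): {peak_gflops(384, 0.72):.0f} GFLOPS")   # ~553
```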
So the question is, has AMD truly fixed frame pacing in Dual Graphics configurations, once again making this budget gamer feature something worth recommending? Let's find out!
Sorry to say Ryan, but this fix came too little, too late. For me to stay with AMD/ATI, this Frame Pacing fix should have been in Catalyst 13.12. I'm still having the infamous frame pacing trouble with Skyrim, and odd flickering in my games with a pair of 2GB Radeon 7850s, and this includes Diablo 3, which is widely known on the Diablo 3 forums to bring down many a high-end system. It has reached the point where I'm jumping ship and getting a pair of 2GB GeForce GTX 760s, and I'm picking up a cheap GT 610 at the same time for PhysX processing.
AMD also needs to get back in the game of putting out WHQL drivers on a monthly or bimonthly basis, none of this quarterly garbage they are doing; all putting out drivers quarterly does is hurt them in the long run.
The GT 610 will bottleneck 760s in SLI if used as a PhysX card. It's weak for PhysX processing.
Let your processor handle PhysX, otherwise you’re just unnecessarily dividing your PCI-E lanes, unless you have a -E SKU with 40 available lanes.
Nope, my ASUS Z87 Deluxe motherboard has a PCI Express bridge chip on it that provides additional PCI Express lanes to the PCI Express x1 slots and the last PCI Express x16 slot. This is why I can get away with adding in a card such as a GT 610 for PhysX processing. The PCI Express bridge chip is one of the reasons I chose the ASUS Z87 Deluxe motherboard when I did my upgrade in August of 2013.
Most Z87 motherboards do not have a PCI Express bridge chip to provide those additional PCI Express lanes/bandwidth. I've personally looked at the various Z87 chipset boards, and most of them indeed do not have a bridge chip for extra lanes/bandwidth; otherwise the only way to get more PCI Express bandwidth is to go with an E-series chip or an X79 board and CPU.
I can PROMISE you that you are wasting money on a GT 610 as a PPU. If you must buy one, get at least a GT 640. But you're not going to see any benefit; I've run the tests multiple times and very few times does it do you any good.
In fact a GT 610 is a PhysX "decelerator".
I've tested various configs with a second NVIDIA card for PhysX, and for a single graphics card like a GTX 670 you need at least a GT 640 GDDR5. For two cards (and more raw processing) you need a GTX 650 Ti or better.
As an example, with a GTX 560 Ti for 3D, if you have a second card for PhysX like an 8800 GTS/GT, you get worse framerates than with the GTX 560 Ti alone handling both 3D and PhysX.
Jumping ship just before seeing what Mantle can do is something you will probably regret later. Also, as others said, if you buy two 760s you don't need a separate card for PhysX. If you do buy a third card, think GT 640 as a minimum.
Sorry to say "Nightowl", but the truth is that Skyrim is OLD. DirectX 9 is OLD, and CrossFire is not really required for a DX9 title… so you could play that with one card only. You should look forward, not backward 😉
And about WHQL, no one should feel that need… 'cause the process to certify the drivers is too long, and neither AMD nor we can waste so much time… beta drivers are often stable and are more than enough.
It’s sad to see that these APUs still can’t run BF3 decently.
I was very disappointed when my A8-3870K wouldn't run BF3 at anything more than like 1024×768 on low.
Sorry, but you have no idea how to adjust the settings/resolution in some games depending on your graphics card's raw performance ;)
– AMD Llano APU A4-3400 in Battlefield 3
http://youtu.be/qO1wg2jgGos
Well, you've got my hopes up on CrossFire with Eyefinity. My second 7970 has been disconnected for months now; I would like to be able to make use of it as advertised, rather than as dead weight or a space heater.
I'm sorry, but Frame Pacing is not fixed. Shame on AMD for trying to weasel out of supporting DX9. The age of DX9 doesn't matter. There are still plenty of relevant and fun games today that use DX9 only.
The complete bullshit claim that CrossFire isn't needed for a DX9 title is just that.
Is it so crazy that people's favorite games still run DX9? Is it so crazy that people want to use more than one GPU with a DX9 title? Maybe people don't want part of their investment being useless, or hindering their experience, while playing their favorite game.
So if your favorite game still runs DX9 and you plan on buying another GPU someday, AMD just kicked you over to NVIDIA, and they are happy to accommodate you.
My favorite game, the one I play every day, uses DX9, and the developers have not hinted at updating the API anytime in the near future. What am I supposed to do about that?
There is a sneaky way to make things work in DirectX 9 with an APU and dual graphics…
I've been messing about with various settings and have discovered how to get the most out of Dual Graphics with DirectX 9 (and earlier) games.
First you must create a profile in Catalyst Control Centre for the game's executable file, not the launcher; it has to be the actual game (e.g. not the Fallout launcher but Fallout.exe itself).
In the profile you need to change the "AMD Radeon Dual Graphics" option (at the bottom) for the executable to "AFR friendly", as this will force your computer to use the APU and GPU to render alternate frames and therefore make use of the dual-graphics system.
NOTE: YOU MUST ALSO ENABLE OR FORCE V-SYNC OR YOU WILL SUFFER THE MOST EXTREME SCREEN TEARING YOU HAVE EVER SEEN!
NOTE FOR WINDOWS 8.1 USERS: Most games won't work with V-sync whether you force it via Catalyst Control Centre or enable it in-game. This is purely down to the lack of updated drivers for Windows 8.1. To get around this, simply run the game executable (not the launcher) in Windows 8 (or 7) compatibility mode and you will find V-sync works fine (a scripted version of this step is sketched below).
Obviously I haven't tried this method with every single game ever released, but every game I have tried it with (at least 25 so far) has shown a noticeable improvement in performance (e.g. higher frames per second, or the ability to push anti-aliasing up another notch without the mouse getting floaty).
Hmm, it always makes me wonder why they don't just enable this without making me jump through hoops… probably some deal they have with MS to push people over to new operating systems.
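For reference, here is a hedged sketch of scripting that compatibility-mode step rather than clicking through the Properties dialog; it assumes the per-user AppCompatFlags\Layers registry key is what the Compatibility tab writes to, and the game path is a placeholder you would replace with your own.

```python
# Hedged sketch: apply the "Windows 8 compatibility mode" step from above by
# writing the per-user AppCompatFlags\Layers value that the Compatibility tab
# uses. GAME_EXE is a hypothetical placeholder path; swap in your own game exe.
import winreg

GAME_EXE = r"C:\Games\Fallout\Fallout.exe"            # placeholder example path
LAYERS_KEY = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, LAYERS_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
    # "WIN8RTM" corresponds to picking "Windows 8" in the dialog; some Windows
    # versions prefix the value with "~ " when it is set through the GUI.
    winreg.SetValueEx(key, GAME_EXE, 0, winreg.REG_SZ, "WIN8RTM")
```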
I've been enjoying games with an A10-7850 paired with an R7 250 even without the 13.35 driver, though I'm looking forward to it releasing this week. Dual Graphics works with the 13.30 driver, and the benchmarks I've run thus far show significant performance improvements with it enabled.
My experience thus far has been that this combination handles any game at Medium/High settings at 1080p and High/Ultra settings at 720p (depending on the intensity of the game). The big killer of performance is tessellation, which knocks off 10+ FPS by itself, oh and TressFX (good hair = bad framerate).
Hi,
I'm running the same setup as you, but I can't get Dual Graphics to work… Which ASRock BIOS version are you running? I seem to be missing the Dual Graphics option in the UEFI.
You can enable dual graphics in the Catalyst Control Center.
Change your RAM to 1800 MHz.
Still too little, too late, and most everybody I know in the PC gaming scene is on NVIDIA and Intel, enjoying great gaming experiences.
Sorry, I posted this originally in the wrong thread.
Ryan, why was this test not done with the current flagship A10-7850K APU?
While this is good news for AMD, I doubt gamers are going to choose this route (buying more expensive RAM) with that low-end of a discrete GPU.
Better to just buy a tier or two higher on discrete with that money.
You could easily build a rig at the same price point with an AMD Vishera paired with a 650 Ti Boost or a used 660 Ti, smash those Kaveri benchmarks, and not have to deal with frame pacing.
Also, pricing a rig without an OS, case, and PSU is totally inaccurate and I wish you'd stop doing that.
At the time of testing I did not have the 7850K.
Dear lord, all is lost in the comment section. Read more and enjoy this article for what it is. Two people out of eleven that have a clue?
Although AMD felt the R7 250 DDR3 was the "sweet spot", can you tell me why? GDDR5 is always better at the same price, isn't it?
Is it necessary to match the number of SIMD units? As you pointed out, the units in the card are 50% faster. But if you do have to match units, why isn't there a 250X with 512 units to mate with the flagship chip and the future A10-7800?
To all that are downplaying Dual Graphics because you can step up to a faster card at the same price: keep in mind you are then forced to use a larger case and a bigger PSU, with more cost, heat, etc. There is at least one low-profile R7 250 that requires no PSU connector and would probably run fine with a 65W processor and an SSD on my bundled slim case's 300W PSU.
My guess: DDR3 is preferred for the graphics card because that is what the A8-7600 is using. CrossFire usually works best when the individual units are roughly equal.
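If that guess is right, a quick bandwidth comparison shows why the DDR3 card would be the closer match; the transfer rates below are illustrative assumptions (DDR3-1800 on the card, dual-channel DDR3-2133 on the APU, roughly 4.6 Gbps on a GDDR5 R7 250), not verified board specifications.

```python
# Rough memory-bandwidth comparison behind the "roughly equal partners" guess.
# Transfer rates here are illustrative assumptions, not verified board specs.
def bandwidth_gbs(transfer_rate_mts, bus_width_bits):
    return transfer_rate_mts * bus_width_bits / 8 / 1000   # GB/s

print(f"R7 250 DDR3, 128-bit @ 1800 MT/s:  {bandwidth_gbs(1800, 128):.1f} GB/s")   # ~28.8
print(f"A8-7600 DDR3-2133, dual channel:   {bandwidth_gbs(2133, 128):.1f} GB/s")   # ~34.1
print(f"R7 250 GDDR5, 128-bit @ 4600 MT/s: {bandwidth_gbs(4600, 128):.1f} GB/s")   # ~73.6
```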
Wouldn't it be better to get a GDDR5 R7 250 and detune it a bit to better match the IGP?
I seem to recall you can choose which GPU is the default, and I presume this means you use that GPU's display ports.
Isn't it logical that the more powerful of the pair of GPUs would be the sensible choice, as it presumably has more work to do? Those overhead processes should narrow the GPUs' performance gap.
i.e., take a punt on the cheaper, mainstream GDDR5 discrete GPU, and find a way of loading it a bit more (or detuning it) versus the weaker IGP.
Keep benching the GPUs until both get similar results, then Hybrid CrossFire them.
If it doesn't pan out, better to be stuck with GDDR5 junk than DDR3 junk.
I like reading these reviews and was waiting for the A8-7600 to come out so I could use it in my new LAN rig. Too bad AMD still hasn't released the real price/performance star of this gen. I settled for a 760K that I will couple with my old 5850. I was looking forward to seeing how devs would use the "compute cores" on this in the future; guess I'll have to wait for the next gen…
Thanks for the review! Well-written and has the useful graph data we need. I’ve been hoping to hear about progress on this.
“Higher frame rates don’t really mean much if half of those frames are thrown away, not visible, and not affecting user experience.”
The question is why the benchmark tool doesn't take user experience into account. It reminds me of how Japanese speaker systems look good on frequency-response curves but don't sound good.
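As a toy illustration of why thrown-away frames inflate the raw number, the sketch below drops frames that were on screen for only a sliver of time before averaging; both the cutoff and the sample frame times are made up for the example, not values any particular benchmark tool uses.

```python
# Toy illustration: frames shown for only a sliver of time ("runts") raise the
# raw FPS without improving what you actually see, so drop them before averaging.
# The 1.0 ms cutoff and the sample frame times are made up for this example.
def observed_fps(display_times_ms, runt_cutoff_ms=1.0):
    useful = [t for t in display_times_ms if t >= runt_cutoff_ms]
    total_s = sum(display_times_ms) / 1000.0
    return len(useful) / total_s

times = [16.1, 0.3, 16.5, 0.2, 16.0, 0.4]          # alternating full frames and runts
raw_fps = len(times) / (sum(times) / 1000.0)
print(f"raw FPS:      {raw_fps:.0f}")               # ~121 FPS on paper
print(f"observed FPS: {observed_fps(times):.0f}")   # ~61 FPS actually seen
```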
Hey, wondering when you are going to hold Intel's feet to the fire for their abysmal 'frame latency' performance? Or are you just going to continue to let them slide under the radar? Nice standards, PCPER.
The other thing that ticked me off to no end is that, most of the time in a CrossFire configuration, that damned ULPS feature would keep the second 7850 completely shut off until there was a lot of action going on screen in a game. Case in point: the second 7850 would not power up in games like Diablo 3 and C&C 3 unless there was a lot of action going on, with various enemies on screen trying to kill me all at once. In Skyrim, the game was smooth until there were lag spikes; I alt-tabbed out of the game, had a look at the ATI properties and guess what, most of the time the second 7850 wasn't even running. There's no excuse for this kind of garbage from AMD/ATI. This all started with the HD 6000 series of graphics chips and continues into the 200 series, as far as I know. I wish this was a joke, but if one searches on Google, one will find all over the internet that ULPS is one big royal pain in the butt in a CrossFire configuration.
The only way AMD cards are truly going to work decently in a CrossFire configuration, in my opinion, is if and when AMD gets the lead out of their collective butts and puts out a BIOS update for their past cards, including the 6000 series, the 7000 series, and the 200 series, that completely shuts off ULPS. This will most likely never happen.
I'm starting to wonder if and when this site will take AMD to task over the ULPS issues that people have been reporting for some time on various forums all over the internet, even on AMD's own forums, but that AMD has done nothing about.
These ULPS issues are the other reason I'm bailing out on AMD; I've had enough of ULPS shutting off the second 7850 on me when I'm trying to get a decent gaming session in.
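For what it's worth, the workaround that usually gets passed around for this is an unofficial registry tweak rather than anything AMD documents; a rough sketch of it follows, under the assumption that the driver reads an EnableUlps value under the display-adapter class keys. It needs to be run as administrator, and you should back up the registry first.

```python
# Commonly shared (and unofficial) ULPS workaround, sketched: set the EnableUlps
# DWORD to 0 under each display-adapter class subkey that already has it.
# Run as administrator and back up the registry first; this is the forum tweak
# people describe, not anything AMD documents or supports.
import winreg

CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
             r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as root:
    index = 0
    while True:
        try:
            name = winreg.EnumKey(root, index)
        except OSError:
            break                                         # no more subkeys
        index += 1
        try:
            with winreg.OpenKey(root, name, 0,
                                winreg.KEY_READ | winreg.KEY_SET_VALUE) as sub:
                winreg.QueryValueEx(sub, "EnableUlps")    # only touch keys that have it
                winreg.SetValueEx(sub, "EnableUlps", 0, winreg.REG_DWORD, 0)
                print(f"EnableUlps set to 0 under subkey {name}")
        except OSError:
            continue                                      # value missing or access denied
```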
It's great to hear that AMD is getting its APU and discrete GPU graphics matching fixed, but AMD should also work towards getting its APUs working with its high-end discrete GPUs, even if the APU graphics is only used for game engine physics and other GPGPU-type acceleration! If Kaveri can use its integrated GPU alongside the APU's CPU to produce a Core i5 level of computational performance, and pair with a high-end AMD discrete GPU (doing the graphics), then price-wise AMD will be able to compete for mid-range desktop market dollars.
The problem is that a user with a higher end GPU wouldn’t be buying a Kaveri part.
They’d probably be rocking an FX chip.
Well yes, if the AMD GPU were the top end, but AMD APUs' integrated graphics need to be paired with AMD discrete GPUs of roughly equal graphics power. Someone needing more power than the R7 250, but not the top-end AMD discrete SKU, would still be able to benefit from Kaveri's HSA use of the built-in GPU for extra compute, an extra decoding boost for HTPC use, and mid-range gaming, where the i5 is strong but the Intel GPU just takes up space, since there are no Intel drivers able to utilize their integrated GPUs for other GPGPU loads while a discrete AMD or NVIDIA GPU is used for gaming or other tasks.
I've really been interested in the progress of this tech and I really like the reviews here. I looked at the test methodology, which is good, but I've noticed that the test titles are mostly first-person. The trend seems to be away from RTS and titles like Total War: Rome II, where myriads of objects are rendered and, if I understand it correctly, really stress GPU memory more. I may have missed it, but is that part of your methodology?
Really good to see this solution finally working well.
I wonder if using a 250 GDDR5 would add more variance, or simply lower percentage scaling with the same smoothness, since the performance differential between the IGP and the discrete card is bigger.
Nice podcast. I think, though, that the push with AMD to go APU means we need to look at the APU in two lights: the low-cost build as above, and a better build that a mainstream user would use.
In looking over the prices, I wonder what a change in the video card to even the paltry Gigabyte 2GB rev. 2 R7 260X would do.
Because it really looks like the discrete core is dead for AMD, and unless we see something come out of the APU side, no one who is running AMD now will stay in that camp when they change out their computers next time.
Nice write-up – could hardly tell that you're an nVidia fanboy and AMD hater.