
Monday, April 15, 2013


CrossFire Versus SLI Scaling: Does AMD's FX Actually Favor GeForce?



We've heard it said before that AMD's GPUs are more platform-dependent than Nvidia's. So, what happens when you drop a Radeon and a GeForce into an FX-8350-based system? Does AMD's CPU hold its own GPU back from running as fast as it possibly could?
For years, we heard that ATI's graphics cards were more platform-dependent than Nvidia's and, depending on who had the fastest processor at the time, were best paired with that CPU. So, when AMD's highest-end processors started falling further and further behind Intel's quickest models, we weren't surprised when Nvidia started introducing AMD-compatible chipsets. Intel even forged a similar partnership with ATI, and we looked forward to the RD600 platform overshadowing Intel's own 975X as the premier enthusiast chipset for Conroe-based processors.
Many of us were confused when AMD decided to buy ATI rather than solidify its ties to Nvidia. Intel abandoned ATI's RD600 altogether and went off to develop X38 Express. Nvidia eventually dropped out of the PC chipset business entirely. But enthusiasts still took comfort in the notion that AMD’s acquisition might carry it through the rough times ahead. ATI was, after all, slightly more competitive.
Now that AMD and ATI are integrated (as well as two large companies can be after several years), we'd expect the combined company's CPU and GPU technologies to be extensively optimized for each other. Nevertheless, the suggestion persists that Radeon cards need more processing power behind them to reach their performance potential. If that's true, the implication is that whenever one of our Intel-based platforms shows a Radeon and a GeForce performing similarly, an AMD-based system would actually show the GeForce pulling ahead. Wait. What?
We began our tests with an evaluation of clock rate and its effect on CrossFire in FX Vs. Core i7: Exploring CPU Bottlenecks And AMD CrossFire. Intel's chip started out at a lower frequency and consequently had the most to gain from overclocking. AMD's couldn't go very far beyond its stock clock rate without more exotic cooling, so it had the least to gain.
At the end of the day, both of our CPUs ended up at comparable clock rates with similarly little effort, making that article a great head-to-head match. But that slight speed-up from AMD meant that a second GeForce-based article with the same CPU settings wouldn't have given us much new information. So, I decided to jump straight to the point: does AMD's flagship FX processor, overclocked, favor Nvidia graphics?
