AMD Kaveri FX-7600P GPU Performance Preview

Given the 3DMark results we just showed as well as the increase in CPU performance, I was very interested to see what Kaveri could do in terms of gaming performance. Here I have to temper my comments somewhat by simply noting that the graphics drivers on the prototype laptops did not appear to be fully optimized. One game in particular that I tested (Batman: Arkham Origins) seemed to struggle more than I expected, and there are other games (Metro: Last Light and Company of Heroes 2) that will bring anything short of a mainstream dGPU to its knees. I've posted the Kaveri Mainstream and Enthusiast scores in Mobile Bench, but they're not particularly useful as most of the scores are below 30 FPS. Here, I'll focus on our "Value" settings, which are actually still quite nice looking (Medium detail in most games).

Bioshock Infinite - Value

Company of Heroes 2 - Value

Elder Scrolls: Skyrim - Value

GRID 2 - Value

Metro: Last Light - Value

Sleeping Dogs - Value

Tomb Raider - Value

As expected, in most of the tests the Kaveri APU is able to surpass the gaming performance of every other iGPU, and in some cases it even comes moderately close to a mainstream dGPU. There's a sizeable gap between the Trinity/Richland APUs and Kaveri in most of the games I tested, which is great news for those looking for a laptop that won't break the bank but can still run most modern games.

Getting into the particulars, Skyrim seems to be hitting some bottleneck (possibly CPU, though even then I'd expect Kaveri to be faster than Richland), but the vast majority of games should run at more than 30 FPS. There was one system with a pre-release Mantle driver installed that was running Battlefield 4 reasonably well at low/medium details, and with shipping laptops and drivers (and perhaps DDR3-2133 RAM) I suspect even Metro might get close to 30 FPS. Of course, we're only looking at the top-performing FX-7600P here, so we'll have to see what the various 19W APUs are able to manage in similar tests.

Comments Locked

  • gdansk - Wednesday, June 4, 2014 - link

    Surprisingly decent performance.
  • shing3232 - Wednesday, June 4, 2014 - link

  • basroil - Wednesday, June 4, 2014 - link

    Decent? The TDP is twice that of the Intel offerings and it's still slower as a CPU. Mobile gaming might be decent, but we still don't know how the system scales down (could lose performance faster than TDP)
  • marvee - Wednesday, June 4, 2014 - link

    I vaguely remember reading how Intel TDP /= AMD TDP. The crux of the article claimed that Intel TDP represented average consumption, while AMD TDP represented the max theoretical consumption. Anyone familiar with the topic?
  • Vayra - Wednesday, June 4, 2014 - link

    Well, looking at how GT3(e) is set up in Haswell, it seems likely that Intel's TDP measurements are going to be close to the maximum usage as well in any graphical application. The graphics parts will push even CPU usage down if TDP limits are reached there, so...
  • Gondalf - Wednesday, June 4, 2014 - link

    You are wrong, marvee. This is an old topic on many forums, and the question was settled a long time ago. AMD TDP is equal to Intel TDP because "TDP" means a precise thing for OEMs. Go to a simple Wiki page and get a clue.
  • formulav8 - Wednesday, June 4, 2014 - link

    TDP is Thermal Design Power. In other words, it's not how much wattage it uses; it's a guideline that AMD and Intel publish for system builders.
  • gngl - Thursday, June 5, 2014 - link

    "its not how much wattage it uses"

    Unless the laws of thermodynamics have changed somehow, I'd expect TDP to put an upper bound on a CPU's power consumption. So, yes, it is about how much wattage it uses in the worst conceivable steady state. (Or isn't it?)
  • kingpin888 - Thursday, June 5, 2014 - link

    The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of heat generated by the CPU which the cooling system in a computer is required to dissipate in typical operation.
    So, no, it's not.
  • silverblue - Wednesday, June 4, 2014 - link

    If you can't say anything nice...
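
The power-sharing behavior Vayra describes above can be sketched as a toy model: under a shared TDP cap, heavy graphics load gets its power budget first and the CPU is throttled into whatever headroom remains. All numbers below are hypothetical illustrations, not measured Kaveri or Haswell figures, and the priority rule is an assumption for the sake of the sketch.

```python
# Toy model of a shared TDP budget between CPU and GPU on an APU/SoC.
# All wattages are hypothetical; real firmware uses far more complex policies.

def split_power(tdp_watts, gpu_demand_watts, cpu_demand_watts):
    """Assumed policy: the GPU gets its requested power first (graphics
    load takes priority), then the CPU gets whatever headroom remains
    under the TDP cap."""
    gpu_power = min(gpu_demand_watts, tdp_watts)
    cpu_power = min(cpu_demand_watts, tdp_watts - gpu_power)
    return gpu_power, cpu_power

# Light load: both sides fit comfortably under a 35 W cap.
print(split_power(35, 10, 15))  # -> (10, 15)

# Heavy game: the GPU wants 25 W, so the CPU is throttled from 20 W to 10 W.
print(split_power(35, 25, 20))  # -> (25, 10)
```

The second case shows why "TDP" alone doesn't predict CPU performance under gaming load: the same chip delivers less CPU throughput when the GPU is saturating the thermal budget.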
