The run-up to Computex has been insane. Kabini, Haswell and Iris hit us back to back to back, not to mention all of the travel before receiving those products to get briefed on everything. Needless to say, we're in major catch-up mode. There's a lot more I wanted to do with Haswell desktop that got cut due to Iris, and much more I wanted to do with Iris that I had to scrap in order to fly out to Computex. I'll be picking up where I left off later this month, but with WWDC, Samsung and a couple of NDA'd events on the calendar, it's not going to be as quick as I'd like.

One part that arrived while I was in the middle of launch central was AMD's Richland for the desktop. It's effectively a refresh of Trinity with slightly higher clocks, a software bundle and more sophisticated/aggressive turbo. Richland maintains socket compatibility with Trinity (FM2), so all you should need is a BIOS update to enable support for the chip. AMD sent over two Richland parts just before I left for Computex: the 100W flagship A10-6800K and the 65W A10-6700. I didn't have time to do Richland justice before I left; however, I did make sure to test the 6800K in tandem with Haswell's GPU just so I had an idea of how things would stack up going forward as I was writing my Iris Pro conclusion.

For all intents and purposes, Iris Pro doesn't exist in the desktop space, making Haswell GT2 (HD 4600) the fastest processor graphics Intel ships in a socketed desktop part today. In our Haswell desktop review I didn't get a chance to really analyze HD 4600 performance, so I thought I'd take this opportunity to refresh the current state of desktop integrated processor graphics. Unlike the staggered CPU/GPU launch of Trinity on the desktop, the situation with Richland is purely a time limitation on my end; this was all I could put together before I left for Computex.

Although Richland comes with a generational increase in model numbers, the underlying architecture is the same as Trinity. We're still talking about Piledriver modules and a Cayman-derived (VLIW4) GPU. It won't be until Kaveri that we see GCN-based processor graphics from AMD in this price segment (Kabini is already there).

As Jarred outlined in his launch post on Richland, the 6800K features 4 - 8% higher CPU clocks and a 5% increase in GPU clock compared to its predecessor. With improved Turbo Core management, AMD expects longer residency at max turbo frequencies, but you shouldn't expect substantial differences in performance on the GPU side. The A10-6800K also includes official support for DDR3-2133. AMD is proud of its validation of the A10-6800K: any parts that won't pass at DDR3-2133 are demoted to lower end SKUs. I never spent a ton of time testing memory overclocking with Trinity, but my A10-5800K sample had no issues running at DDR3-2133 either. I couldn't get DDR3-2400 working reliably, however.
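Those percentages are easy to verify against the spec tables below. A quick sketch in Python, comparing the flagship A10-6800K against the outgoing A10-5800K using the numbers from the two tables:

```python
# Sanity-check AMD's claimed generational clock gains by comparing the
# A10-6800K (Richland) against the A10-5800K (Trinity) spec for spec.
richland = {"CPU base (GHz)": 4.1, "CPU turbo (GHz)": 4.4, "GPU (MHz)": 844}
trinity  = {"CPU base (GHz)": 3.8, "CPU turbo (GHz)": 4.2, "GPU (MHz)": 800}

for spec, new in richland.items():
    old = trinity[spec]
    gain_pct = (new / old - 1) * 100
    print(f"{spec}: {old} -> {new} (+{gain_pct:.1f}%)")

# CPU base:  +7.9%  (the upper end of the quoted 4 - 8% range)
# CPU turbo: +4.8%
# GPU clock: +5.5%  (roughly the quoted 5%)
```

The base clock sees the largest bump; the turbo and GPU deltas are modest, which is why the bulk of any real-world gain is expected to come from longer turbo residency rather than the raw frequency increase.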

AMD Elite A-Series Desktop APUs, aka Richland
Model             A10-6800K    A10-6700     A8-6600K     A8-6500      A6-6400K   A4-4000
Modules/Cores     2/4          2/4          2/4          2/4          1/2        1/2
CPU Base (GHz)    4.1          3.7          3.9          3.5          3.9        3.0
Max Turbo (GHz)   4.4          4.3          4.2          4.1          4.1        3.2
TDP               100W         65W          100W         65W          65W        65W
Graphics          HD 8670D     HD 8670D     HD 8570D     HD 8570D     HD 8470D   ?
GPU Cores         384          384          256          256          192        128
GPU Clock (MHz)   844          844          844          800          800        724
L2 Cache          2x2MB        2x2MB        2x2MB        2x2MB        1MB        1MB
Max DDR3          2133         1866         1866         1866         -          -
Price (MSRP)      $150 ($142)  $149 ($142)  $120 ($112)  $119 ($112)  $80        $46

Just to put things in perspective, here are the previous generation Trinity desktop APUs:

AMD Trinity Desktop APUs
Model             A10-5800K    A10-5700     A8-5600K     A8-5500      A6-5400K   A4-5300
Modules/Cores     2/4          2/4          2/4          2/4          1/2        1/2
CPU Base (GHz)    3.8          3.4          3.6          3.2          3.6        3.4
Max Turbo (GHz)   4.2          4.0          3.9          3.7          3.8        3.6
TDP               100W         65W          100W         65W          65W        65W
Graphics          HD 7660D     HD 7660D     HD 7560D     HD 7560D     HD 7540D   HD 7480D
GPU Cores         384          384          256          256          192        128
GPU Clock (MHz)   800          760          760          760          760       723
L2 Cache          2x2MB        2x2MB        2x2MB        2x2MB        1MB        1MB
Max DDR3          2133         1866         1866         1866         -          -
Current Price     $130         $129         $110         $105         $70        $55

For my Richland test platform I used the same Gigabyte UD4 Socket-FM2 motherboard from our desktop Trinity review, simply updated to the latest firmware release. I ran both AMD platforms with the same Catalyst 13.6 driver and the same DDR3-2133 memory frequency. AMD was quick to point out that only the A10-6800K ships with official DDR3-2133 support, so the gap in performance between it and Trinity may be even larger in practice if the latter tops out at DDR3-1866. The HD 4000/4600 numbers are borrowed from my Iris Pro review and were run at DDR3-2400; however, I didn't see any scaling on Haswell GT2 beyond DDR3-1866.

I'll be following up with a more thorough look at Richland once I'm back from my current bout of traveling.


  • whatthehey - Thursday, June 6, 2013 - link

    I can't imagine anyone really wanting minimum quality 1080p over Medium quality 1366x768. Where it makes a difference in performance, the cost in image quality is generally too great to be warranted. (e.g. in something like StarCraft II, the difference between Low and Medium is massive! At Low, SC2 basically looks like a high res version of the original StarCraft.) You can get a reasonable estimate of 1080p Medium performance by taking the 1366x768 scores and multiplying by .51 (there are nearly twice as many pixels at 1080p as at 1366x768). That should be the lower limit, so in some games it may only be 30-40% slower rather than 50% slower, but the only games likely to stay above 30FPS at 1080p Medium are older titles, and perhaps Sleeping Dogs, Tomb Raider, and (if you're lucky) Bioshock Infinite. I'd be willing to wager that relative performance at 1080p Medium is within 10% of relative performance at 1366x768 Medium, though, so other than dropping FPS the additional testing wouldn't matter too much.
  • THF - Friday, June 7, 2013 - link

    You're wrong. Most people who take Starcraft 2 seriously are actually playing on the highest resolution they can get, with the low detail setting. Sure, the game looks flashier, but it's easier to play with less detail. All pros do it.

    Also, as for myself, I like to have the games running on native resolution of the display. It makes "alt-tabbing" (or equivalent thereof on Linux) much more responsive.
  • Calinou__ - Friday, June 7, 2013 - link

    +1, "Low" in today's AAA games is far from ugly if you keep the texture detail to the maximum.
  • tential - Thursday, June 6, 2013 - link

    This is my BIGGEST pet peeve with some reviewers who will test 1080p and only show those results when testing these types of chips. All of the frame rates will be unplayable yet they'll try to draw "some conclusion" from the results. Test resolutions where the minimum frame rate is like 20-25 fps by the contenders so I can see how smooth it actually will be when I play.

    I didn't purchase an IGP solution to play 10 FPS games at 1080p. I purchased it to play low resolution at OK frame rates.
  • zoxo - Thursday, June 6, 2013 - link

    I always start by setting the game to my display's native res (1080p), and then find out at what settings can I achieve passable performance. I just hate non-native resolution too much :(
  • taltamir - Thursday, June 6, 2013 - link

    because you render at a lower res and upscale with iGPUs
  • jamyryals - Thursday, June 6, 2013 - link

    Love that Die render on the first page. It's dumb, but I always like seeing those.
  • Homeles - Thursday, June 6, 2013 - link

    It's not dumb. You can learn a lot about CPU design from them.
  • Bakes - Thursday, June 6, 2013 - link

    Not that it really matters but I think he's saying it's dumb that he always likes seeing those.
  • Gigaplex - Thursday, June 6, 2013 - link

    Homeles' comment could be interpreted in a way that agrees with you and says it's not dumb to like seeing them because you can learn lots.
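As an aside, the resolution-scaling estimate in whatthehey's comment above checks out; a quick sketch of the arithmetic in Python:

```python
# Verify the commenter's ~0.51 scaling factor: compare total pixel counts
# at 1366x768 and 1920x1080, then apply it to a hypothetical 768p result.
pixels_768p  = 1366 * 768    # 1,049,088 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

ratio = pixels_768p / pixels_1080p
print(f"1080p renders {pixels_1080p / pixels_768p:.2f}x the pixels")  # ~1.98x
print(f"scale 1366x768 scores by ~{ratio:.2f} for a 1080p lower bound")  # ~0.51

# e.g. a hypothetical 60 fps result at 1366x768 Medium suggests roughly a
# 30 fps lower bound at 1080p Medium, assuming fully GPU-bound scaling.
print(f"60 fps at 768p -> ~{60 * ratio:.0f} fps at 1080p")
```

This is a lower bound precisely because games are rarely 100% fill-rate/shader bound; partially CPU-bound titles will lose less than the pixel ratio implies, which matches the 30-40% figure the commenter suggests.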
