We’ve known for a while that Intel’s Haswell processor would push GPU performance forward in a significant way. With Haswell, Intel will offer a higher end graphics configuration with more execution resources than before (GT3), as well as an even higher end offering that pairs this GPU with 128MB of embedded DRAM on the CPU package (GT3e). Intel positions the highest end configuration (GT3e) against NVIDIA’s GeForce GT 650M, a performance target it will hit or miss depending on the benchmark.

Regardless of whether or not it wins every benchmark against the GT 650M, the fact that an Intel-made GPU can be mentioned in the same sentence as a mainstream performance part from NVIDIA is a big step forward. Intel could never credibly compete with NVIDIA on performance under the Intel HD Graphics brand, however. Haswell is the beginning of a new era: the company is no longer a CPU maker forced into graphics; with Haswell, Intel begins its life as a GPU company as well. As a GPU company, Intel needs a strong GPU brand. AMD has Radeon, NVIDIA has GeForce, and now Intel has Iris.
The brand is a nod to a long-forgotten chapter of 3D graphics history, as well as an obvious reference to the very visual purposes GPUs serve. Before OpenGL was, well, open, it was an SGI project known as IRIS GL.
Intel is doing the right thing with Iris and only using it to refer to its absolute best graphics options. Intel HD Graphics will remain, and will refer to all GT1/GT2 and some GT3 configurations with Haswell. Iris and Iris Pro will be used to refer to high end GT3 and GT3e configurations:
Anything with GT3e will have Intel’s Iris Pro 5200 graphics, while 28W SKUs with GT3 will have vanilla Iris 5100 (non-Pro). Any 15W SKUs with GT3 will be HD Graphics 5000, and GT2/GT1 parts will also be identified as Intel HD Graphics.
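The branding rules above boil down to a simple lookup. As a sketch (the function and its parameter names are hypothetical; the mapping itself comes from Intel's announced tiers):

```python
def haswell_gpu_brand(gt_level: str, tdp_watts: int, has_edram: bool = False) -> str:
    """Map a Haswell graphics configuration to its marketing brand:
    GT3e -> Iris Pro 5200, 28W GT3 -> Iris 5100, 15W GT3 -> HD Graphics 5000,
    and everything else (GT1/GT2) -> Intel HD Graphics."""
    if gt_level == "GT3" and has_edram:
        return "Iris Pro 5200"
    if gt_level == "GT3" and tdp_watts >= 28:
        return "Iris 5100"
    if gt_level == "GT3":
        return "HD Graphics 5000"
    return "Intel HD Graphics"
```

So a 47W quad-core with eDRAM gets the Iris Pro 5200 badge, while the same GT3 slice in a 15W Ultrabook part ships as plain HD Graphics 5000.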
We know what makes Iris Pro special (128MB of eDRAM), but it appears that the main difference between Iris 5100 and HD 5000 is max GPU clock speed. Intel wants Iris associated with performance, which is a very good thing. Having Iris reach down into non-GT3e parts as well is a bit unsettling but at least GT3e gets the Pro designation. 
Intel claims Iris/Iris Pro will deliver up to 2x better performance than Intel’s HD 4000 graphics in notebooks, and it’s using 3DMark11 to validate that claim:

Although Ultrabooks (now 15W) won’t get full-blown Iris performance, they should still see a healthy increase in GPU performance compared to where they are today (a 50% improvement in 3DMark) at a lower TDP. The move to a full speed GT3 part (Iris) should more than double performance in 3DMark. Also worth noting is the fact that we now have a 28W ULT part (i7-4558U). This is a part designed for larger Ultrabooks (14/15"+) that would otherwise use a low end discrete GPU.

Iris Pro will be exclusive to quad-core parts, and the advantage there grows to 2.5x in 3DMark11. There's a slight increase in processor TDP here, but obviously much better performance. That last bar is with the i7-4950HQ running with its TDP set to 55W (cTDP up).
BGA (non-socketed) desktops will also have the option of using Iris Pro if you get an R-series SKU. The performance gains there over Intel HD 4000 are even more impressive thanks to the additional TDP headroom:

QuickSync performance will also improve thanks to the increased EU count. There’s also DX11.1, OpenCL 1.2, OpenGL 4.0 and 4Kx2K display support.
This is really the beginning of a new era. Intel isn't talking specifics about power savings here, but that's really where Iris and Iris Pro will shine. The fact that performance will finally be reasonable enough to actually play games is just icing on the cake. Kicking discrete GPUs out of non-gaming focused notebooks and replacing them with Iris Pro parts should keep performance high while significantly reducing power consumption. 

With Iris Intel is finally committed to graphics, and that's a very good thing.
Comments

  • ssiu - Wednesday, May 1, 2013 - link

    Would Windows 8 tablets use the 15W Haswell processors, or would they all shift to the ~10W Y-series processors for better battery life but same-or-worse performance?

    And it seems many (most?) tablets use single-channel memory (non-upgradable to dual-channel), severely handicapping GPU performance. Would that reverse with Haswell tablets? All the hard work going into improving GPU performance being negated by such an idiotic move ...
  • teiglin - Thursday, May 2, 2013 - link

    Do we actually know that Intel is going to be selling Y-series Haswell parts? I had assumed that the Y SKUs were a gimmick to hold people over until Haswell had legit 10W parts.
  • HodakaRacer - Thursday, May 2, 2013 - link

    I believe they are due around October, although I don't remember where I heard that. I believe there will be parts around 7W for tablets and such. Perfect timing for the Surface Pro 2 :)
  • jeffkibuule - Wednesday, May 1, 2013 - link

    Haswell tablets need to focus on getting decent battery life before they start pursuing better GPU performance. 4 hours of typical usage is not enough.
  • HodakaRacer - Thursday, May 2, 2013 - link

    They have Atom for that. I obviously have not used Haswell yet, but it sounds like a good balance of performance and power to me.
  • axien86 - Thursday, May 2, 2013 - link

    Countdown to benchmarks between Intel and AMD Richland/Jaguar APUs comparing:

    Latest DX11.1 games at 1920x1080 and higher resolutions with 3-6 monitors

    GPU compute performance with OpenCL

    Crossfire capability with Intel and AMD

    Finally, compare Intel and AMD Richland/Jaguar APU prices...
  • HodakaRacer - Thursday, May 2, 2013 - link

    "Latest DX11.1 games at 1920x1080 and higher resolutions with 3-6 monitors"

    haha, if only
  • JarredWalton - Thursday, May 2, 2013 - link

    We'll run our standard laptop benchmarks, which means 1366x768 Medium, 1600x900 High, and 1920x1080 Ultra settings for games. I'll see about testing OpenCL as well, if it works properly on Iris/Haswell iGPUs. As for CrossFire capability, I'll test it when possible, but let me tell you: Dual Graphics is not all it's cracked up to be. I'll have an article on the current Trinity + HD 7670M performance in the not-too-distant future, but even when it works, scaling is much lower than desktop CF.
  • mayankleoboy1 - Thursday, May 2, 2013 - link

    If anything, I want a detailed review and test of QuickSync 3.0. :)
  • whyso - Thursday, May 2, 2013 - link

    "not all it's cracked up to be"

    Are you referring to the people who say it's good or the people who say it's a gimmick?
    Would really like to see that review; it's hard to find any hybrid CrossFire benchmarks.
