Performance Expectations

In their presentation and FAQ, NVIDIA provided estimates of performance relative to an Ivy Bridge Core i5 ULV with HD 4000. Before we get to those numbers, we want to quickly set the stage for what NVIDIA is showing. Despite the similarity in name and features, as we discovered last year, the ULV chips tend to run into TDP limits when you try to load both the CPU and the iGPU. The CPU cores, for instance, can use around 12-15W, and a full load on the iGPU can add another 10-15W; stuff both into a 17W TDP and you're going to get throttling, which is exactly what happens.
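The back-of-the-envelope math behind that throttling is simple enough to sketch. The wattage figures below are the rough ranges quoted above, not measurements, and real power management is far more dynamic than this:

```python
# Rough illustration of why a 17W ULV part throttles under combined load.
# Power figures are the approximate ranges from the text, not measured values.
TDP = 17.0          # package power limit in watts
cpu_load = 13.5     # midpoint of the ~12-15W CPU-core estimate
igpu_load = 12.5    # midpoint of the ~10-15W iGPU estimate

demand = cpu_load + igpu_load
print(f"Combined demand: {demand:.1f}W vs {TDP:.0f}W TDP")

# To stay inside the limit, the package must scale clocks back; as a crude
# first-order approximation, sustained performance drops proportionally.
scale = TDP / demand
print(f"Sustained clocks limited to roughly {scale:.0%} of full speed")
```

With those assumed midpoints, combined demand is around 26W against a 17W cap, so something has to give by roughly a third.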

Looking at HD 4000 performance with Core i5 ULV and Core i7 quad-core, you can see that the quad-core part is anywhere from 0% to around 60% faster. On average it’s 35% faster at our 2012 “Value” settings and 26% faster at our 2012 “Mainstream” settings. As for the 700M performance relative to Core i5 ULV, NVIDIA provides the following estimates based on benchmarks at moderate detail and 1366x768 in Battlefield 3, Crysis 2, Just Cause 2, DiRT 3, and F1 2011:

Besides the above slide, NVIDIA provided some performance estimates using results from the 3DMark 11 Performance benchmark, and the results are even more heavily in favor of NVIDIA. In their FAQ, NVIDIA states that even the lowly GeForce 710M is three times faster than ULV HD 4000, while the GT 720M is 3.3x faster, the GT 730M and 735M are 4.8x faster (hmmm…do we really need a GT 735M?), the GT 740M is 5.3x faster, the GT 745M is 5.8x faster, and the GT 750M is 6.3x faster. Of course, those numbers are from NVIDIA, going up against the much slower ULV variant of Ivy Bridge, and using 3DMark 11—which isn't as meaningful as actual gaming performance.
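Tabulating NVIDIA's claimed multipliers makes the spacing of the lineup easier to see (these are NVIDIA's own 3DMark 11 numbers relative to ULV HD 4000, not independent results):

```python
# NVIDIA's claimed 3DMark 11 Performance speedups relative to ULV HD 4000.
claimed_speedup = {
    "GeForce 710M": 3.0,
    "GT 720M": 3.3,
    "GT 730M": 4.8,
    "GT 735M": 4.8,
    "GT 740M": 5.3,
    "GT 745M": 5.8,
    "GT 750M": 6.3,
}

# Step between adjacent parts -- note the 730M/735M tie questioned above.
parts = list(claimed_speedup)
for prev, cur in zip(parts, parts[1:]):
    step = claimed_speedup[cur] / claimed_speedup[prev] - 1
    print(f"{cur} vs {prev}: {step:+.0%}")
```

The big jump from the 720M to the 730M stands out, while the 735M adds nothing at all on paper, which is exactly why its existence is questionable.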

I suspect the GT3 and GT3e configurations of Haswell will be substantially faster than IVB's HD 4000 and may come close to the lower end of NVIDIA's range…at least on the standard-voltage Haswell chips. For ULV, I've heard performance estimates suggesting GT3 Haswell will be 30-50% faster than GT2 IVB, and GT3e could be roughly twice as fast, but that should still leave NVIDIA with a healthy lead. Anyway, we'd suggest taking all of these numbers with a grain of salt for now. The real comparison for most will be Haswell versus 700M, and while we have a pretty good idea where 700M and HD 4000 performance fall (since the 700M parts are Kepler and Fermi updates), Haswell's iGPU is likely to be a different beast.
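One way to sanity-check that "healthy lead" claim is to divide NVIDIA's claimed HD 4000 multipliers by the rumored ULV GT3 uplift. Both inputs are estimates quoted above, so treat the output as speculation, not a benchmark:

```python
# Hypothetical: if ULV Haswell GT3 lands 30-50% above ULV HD 4000 (GT2 IVB),
# how much of NVIDIA's claimed lead survives? All inputs are estimates.
nvidia_vs_hd4000 = {"GeForce 710M": 3.0, "GT 740M": 5.3, "GT 750M": 6.3}

for gt3_uplift in (1.3, 1.5):  # rumored 30% and 50% GT3 scenarios
    for part, mult in nvidia_vs_hd4000.items():
        lead = mult / gt3_uplift
        print(f"{part} vs GT3 (+{gt3_uplift - 1:.0%}): ~{lead:.1f}x")
```

Even in the most pessimistic scenario here (a 50% GT3 uplift), the lowly 710M would still come out around 2x ahead on NVIDIA's numbers, which is why the lead looks safe on paper.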

Closing Thoughts

On the whole, Kepler has been amazingly successful for NVIDIA, particularly in the mobile world. The bar for midrange mobile dGPUs was raised significantly, with the GT 640M LE and above typically offering anywhere from 25% to 75% better performance than the previous generation while simultaneously reducing power use. It was NVIDIA's version of Intel's Core 2 launch, and the vast majority of notebooks with dGPUs seem to be using NVIDIA hardware these days. Much of that can also be attributed to NVIDIA's driver team, where Optimus support and usability still trump AMD's Enduro alternative. AMD is still working to improve their drivers, but they're not yet at the same level as NVIDIA's mobile drivers.

Not surprisingly, it looks like every laptop with an NVIDIA dGPU these days also comes with Optimus support, and NVIDIA says they'll be in three times as many Ultrabooks and ultraportables in 2013 compared to 2012—which isn't too hard, since off the top of my head the only two Ultrabooks with NVIDIA dGPUs I can name are the Acer M5 and the ASUS UX32VD. NVIDIA also says they have over 30 design wins for touchscreen laptops, but considering Windows 8 almost requires a touchscreen to really be useful, that's expected. We will likely see a limited number of laptops launching with Ivy Bridge CPUs and 700M dGPUs over the coming weeks; ASUS is specifically listed in NVIDIA's 700M FAQ with the X450 (GT 740M) and N46 (also GT 740M), and Lenovo is a launch day partner with several options: the Y400 with GT 750M and the Z400/Z500 with GT 740M.

The real launch is likely to coincide with Intel’s Haswell update later in Q2 2013. When that comes along, we're likely to see some additional 700M updates from NVIDIA on the high end (again, echoing what happened with the 600M and 680M launches). Just don't count on seeing a mobile variant of Titan/GK110 for a while yet; I'd peg that level of performance as something we won't see in laptops until we have two more process shrinks under our belts (i.e. when TSMC is at 16nm).


  • Torrijos - Monday, April 1, 2013 - link

    Hope they'll carry on giving Mac users drivers quickly.
  • Jorgisven - Monday, April 1, 2013 - link

    Having spoken with nVidia technical engineers as part of my job, nVidia does not handle drivers for OSX. They "advise", but don't do any of the actual driver writing. Apple does that in-house. Boot Camp Windows, however, follows the same driver update path as everyone else using Windows.
  • Boland - Tuesday, April 2, 2013 - link

    nVidia's job descriptions page says otherwise. They're actually looking at expanding their mac driver team.

    http://www.nvidia.com/page/job_descriptions.html
  • cpupro - Tuesday, April 2, 2013 - link

    Yeah right, nVidia is giving the specs of their GPUs to Apple developers so they can write GeForce drivers for OSX. nVidia is not crazy enough to share their knowledge with the competition, because to write drivers you need to know how the GPU works internally.
  • kasakka - Thursday, April 4, 2013 - link

    To my understanding it used to be Apple who wrote the drivers, but Nvidia has possibly taken back the reins. There have been some Nvidia driver releases that are newer than what is found in Apple's updates.
  • TerdFerguson - Monday, April 1, 2013 - link

    I'll never, ever, buy another laptop with a discrete GPU. The extra heat and power drain, together with the inflated prices and dishonest marketing, just aren't worth the modest performance increase on a machine that will never really provide the same level of gaming performance that even a dirt-cheap desktop machine will.

    If a pair of 680M cards in SLI performs worse than a single 660 Ti, then it's just plain dishonest for NVidia to keep branding them thusly. I don't see onboard graphics overtaking desktop video boards any time soon, but for laptops the time is near, and it can't come soon enough.
  • geniekid - Monday, April 1, 2013 - link

    There are a number of laptops that let you switch between discrete and integrated graphics on demand, so you can save power when you're on the go and still have that extra power when you're plugged in.

    As for value versus desktops, yes there's a premium for mobility and the value of that mobility depends greatly on your lifestyle and job conditions.
  • Flunk - Monday, April 1, 2013 - link

    You have a point when it comes to high-end "gaming" laptops that weigh 20+ pounds, cost a fortune, and perform poorly. But there is a place for mid-range discrete GPUs in smaller systems that allow you to play games at moderate settings if you're on the go.

    I think the best option would be a small laptop that connects to an external GPU but it appears that the industry disagrees with me.
  • nehs89 - Monday, April 1, 2013 - link

    I totally agree with you.... all laptops in general and also win8 tablets should connect to an external GPU.... that would be the solution to many problems.... if you want to play heavy-duty games, just plug in the external GPU, and if you want or need portability, then use the ultrabook alone.... I have also read that with current technology this is not possible
  • KitsuneKnight - Monday, April 1, 2013 - link

    Sony shipped a laptop that supported a (low end) external dGPU. Another company showed a generic enclosure that could be used to connect a GPU to a computer via Thunderbolt (I'm not sure if it ever actually shipped, though). It certainly is possible, even if there's currently no link that could provide enough bandwidth to let a top-of-the-line GPU run full tilt.

    I would think nVidia and/or Intel would want to push that market more, but it doesn't seem like anyone really cares, unfortunately. It would be nice to be able to 'upgrade' a laptop's GPU without having to replace the entire thing.
