Although entry-level PC notebooks will still ship with a 1366 x 768 panel, the high end of the market is quickly shifting to 1080p and higher resolution panels. Some of the most interesting notebooks at Computex launched with high-DPI panels, such as the new Acer Aspire S7 and the ASUS Zenbook Infinity - both with 13.3" 2560 x 1440 displays. These displays offer pixel density comparable to Apple's MacBook Pro with Retina Display.

Pixel Density Comparison

Intel had one of the 2560 x 1440 Aspire S7s at its Computex suite, along with a microscope for showing off the difference in pixel size. With a 33% increase in linear pixel density over a 1920 x 1080 panel of the same size, the 2560 x 1440 panel obviously has smaller pixels.
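As a quick sanity check on that figure (an illustrative sketch assuming both panels are 13.3" and the comparison point is a 1920 x 1080 screen of the same size):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

low = ppi(1920, 1080, 13.3)    # ~166 PPI
high = ppi(2560, 1440, 13.3)   # ~221 PPI
print(f"{low:.0f} -> {high:.0f} PPI, a {100 * (high / low - 1):.0f}% increase")
# -> 166 -> 221 PPI, a 33% increase
```

Since both resolutions are 16:9, the linear density ratio is simply 2560/1920 = 4/3, hence the 33% figure.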

There's nothing new about increasing pixel density; however, OS support/integration is just as important to the overall experience as physically outfitting notebooks with high-DPI panels. Windows has traditionally done a terrible job of DPI scaling on the desktop, but word around Computex is that this should be fixed with Windows 8.1.

Comments

  • Guspaz - Friday, June 7, 2013 - link

    Well, you give up hinting for antialiasing, but Apple doesn't do AA hinting on text anyhow. If you're doing pure supersampling, stuff is generally going to look good at these pixel densities regardless of the supersample ratio.

    In terms of rendering at higher resolution being a waste of composition power, I'd argue that it's not that burdensome (remember we're not talking about rendering 3D graphics at that resolution, but doing 2D composition). It's exactly what Apple is doing in OS X, and it seems to work quite well... They're using a GeForce GT 650M in the 15" Retina, which is only a bit faster than the fastest Haswell chips, and they're using an HD 4000 in the 13" Retina, which is a great deal slower than the fastest Haswell chips, and things seem to be working out for them.

    I think they should be given some credit for being the first company to pull off a consumer operating system with resolution-independent rendering... As a primarily PC user, I would have really liked to have something like that years ago, to deal with high res notebook displays. Windows DPI scaling never worked very well, however.
  • vegemeister - Monday, August 5, 2013 - link

    Rendering oversized frames and rescaling them means you have to move a lot more data around in memory. This wastes energy, because high-bandwidth off-chip buses are power hogs. Also, it messes up subpixel AA unless the text rendering library knows what the native resolution of the screen is so it can offset 1/3 of a native resolution pixel, instead of 1/3 of a fake framebuffer pixel.

    >I think they should be given some credit for being the first company to pull off a consumer operating system with resolution-independent rendering...

    Resampling the entire desktop is not resolution-independent rendering.
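Rough numbers for the "move a lot more data" point above (an illustrative sketch: the 3840 x 2400 figure is the backing store OS X uses for the 1920 x 1200 "more space" mode on the 15" Retina panel, and 4 bytes/pixel assumes 32-bit color):

```python
def frame_mb(width, height, bytes_per_pixel=4):
    """Size of one framebuffer in MiB (assumes 32-bit color)."""
    return width * height * bytes_per_pixel / 2**20

native = frame_mb(2880, 1800)            # panel-native buffer
scaled = frame_mb(3840, 2400)            # oversized buffer for the scaled mode
extra_per_sec = (scaled - native) * 60   # extra buffer data at 60 fps

print(f"{native:.1f} MiB vs {scaled:.1f} MiB per frame, "
      f"~{extra_per_sec:.0f} MiB/s extra at 60 fps")
```

This counts only the buffers themselves; the downscale pass also has to read the large buffer and write the panel-sized one, so the actual extra bus traffic is higher still.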
  • MikhailT - Friday, June 7, 2013 - link

    Have you ever looked at the Retina displays? If you had, you'd understand it's not about the resolution but the pixel density. The 13" Retina MacBook Pro screen is actually a "1280x800" screen in the default Retina mode. It's less space than your 13" 1080p screen, but it is twice as sharp (not exactly twice, but it feels like it).

    More pixel density at the same *virtual* resolution means text becomes much sharper without making it smaller. Folks who have tried a Retina display can't go back to the older 1080p 13" because the text now looks blurry to them.

    For example, the 15" model has a 2880x1800 panel, but in Retina mode it works like 1440x900; that's how much space we have on the 15". Yes, we could switch it to use more space, but it's not as sharp.
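The point-vs-pixel arithmetic above can be sketched in a few lines (a simplified model of HiDPI mode, where each UI "point" is drawn as a scale x scale block of physical pixels):

```python
def effective_resolution(panel_w, panel_h, scale=2):
    """UI 'point' resolution when each point covers a scale x scale pixel block."""
    return panel_w // scale, panel_h // scale

print(effective_resolution(2880, 1800))  # 15" Retina MBP desktop space
print(effective_resolution(2560, 1600))  # 13" Retina MBP default mode
# -> (1440, 900)
# -> (1280, 800)
```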
  • MikhailT - Friday, June 7, 2013 - link

    The point I'm trying to make is that many folks actually want HiDPI screens because it's easier on their eyes and more useful than having a higher resolution with everything smaller.

    Many of my Windows friends are waiting for these HiDPI screens.
  • Cold Fussion - Friday, June 7, 2013 - link

    If you're not an expert, then why would you bother saying it is easy? Manufacturing defects in these panels are probabilistic: the larger the panel, the greater the chance of a defect. Clearly manufacturing these high-density displays is difficult at the best of times, and the fact that even tablet-size screens have lagged behind phone screens in pixel density is a clear indication that defect rates are a serious problem. Now imagine that same defect rate on a screen with ~2.9x the area (10" to 17") or ~5.8x (10" to 24").
  • Death666Angel - Friday, June 7, 2013 - link

    The higher the DPI, the better scaled resolutions look. On 1080p or 900p screens, running a game at 768p (because the iGPU can only manage good frame rates at that resolution) looks pretty bad. But by increasing the DPI, you make those resolutions look okay to good again. Also, in the case of 1440p/1600p, you can just run the game at 720p/800p and have perfect scaling. And this will force MS and other developers to put some thought into good scaling for their OS/programs.
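The "perfect scaling" case above is just the native resolution being an exact integer multiple of the render resolution, which a quick check makes concrete (hypothetical helper name):

```python
def is_integer_scale(native, render):
    """True when every rendered pixel maps to an exact N x N block of native pixels."""
    sx = native[0] / render[0]
    sy = native[1] / render[1]
    return sx == sy and sx.is_integer()

print(is_integer_scale((2560, 1440), (1280, 720)))  # True: clean 2x2 blocks, no blur
print(is_integer_scale((1920, 1080), (1366, 768)))  # False: ~1.4x, pixels straddle boundaries
```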
  • MrSpadge - Friday, June 7, 2013 - link

    If games rendered the UI at native resolution and everything else at whatever-the-GPU-can-handle we wouldn't have such problems at all. Wouldn't sell new GPUs, though.
  • Torrijos - Friday, June 7, 2013 - link

    First of all, there's physics at play here...

    For someone with perfect eyesight (not everybody, but still a good benchmark), they should only be able to distinguish 2 pixels on a 13" rMBP under 38 cm (15"), which is a good working distance for a laptop.
    Higher densities would probably be useless; the deciding factor is how often you work closer to your display than that resolving distance.
    For old displays (1440x900, 13.3") that distance becomes 68 cm (27").
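Those distances check out under the usual ~1 arcminute model of 20/20 acuity (a rough sketch; real eyes and subpixel layouts complicate things):

```python
import math

ARCMINUTE = math.radians(1 / 60)  # ~20/20 visual acuity limit

def resolve_distance_cm(width_px, height_px, diagonal_in):
    """Distance beyond which adjacent pixels blur together for a ~1 arcminute eye."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    pitch_cm = 2.54 / ppi                  # center-to-center pixel spacing
    return pitch_cm / math.tan(ARCMINUTE)

print(f"13\" rMBP (2560x1600): {resolve_distance_cm(2560, 1600, 13.3):.0f} cm")
print(f"Older 13.3\" (1440x900): {resolve_distance_cm(1440, 900, 13.3):.0f} cm")
# -> 13" rMBP (2560x1600): 38 cm
# -> Older 13.3" (1440x900): 68 cm
```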
  • JDG1980 - Friday, June 7, 2013 - link

    Have you ever actually used an Apple "retina display"? Once you try an iPad 4, the text on PC screens looks fuzzy and indistinct in comparison. Text rendering at HiDPI resolutions is really beautiful.

    And from a technical point of view, HiDPI will eventually let us phase out hacks like hinting and antialiasing on fonts. None of this is done on printers (300-1200 DPI) - it's purely to allow acceptable results at the shamefully low resolutions we currently suffer through on our monitors.
  • vegemeister - Monday, August 5, 2013 - link

    We already can phase out hinting (as we should, because it looks terrible). To phase out antialiasing, you'd have to render at a resolution so high that the spatial bandwidth cutoff of the human eye becomes the AA filter in most of the populace (99th percentile would probably be good). That would look only marginally better than using a lower resolution (say, 60th percentile) and using AA, and you'd have to shuffle much bigger buffers around, and use significantly brighter backlights. This wastes power.
