Our second calibration target is designed for print work, with a light output of 80 nits instead of 200 nits, and the sRGB gamma curve instead of the 2.2 power curve.
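The practical difference between the two targets is the transfer function: sRGB uses a linear segment near black before its 2.4-exponent curve, while the first target is a pure 2.2 power curve. A minimal sketch of the two (normalized 0..1 signal values; not the calibration software's actual math):

```python
# Compare the sRGB transfer curve with a pure 2.2 power curve.
# Illustrative only; real calibration software works on measured data.

def srgb_eotf(v: float) -> float:
    """sRGB electro-optical transfer function (IEC 61966-2-1)."""
    if v <= 0.04045:
        return v / 12.92                      # linear toe near black
    return ((v + 0.055) / 1.055) ** 2.4       # curved segment

def power_22(v: float) -> float:
    """Pure 2.2 power curve used by the first calibration target."""
    return v ** 2.2

for v in (0.1, 0.5, 0.9):
    print(f"signal {v:.1f}: sRGB={srgb_eotf(v):.4f}  2.2={power_22(v):.4f}")
```

The two curves track each other closely at mid and high signals but diverge near black, where sRGB's linear toe keeps shadow detail from being crushed.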

The grayscale continues to be excellent, even better than with the first calibration, though you couldn't tell the two apart even side-by-side. There is a slight spike in the gamma that keeps it from being perfect, but even that spike doesn't raise the dE2000 at all. The biggest problem is that the contrast ratio has fallen to 488:1 from the prior 820:1. Possibly the LED backlight doesn't dim enough for the darkest settings, or the maximum light output was still too high and the LUTs had to be lowered heavily; either will cost contrast ratio. That issue aside, the grayscale calibration is very good.
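The arithmetic behind that drop is simple: contrast ratio is peak white luminance divided by black level, so if the black level cannot fall along with the white level, the ratio collapses. The black levels below are back-calculated from the ratios quoted above, not separate measurements:

```python
# Contrast ratio = peak white luminance / black level.
# Black levels here are implied by the review's quoted ratios
# (820:1 at 200 nits, 488:1 at 80 nits); illustrative only.

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

BLACK_FLOOR = 0.164   # nits; implied black level at the 80-nit target

print(round(contrast_ratio(80, BLACK_FLOOR)))        # 488:1, as measured
# Had the black level scaled down with white (the 200-nit target's
# black was ~0.244 nits, so 80/200 of that), the ratio would have held:
print(round(contrast_ratio(80, 0.244 * 80 / 200)))   # ~820:1
```

This is why a backlight that can't dim below a fixed floor, or LUTs that pull white down without touching black, both show up as the same symptom.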

Color gamut is almost identical to the 200-nit result: luminance is too low in green, red is slightly undersaturated, and everything else is pretty good.

The ColorChecker data here really surprises me. The yellow-orange shades that have been causing us such issues are no longer a problem; now a shade of blue that lies on the edge of the sRGB gamut is the main offender. I suspect the blue error is related to the lack of green luminance, which also drags down cyan shades. At lower light output levels, blue errors tend to be larger because blue's luminance is low to begin with; if the green component is also lacking luminance, the total light output drops much further, producing a larger error. The DeltaL chart for this measurement shows the patch is quite low in luminance, which is almost certainly the bulk of that color error. As for why the orange-yellow shades improved so much, I have absolutely no idea.
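The reasoning above can be sketched numerically: perceptual lightness (CIE L*, the L component behind the DeltaL chart) is steep near black, so the same absolute luminance shortfall costs far more visible error on a dim edge-of-gamut blue than on a bright patch. The patch values here are illustrative, not the review's measurements:

```python
# Why a fixed luminance shortfall hurts a dim blue patch more than a
# bright patch: CIE L* compresses highlights and expands shadows.
# Y values below are made up for illustration.

def cie_L(Y: float) -> float:
    """CIE 1976 lightness from relative luminance Y (0..1)."""
    if Y > (6 / 29) ** 3:
        return 116 * Y ** (1 / 3) - 16
    return (29 / 3) ** 3 * Y

DEFICIT = 0.02  # identical absolute luminance shortfall on both patches

for y in (0.60, 0.06):  # bright patch vs. dim blue patch
    delta_L = cie_L(y) - cie_L(y - DEFICIT)
    print(f"target Y={y:.2f}: deltaL* = {delta_L:.2f}")
```

The dim patch ends up with several times the ΔL of the bright one for the same missing light, which is consistent with a green-luminance deficit hitting blue and cyan hardest.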

Saturations look much like they did on the last calibration, with 100% showing the largest error and the numbers falling from there. 100% values are harder to correct, since those saturations may lack luminance or saturation that the panel simply cannot produce. If a saturation below 100% is lacking saturation, we can boost it to compensate, which reduces the error, but that can't be done with the 100% values.
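The asymmetry is easy to see in a toy model: a LUT can only remap signals within the panel's native gamut, so it can push an undersaturated 50% patch back to target, but a 100% patch already sits at the gamut boundary. The panel saturation ceiling below is an assumed figure, not a measurement:

```python
# Why sub-100% saturation errors are correctable in a LUT but 100%
# errors are not. PANEL_MAX_SAT is a made-up native-gamut ceiling.

PANEL_MAX_SAT = 0.92  # panel's max red saturation relative to sRGB (assumed)

def lut_corrected(target_sat: float, lut_gain: float) -> float:
    """Saturation actually displayed after applying a LUT gain."""
    native = target_sat * PANEL_MAX_SAT        # undersaturated raw response
    return min(native * lut_gain, PANEL_MAX_SAT)  # clipped at the gamut edge

gain = 1 / PANEL_MAX_SAT
print(lut_corrected(0.50, gain))  # restored to ~0.50: correctable
print(lut_corrected(1.00, gain))  # stuck at ~0.92: the panel's ceiling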

Overall the sRGB target calibration is also very good, with a couple of issues. My biggest concern is the large drop in contrast ratio, which will give the image a flatter, more washed-out appearance, and several more calibration attempts didn't turn up a way to correct it.

79 Comments

  • mdrejhon - Tuesday, June 18, 2013 - link

    The XL2720T has better color quality than the VG248QE; someone who owns both monitors reported this.
    The VG278H is actually pretty competitive with the XL2720T, despite its age.

    What makes them really worth it is LightBoost.
  • Death666Angel - Tuesday, June 18, 2013 - link

    Wow, this review badly needs a table of the specs on the first page.
  • brandonicus - Tuesday, June 18, 2013 - link

    I hate to be "that guy," but I found it really annoying that you assumed we knew what the resolution was. Unless I'm blind, the only place it was mentioned was in the "Posted in" header and on the seventh and eighth pages. Something that important should be mentioned upfront.
  • blackoctagon - Tuesday, June 18, 2013 - link

    Thanks for the review, Chris, but WHY exactly did you choose to measure input lag using the Leo Bodnar test? Apart from the fact that it cannot measure the screen's performance at 120Hz (the refresh rate this screen is designed to be played at), the test itself seems not to have undergone the same verification as, say, PRAD.de's use of an oscilloscope. For a review that starts out with a discussion of input lag, and even mentions that you were "still in search of" the ideal test, I expected to hear your reasoning for choosing this methodology over others.
  • cheinonen - Tuesday, June 18, 2013 - link

    I actually talked to TFT Central about this, as they have an oscilloscope method as well (which is beyond my means, unfortunately). They've tested multiple ways and feel the Leo Bodnar winds up as the most accurate option out there right now, short of a scope method. SMTT was working relatively well, but it requires a license, and he stopped selling them. Ours expired, so I can't use it anymore.

    Searching for a totally accurate, and affordable, lag measurement device continues. I'll look into the Audrino solution that was mentioned here and see how that looks.
  • blackoctagon - Wednesday, June 19, 2013 - link

    Thank you for the reply. Looking forward to seeing where this search leads you
  • mdrejhon - Wednesday, June 19, 2013 - link

    I'm the inventor of the Arduino Input Lag Tester, which runs via a USB cable connected to the computer.

    It features:
    - Sub-millisecond accuracy
    - Works at all computer resolutions and refresh rates
    - USB cable latency compensation (subtracts calculated USB cable latency)
    - Costs only $40 to build

    It's currently undergoing beta testing, with custom software I have created.
    Contact me at mark[at]blurbusters.com for more information about the Arudino Input Lag Tester.
  • blackoctagon - Thursday, June 20, 2013 - link

    Interesting. But is it 'Audrino,' 'Arduino' or 'Arudino' test? :)
    I see all three (mis-?)spellings on this page
  • mdrejhon - Thursday, June 20, 2013 - link

    Apologies. It's a hard word sometimes.
    The correct spelling is Arduino, which refers to an easy-to-program hobbyist microcontroller:
    http://www.arduino.cc/

    It's a homemade input lag meter involving (1) almost any Arduino with a USB port, (2) a photodiode, (3) a resistor, and (4) some wires. It's an open-source input lag circuit I've developed that is very easy to build (easier than building a computer -- no soldering iron required!). I'll be publishing some software that makes everything run as an accurate input lag tester (including USB cable latency/jitter compensation); the assembly is connected to a PC displaying flashing squares.
  • Pastuch - Tuesday, June 18, 2013 - link

    Honestly, this review is a huge letdown. When I started reading this website 10 years ago, the articles were always informed and well researched. This review is sorely lacking in that regard. The only reason people are still buying 120Hz displays is LightBoost-capable 2D gaming. The CS, BF and Quake communities LOVE the CRT-like motion response of LightBoost, and this is one of the better models with that capability. http://www.blurbusters.com/ has all the relevant info; Mark is an invaluable resource and I implore you to contact him for more info.

    You complain loudly about IPS color quality in a gaming review but you admit yourself that gaming isn't a hobby you’re interested in. Your conclusion argues that the money could be better spent on an IPS 2560 display. Do you know how many video cards it takes to run Planetside 2 at 2560 at 80FPS+? You need two Geforce 780s! Can I borrow $1200?

    I used to own a 2560x1440 IPS for desktop work but I couldn't play CS on it due to slow pixel response and horrible input lag. Once you try LightBoost there is no going back. The motion clarity at 120fps+ on an LB-capable display genuinely changes the gameplay experience. I don't own an LB display yet, but I've tried one at a LAN party. I was blown away, and I was hoping that Anand would provide a comprehensive review of the BenQ 2720T. With the latest Nvidia drivers and LB enabled, gamers are reporting almost a 1000:1 contrast ratio on the 2720, which is better than any other LB monitor. LightBoost is a genuine boon to the gaming market; there are Sony FW900 owners who say the motion clarity of LB is BETTER than the FW900's. Do you have any idea how amazing that is? People have been waiting 10 years for a monitor that can replace the FW900 for twitch gaming.

    If you want to read solid monitor reviews go to http://www.tftcentral.co.uk/
