FreeSync Displays

There are four FreeSync displays launching today: one each from Acer and BenQ, and two from LG. Beyond those, seven additional displays should show up in the coming weeks (or months). Here’s the current list of FreeSync compatible displays, with pricing where it has been disclosed.

FreeSync Compatible Displays
Manufacturer      Model      Diagonal   Resolution   Refresh    Panel   Price
Acer              XG270HU    27"        2560x1440    40-144Hz   TN      $499
BenQ              XL2730Z    27"        2560x1440    40-144Hz   TN      $599
LG Electronics    34UM67     34"        2560x1080    48-75Hz    IPS     $649
LG Electronics    29UM67     29"        2560x1080    48-75Hz    IPS     $449
Nixeus            NX-VUE24   24"        1920x1080    144Hz      TN      ?
Samsung           UE590      28"        3840x2160    60Hz       TN      ?
Samsung           UE590      23.6"      3840x2160    60Hz       TN      ?
Samsung           UE850      31.5"      3840x2160    60Hz       TN?     ?
Samsung           UE850      28"        3840x2160    60Hz       TN?     ?
Samsung           UE850      23.6"      3840x2160    60Hz       TN?     ?
ViewSonic         VX2701mh   27"        1920x1080    144Hz      TN      ?

The four displays launching today cover two primary options. For those who want higher refresh rates, Acer and BenQ have TN-based 40-144Hz displays. Both are 27” WQHD displays, so it’s quite probable that they’re using the same panel, perhaps even the same one we’ve seen in the ASUS ROG Swift. The two LG displays meanwhile venture out into new territory as far as adaptive refresh rates are concerned. LG has both a smaller 29” and a larger 34” 2560x1080 (UW-UXGA) display, and both sport IPS panels (technically AU Optronics' AHVA, but it's basically the same as IPS).
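Those refresh ranges are worth a second look, because FreeSync only drives the panel adaptively while the frame rate stays inside the panel’s supported window; outside of it, behavior reverts to the user’s V-Sync on/off setting. As a rough sketch of what that means in practice (this is purely an illustration using the ranges from the table above, not AMD’s actual driver logic, and the helper name is our own):

    # Rough illustration of variable refresh windows, using the ranges from
    # the table above. Not AMD's driver code: inside the window the refresh
    # rate tracks the frame rate; outside it, behavior reverts to the user's
    # V-Sync on/off preference.

    PANEL_RANGES = {                   # (min, max) FreeSync range in Hz
        "LG 34UM67":    (48, 75),
        "LG 29UM67":    (48, 75),
        "Acer XG270HU": (40, 144),
        "BenQ XL2730Z": (40, 144),
    }

    def vrr_state(fps, panel):
        lo, hi = PANEL_RANGES[panel]
        if lo <= fps <= hi:
            return f"{fps:.0f} Hz refresh, synced to the frame rate"
        side = "above" if fps > hi else "below"
        return f"{side} the {lo}-{hi} Hz window: reverts to V-Sync on/off behavior"

    for fps in (35, 60, 100, 160):
        print(f"LG 34UM67 @ {fps} FPS -> {vrr_state(fps, 'LG 34UM67')}")

The practical takeaway is that the LG displays’ narrower 48-75Hz window leaves less headroom before the fallback behavior kicks in than the 40-144Hz window on the Acer and BenQ.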

The other upcoming displays all appear to be using TN panels, though it's possible Samsung might offer PLS. The UE590 is almost certainly TN, with 170/160 degree viewing angles according to DigitalTrends. The UE850 on the other hand is targeted more at imaging professionals, so PLS might be present; we'll update if we can get confirmation of the panel type.

One of the big benefits of FreeSync is going to be support for multiple video inputs – the G-SYNC displays so far are all limited to a single DisplayPort connection. The LG displays come with DisplayPort, HDMI, and DVI-D inputs (along with audio in/out), and the Acer is similarly equipped. None of those has any USB ports, though the BenQ does have a built-in USB hub with ports on the side.

Our testing was conducted on the 34UM67, and let me just say that it’s quite the sight sitting on my desk. I’ve been bouncing between the ASUS ROG Swift and Acer XB280HK for the past several months, and both displays have their pros and cons. I like the high resolution of the Acer at times, but I have to admit that my aging eyes often struggle when running it at 4K, and I have to resort to DPI scaling (which introduces other problems). The ASUS on the other hand is great with its high refresh rates, and the resolution is more readable without scaling. The big problem with both displays is that they’re TN panels, and having come from using a 30” IPS display for the past eight years, that’s a pretty painful compromise.

Plopping the relatively gigantic 34UM67 on my desk is in many ways like seeing a good friend again after a long hiatus. “Dear IPS (AHVA), I’ve missed having you on my desktop. Please don’t leave me again!” For the old and decrepit folks like me, dropping to 2560x1080 on a 34” display also means reading text at 100% zoom is not a problem. But when you’re only a couple feet away, the relatively low DPI does make the pixels much more visible to the naked eye. It even has built-in speakers (though they’re not going to compete with any standalone speakers in terms of audio quality).

The launch price of $649 is pretty impressive; we’ve looked at a few other 21:9 displays in the past, and while the resolution doesn’t match LG’s 34UM95, the price is actually $50 less than the LG 34UM65’s original $699 MSRP (though it’s now being sold at $599). So at most, it looks like putting in the new technology to make a FreeSync display costs $50, and probably less than that. Anyway, we’ll have a full review of the LG 34UM67 in the coming weeks, but for now let’s return to the FreeSync discussion.

Pricing vs. G-SYNC

It certainly appears that AMD and their partners are serious about pricing FreeSync aggressively, though there aren’t direct comparisons available for some of the models. The least expensive FreeSync displays start at just $449, which matches the least expensive G-SYNC display (AOC G2460PG) on price but with generally better specs (29” 2560x1080 and IPS at 75Hz vs. 24” 1920x1080 TN at 144Hz). Looking at direct comparisons, the Acer XG270HU and BenQ XL2730Z are WQHD 144Hz panels, which pits them against the $759 ASUS ROG Swift that we recently reviewed, giving FreeSync a $160 to $260 advantage. As AMD puts it, that’s almost enough for another GPU (depending on which Radeon you’re using, of course).
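For reference, here is a quick back-of-the-envelope check of those gaps, using the launch prices listed above and the ROG Swift’s $759 MSRP (street prices will of course drift over time):

    # Price gap between the WQHD 144Hz FreeSync launch displays and the
    # $759 ASUS ROG Swift (G-SYNC), using the MSRPs quoted in the article.
    rog_swift = 759
    freesync_wqhd = {"Acer XG270HU": 499, "BenQ XL2730Z": 599}

    for model, price in freesync_wqhd.items():
        print(f"{model}: ${rog_swift - price} less than the ROG Swift")
    # -> Acer XG270HU: $260 less; BenQ XL2730Z: $160 less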



Based on pricing alone, FreeSync looks poised to give G-SYNC some much needed competition. And it’s not just about the price, as there are other advantages to FreeSync that we’ll cover more on the next page. But for a moment let’s focus just on the AMD FreeSync vs. NVIDIA G-SYNC ecosystems.

Right now NVIDIA enjoys a performance advantage over AMD in terms of GPUs, and along with that they currently carry a price premium, particularly at the high end. While the R9 290X and GTX 970 are pretty evenly matched, the GTX 980 tends to lead by a decent amount in most games. Any users willing to spend $200 extra per GPU to buy a GTX 980 instead of an R9 290X might also be willing to pay $200 more for a G-SYNC compatible display. After all, it’s the only game in town for NVIDIA users right now.

AMD and other companies can support FreeSync, but until – unless! – NVIDIA supports the standard, users will be forced to choose between AMD + FreeSync or NVIDIA + G-SYNC. That’s unfortunate for any users who routinely switch between AMD and NVIDIA GPUs, though the number of people outside of hardware reviewers that regularly go back and forth is minuscule. Ideally we’d see one standard win out and the other fade away (as with Betamax, HD DVD, etc.), but with a one year lead and plenty of money invested it’s unlikely NVIDIA will abandon G-SYNC any time soon.

Prices meanwhile are bound to change, as up to now there has been no competition for NVIDIA’s G-SYNC monitors. With FreeSync finally available, we expect prices for G-SYNC displays will start to come down, and in fact we’re already seeing $40-$125 off the original MSRP for most of the G-SYNC displays. Will that be enough to keep NVIDIA’s proprietary G-SYNC technology viable? Most likely, as both FreeSync and G-SYNC are gamer-focused more than anything; if a gamer prefers NVIDIA, FreeSync isn’t likely to get them to switch sides. But if you don’t have any GPU preference, you’re in the market for a new gaming PC, and you’re planning on buying a new monitor to go with it, R9 290X + FreeSync could save a couple hundred dollars compared to GTX 970 + G-SYNC.

There's something else to consider with the above list of monitors as well: four FreeSync displays are shipping on the official day of launch, and Samsung alone has five more FreeSync displays scheduled for release in the near future. Eleven FreeSync displays in the near term might not seem like a huge deal, but compare that with G-SYNC: even with a one year lead (more or less), NVIDIA currently lists only six displays with G-SYNC support, and the upcoming Acer XB270HU makes for seven. AMD also claims there will be 20 FreeSync compatible displays shipping by the end of the year. In terms of numbers, then, DP Adaptive Sync (and by extension FreeSync) looks to be winning this war.

Comments

  • lordken - Thursday, March 19, 2015 - link

    Are you sure? They would only have to use a different name (if they wanted to). Just as both AMD and Nvidia call and use "DisplayPort" as DisplayPort: they didn't have to come up with their own implementations of it, since DP is standardized by VESA, so they simply used that.
    Or am I missing the point you wanted to make?

    The question is whether it becomes a core/regular part of, let's say, DP 1.4 onwards, since right now it is only optional (aka 1.2a) and not even in DP 1.3 - if I understand that correctly.
  • iniudan - Thursday, March 19, 2015 - link

    Well, the implementation needs to be in their driver; they're not gonna give that to Nvidia. =p
  • chizow - Thursday, March 19, 2015 - link

    So it is also closed/proprietary on an open spec? Gotcha, so I guess Nvidia should just keep supporting their own proprietary solution. Makes sense to me.
  • ddarko - Thursday, March 19, 2015 - link

    You know repeating a falsehood 100 times doesn't make it true, right?
  • chizow - Tuesday, March 24, 2015 - link

    You mean like repeating FreeSync can be made backward compatible with existing monitors with just a firmware flash, essentially for Free? I can't remember how many times that nonsense was tossed about in the 15 months it took before FreeSync monitors finally materialized.

    Btw, it is looking more and more like FreeSync is a proprietary implementation based on an open-spec just as I stated. FreeSync has recently been trademarked by AMD so there's not even a guarantee AMD would allow Nvidia to enable their own version of Adaptive-Sync on FreeSync (TM) branded monitors.
  • ddarko - Thursday, March 19, 2015 - link

    From the PC Perspective article you've been parroting around like gospel all day today:

    "That leads us to AMD’s claims that FreeSync doesn’t require proprietary hardware, and clearly that is a fair and accurate benefit of FreeSync today. Displays that support FreeSync can use one of a number of certified scalars that support the DisplayPort AdaptiveSync standard. The VESA DP 1.2a+ AdaptiveSync feature is indeed an open standard, available to anyone that works with VESA while G-Sync is only offered to those monitor vendors that choose to work with NVIDIA and purchase the G-Sync modules mentioned above."

    http://www.pcper.com/reviews/Displays/AMD-FreeSync...

    That's the difference between an open and closed standard, as you well know but are trying to obscure with FUD.
  • chizow - Friday, March 20, 2015 - link

    @ddarko, it says a lot that you quote the article but omit the actually relevant portion:

    "Let’s address the licensing fee first. I have it from people that I definitely trust that NVIDIA is not charging a licensing fee to monitor vendors that integrate G-Sync technology into monitors. What they do charge is a price for the G-Sync module, a piece of hardware that replaces the scalar that would normally be present in a modern PC display. It might be a matter of semantics to some, but a licensing fee is not being charged, as instead the vendor is paying a fee in the range of $40-60 for the module that handles the G-Sync logic."

    And more on that G-Sync module AMD claims isn't necessary (but we in turn have found out a lot of what AMD said about G-Sync turned out to be BS even in relation to their own FreeSync solution):

    "But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by change the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."

    In summary, AMD's own proprietary spec just isn't as good as Nvidia's.
  • Crunchy005 - Friday, March 20, 2015 - link

    AMD's spec is not proprietary, so stop lying. Also I love how you just quoted in context what you quoted out of context in an earlier comment. The only argument you have against FreeSync is ghosting, and as many people have pointed out, it is not an inherent issue with FreeSync but with the monitors themselves. The example given there shows three different displays that are all affected differently. The LG and BenQ both show ghosting differently but use the same FreeSync standard, so something else is different here, not FreeSync. On top of that, the LG is $100 less than the Asus and the BenQ $150 less for the same features and more inputs. I don't see how a better, more well-rounded monitor that offers variable refresh rates with more features and costs less is a bad thing. From the consumer side of things that is great! A few ghosting issues that I'm sure are hardly noticeable to the average user are not a major issue. The videos shown there are taken at a high frame rate and slowed down, then put into a compressed format and thrown on YouTube in what is a very jerky, hard-to-see video - great example for your only argument. If the tech industry could actually move away from proprietary/patented technology, and maybe try to actually offer better products instead of "good enough" products that force customers into choosing and being locked into one thing, we could be a lot farther along.
  • chizow - Friday, March 20, 2015 - link

    Huh? How do you know Nvidia can use FreeSync? I am pretty sure AMD has said Nvidia can't use FreeSync; if they decide to use something with DP 1.2a Adaptive Sync they have to call it something else and create their own implementation, so clearly it is not an Open Standard as some claim.

    And how is it not an issue inherent with FreeSync? Simple test that any site like AT that actually wants answers can do:

    1) Run these monitors with Vsync On.
    2) Run these monitors with Vsync Off.
    3) Run these monitors with FreeSync On.

    Post results. If these panels only exhibit ghosting in 3), then obviously it is an issue caused by FreeSync. Especially when these same panel makers (the 27" 1440p BenQ is apparently the same AU Optronics panel as the ROG Swift) have panels on the market, both non-FreeSync and G-Sync, that have no such ghosting.

    And again you mention the BenQ vs. the Asus, well guess what? Same panel, VERY different results. Maybe its that G-Sync module doing its magic, and that it actually justifies its price. Maybe that G-Sync module isn't bogus as AMD claimed and it is actually the Titan X of monitor scalers and is worth every penny it costs over AMD FreeSync if it is successful at preventing the kind of ghosting we see on AMD panels, while allowing VRR to go as low as 1FPS.

    Just a thought!
  • Crunchy005 - Monday, March 23, 2015 - link

    Same panel, different scalers. AMD just uses the standard built into DisplayPort; the scaler handles the rest there, so it isn't necessarily FreeSync but the variable refresh rate technology in the scaler that would be causing the ghosting. So again, not AMD but the manufacturer.

    "If these panels only exhibit ghosting in 3), then obviously it is an issue caused by FreeSync"
    Haven't seen this and you haven't shown us either.

    "Maybe its that G-Sync module doing its magic"
    This is the scaler, so the scaler (not made by AMD) that supports the VRR standard AMD uses is what is controlling that panel, not FreeSync itself. Hence an issue outside of AMD's control. Stop lying and saying it is an issue with AMD. Nvidia fanboys lying; gotta keep them on the straight and narrow.
