Introduction to FreeSync and Adaptive Sync

The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried to do it before. Certainly there are hurdles to overcome, e.g. what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.

The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that's fine for normal applications, in games there are often cases where a new frame isn't ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top portion of the screen shows the previous frame and the bottom portion shows the next frame (or multiple frames in some cases).
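To make the stutter/tearing tradeoff concrete, here is a minimal timing sketch. It assumes a fixed 60Hz panel and illustrative frame times; the function names and numbers are invented for this example rather than taken from any real driver or API.

```python
import math

REFRESH_MS = 1000 / 60  # ~16.7ms between updates on a fixed 60Hz panel

def vsync_present_times(render_times_ms):
    """With vsync on, each frame waits for the next refresh tick; a frame
    that misses its tick holds the previous image for a whole extra
    interval, which is perceived as stutter."""
    presents, prev = [], 0.0
    for t in render_times_ms:
        ready = prev + t  # frame finishes rendering after the last present
        prev = math.ceil(ready / REFRESH_MS) * REFRESH_MS  # next tick at/after ready
        presents.append(prev)
    return presents

def tear_fraction(ready_ms):
    """With vsync off, the buffer swap lands mid-scanout; the fraction of
    the refresh interval already elapsed is roughly how far down the
    screen the tear line appears."""
    return (ready_ms % REFRESH_MS) / REFRESH_MS

frames = [10, 12, 20, 11]  # one slow frame (20ms > 16.7ms)
shown = vsync_present_times(frames)
print([round(b - a, 1) for a, b in zip(shown, shown[1:])])  # [16.7, 33.3, 16.7]
print(round(tear_fraction(20), 2))  # 0.2 -> tear line ~20% down the screen
```

An adaptive refresh display sidesteps both problems: the panel simply waits for the 20ms frame and refreshes the moment it arrives, so nothing is held for a double interval and no tear line is introduced (within the panel's supported refresh range).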

Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn't creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, so that rules out a large chunk of the market. Not surprisingly, the result was that G-SYNC took a bit of time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.

Over the past year we've seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher spec displays like the 1440p144 ASUS ROG Swift cost $759, compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500, whereas the 4Kp60 Acer XB280HK will set you back $750.

When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time from the initial announcement to actual shipping hardware, but AMD has worked with VESA to standardize the underlying technology, Adaptive-Sync, as an optional part of DisplayPort 1.2a, and they aren't collecting any royalties on it. That's the "Free" part of FreeSync, and while it doesn't necessarily guarantee that FreeSync enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.

There may be some additional costs associated with making a FreeSync display, though these mostly come from using higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built FreeSync (DisplayPort Adaptive-Sync) support into their latest products, and since most displays require a scaler anyway there's no significant price increase. But if you compare a FreeSync 1440p144 display to a "normal" 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let's look at what's officially announced right now before we continue.

Comments

  • Cerb - Saturday, March 21, 2015 - link

    If it's not working, this is just as wrong. Since it's fairly close at 24, 25, or almost 30 fps, you will see the tear line creeping up or down the image if vsync isn't on. It's exceptionally obvious. Usually you will just see skipped frames on Windows, since the compositor forces vsync for the desktop, and this is generally well supported by any video player's playback mechanisms. The skipped frames become more noticeable as you watch, but aren't nearly as bad as tearing.
  • looncraz - Saturday, March 21, 2015 - link

    Tearing can happen anytime.

    I'm writing a compositing engine for HaikuOS and I would LOVE to be able to control the refresh timing! When a small update occurs and the frame buffer is ready, I'd swap it, trigger a monitor refresh, and then be on my way right away.

    As it stands, I have to either always be a frame behind, or try to guess how long composing the frame buffer from the update stream will take before I know anything about what that update stream will be like, so I know when to wake up the composite engine control loop.

    That means that even on normal day-to-day stuff, like opening a menu, dragging icons, playing solitaire, browsing the web, etc., FreeSync would be quite useful. As it stands, the best I can do is hope the frame is ready for the next interval, or wait until the next refresh is complete to swap frame buffers - which means that the data on screen is always a frame out of date (or more).

    At 60Hz that is a fixed delay multiplier of 16.7ms, with a minimum multiplicand of 1. Going with higher refresh rates on the desktop is just wasteful (we don't really need 60Hz, except for things to feel smooth due to the delay multiplier effect of the refresh rate).

    If I could use the whole range from 45Hz to 75Hz, our (virtual) multiplicand could be 0.8-1.33 instead of exactly 1 or 2. That makes a significant difference in jitter (see the sketch after the comments).

    Everything would be smoother - and we could drop down to a 45Hz refresh interval by default, saving energy in the process, instead of being stuck at a fixed cadence.
  • Cerb - Saturday, March 21, 2015 - link

    Wrong. It is generating the visuals, and doing so in exactly the same way as far as any of this technology is concerned, and screen tearing does happen, because those frame rates differ from our common refresh rates.
  • soccerballtux - Friday, March 20, 2015 - link

    Considering the power saving impact it's had on the mobile sector (no sense rendering to pixels that haven't changed, just render to the ones that have), it most definitely would have a significant impact on the laptop market and would be a great 'green' tech in general.
  • erple2 - Friday, March 20, 2015 - link

    No value, except to the consumer who doesn't have to pay the (current) $160+ premium for G-SYNC. Now, if AMD had a gfx card competitor to the GTX 980, it'd be marvelous, and a no brainer. Given that the cost is apparently minimal to implement, I don't see that as a problem. Even if you think it's not value added, panel manufacturers shoved the pointless 3D down everyone's throat, so clearly they're not averse to that behavior.
  • mdriftmeyer - Sunday, March 22, 2015 - link

    It has value for any animated sequence.
  • JonnyDough - Monday, March 23, 2015 - link

    Inside of gaming it has plenty of value - who even cares about the rest? Gaming was a $25.1 billion market in 2010 (ESA annual report). I'd take a billionth of that pie and go out for a nice meal, wouldn't you?
  • dragonsqrrl - Thursday, March 19, 2015 - link

    ... No current or upcoming DP spec ...requires... adaptive sync. It's optional, not sure how else you could interpret that, especially when you take the comment I responded to into consideration.
  • eddman - Friday, March 20, 2015 - link

    Wait a minute; that only applies to monitors, right? It'd suck to buy a DP 1.2a/1.3 video card and find out that it cannot do adaptive-sync.
  • tobi1449 - Friday, March 20, 2015 - link

    Plus FreeSync != Adaptive Sync
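Picking up looncraz's delay multiplier point from above, here is a quick back-of-the-envelope sketch of that arithmetic. The numbers are illustrative only and the variable names are invented for this example.

```python
import math

BASE_HZ = 60
BASE_PERIOD_MS = 1000 / BASE_HZ  # ~16.7ms: the fixed "delay multiplier"

# Fixed 60Hz refresh: a finished frame waits for the next tick, so display
# latency is always an integer multiple of 16.7ms (multiplicand 1, 2, ...).
for ready_ms in (5.0, 18.0):
    shown_ms = math.ceil(ready_ms / BASE_PERIOD_MS) * BASE_PERIOD_MS
    print(f"fixed 60Hz: frame ready at {ready_ms}ms -> shown at {shown_ms:.1f}ms")

# Adaptive 45-75Hz range: a refresh can be triggered anywhere in the window
# [1000/75, 1000/45] ms, i.e. roughly 0.8x-1.33x of the 60Hz period.
lo_ms, hi_ms = 1000 / 75, 1000 / 45
print(f"adaptive window: {lo_ms:.1f}-{hi_ms:.1f}ms "
      f"({lo_ms / BASE_PERIOD_MS:.2f}x-{hi_ms / BASE_PERIOD_MS:.2f}x)")
```

Within that 13.3-22.2ms window a compositor could trigger a refresh the moment its frame buffer is ready, instead of rounding every update up to the next 16.7ms tick.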
