AMD Shows Off Multiple FreeSync Displays
by Jarred Walton on January 8, 2015 12:43 AM EST

We met with AMD and, among other things, one item they wanted to show us was the essentially final versions of several upcoming FreeSync displays. Overall, AMD and their partners are still on target to launch FreeSync displays this quarter, with AMD telling us that as many as 11 displays could hit the market before the end of March. For CES, AMD had several displays running, including a 28” 60Hz 4K display from Samsung, a 27” 144Hz QHD display from BenQ, and a 75Hz 2560x1080 34” display from LG. The three displays were all running on different GPUs: an R9 285 for the BenQ, an R9 290X for the Samsung, and an A10-7850K APU for the LG UltraWide.
More important than the displays and the hardware powering them is the fact that FreeSync worked just as you’d expect. AMD had several demos running, including a tearing test with a large vertical block of red moving across the display, and a greatly enhanced version of their earlier windmill demo. We could enable/disable FreeSync and V-SYNC, set the target rendering speed anywhere from 40 to 55 Hz in 5Hz increments, or have it vary (sweep) over time between 40 Hz and 55 Hz. The Samsung display meanwhile was even able to show the current refresh rate in its OSD, and with FreeSync enabled we could watch the fluctuations, as can be seen here. [Update: Video of demo has been added below.]
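To picture what the sweep portion of the demo is doing, here is a minimal Python sketch (an illustration of the concept, not AMD's demo code) of a render rate oscillating between 40 and 55 Hz while a variable refresh display simply refreshes whenever a frame arrives; the four-second sweep period is an assumed value.

```python
# Illustrative sketch of a render rate sweeping between 40 and 55 Hz,
# as in AMD's demo; the sweep period is an assumption for this example.
import math

MIN_HZ, MAX_HZ = 40.0, 55.0   # sweep bounds from the demo
SWEEP_PERIOD_S = 4.0          # assumed: one full sweep every 4 seconds

def render_rate_at(t: float) -> float:
    """Render rate oscillating smoothly between MIN_HZ and MAX_HZ."""
    mid = (MIN_HZ + MAX_HZ) / 2.0
    amp = (MAX_HZ - MIN_HZ) / 2.0
    return mid + amp * math.sin(2.0 * math.pi * t / SWEEP_PERIOD_S)

t = 0.0
while t < 0.5:
    fps = render_rate_at(t)
    frame_time = 1.0 / fps    # the GPU delivers the next frame after this long
    # With FreeSync the display refreshes when the frame arrives, so the
    # instantaneous refresh rate simply tracks the render rate.
    print(f"t={t:5.3f}s  render={fps:5.1f} fps  ->  refresh={fps:5.1f} Hz")
    t += frame_time
```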
Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready, delivers on all of AMD's feature goals, and should be available in the next few months. Meanwhile AMD also took a moment to briefly address the issue of minimum framerates and pixel decay over time, stating that the minimum refresh rate supported will vary on a per-monitor basis, depending on how quickly that panel's pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (33.3ms), while others with pixels that decay more quickly will have a 40Hz (25ms) minimum.
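The frame time figures in parentheses are simply the reciprocal of the refresh rate; a quick Python check makes the relationship explicit:

```python
# Minimum refresh rate vs. maximum time the panel must hold a frame:
# frame time (ms) = 1000 / refresh rate (Hz)
for min_hz in (30, 40):
    hold_ms = 1000.0 / min_hz
    print(f"{min_hz}Hz minimum -> pixels must hold a frame for {hold_ms:.1f}ms")
# 30Hz minimum -> pixels must hold a frame for 33.3ms
# 40Hz minimum -> pixels must hold a frame for 25.0ms
```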
On the retail front, what remains to be seen now is just how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn’t mean there are no additional costs involved in creating a display that works with FreeSync. There’s a need for better panels and other components, which will obviously increase the BoM (Bill of Materials), a cost that will be passed on to consumers.
Perhaps the bigger question, though, will be how much FreeSync displays end up costing compared to their G-SYNC equivalents, as well as whether Intel and others will support the standard. If FreeSync does gain traction, it will also be interesting to see whether NVIDIA begins supporting it or remains committed to G-SYNC. Either way, we should start to see shipping hardware in the near future, and we’ll get answers to many of the remaining questions over the coming year.
118 Comments
dcoca - Thursday, January 8, 2015
Sup dude, something makes me think u work at Nvidia: happy new years

FlushedBubblyJock - Tuesday, February 24, 2015
Sup coca - Makes me think you wish you had an AMD card but can't afford it yet, so you hate rich nVidia users like me.

chizow - Thursday, January 8, 2015
There are still a lot of unanswered questions about FreeSync beyond what is covered here, such as cost, lag/latency, and framerates beyond max refresh. In PCPer's analysis, they mentioned a potential deal-breaker with FreeSync when framerates exceed the monitor's refresh rate. They mention it was a "design decision" by AMD to either force the user to enable Vsync in this situation, or to suffer from screen tearing, thus negating the benefit of variable/dynamic refresh. I'm really not sure why AMD didn't implement a soft/driver framerate cap like Nvidia did with G-Sync. There are also questions about input lag/latency, as G-Sync has held up favorably when tested independently against the fastest panels with V-Sync off. We will have to see how FreeSync fares against them.
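To make the tradeoff concrete, here is a rough Python sketch of the above-max-refresh behaviors being described, based on PCPer's characterization rather than any vendor code; the function and its labels are purely illustrative.

```python
# Rough sketch of above-max-refresh behavior as characterized by PCPer;
# illustrative only, not vendor code.
def above_refresh_result(solution: str, vsync_on: bool) -> str:
    if solution == "FreeSync":
        # Per PCPer, the user is forced to pick one of two compromises:
        return "V-Sync input lag" if vsync_on else "screen tearing"
    if solution == "G-Sync":
        # The driver holds frame delivery at the panel's max refresh instead.
        return "frames capped at max refresh: no tearing, no V-Sync lag"
    raise ValueError(f"unknown solution: {solution}")

for args in (("FreeSync", True), ("FreeSync", False), ("G-Sync", False)):
    print(args, "->", above_refresh_result(*args))
```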
In the end, it will simply come down to which solution is better, as usual. As I expected months ago, and as it looks to be shaping up from these early demos, G-Sync appears to be the superior solution, and as such it will command a justified premium price. These monitor features will just end up being another checkbox that develops each vendor's ecosystem and influences a buying decision; such is the nature of competition.
@Jarred: did AMD mention when you might be getting some review samples? Look forward to seeing some tests and general impressions!
Gigaplex - Thursday, January 8, 2015
"In PCPer's analysis, they mentioned a potential deal-breaker with FreeSync when framerates exceed the monitor's refresh rate. They mention it was a "design decision" by AMD to either force the user to enable Vsync in this situation, or to suffer from screen tearing, thus negating the benefit of variable/dynamic refresh. I'm really not sure why AMD didn't implement a soft/driver framerate cap like Nvidia did with G-Sync."Uh, what do you want it to do? How is a framerate cap any better than v-sync capping the framerate?
chizow - Friday, January 9, 2015
@Gigaplex: the answer is quite simple, you can have a framerate cap WITHOUT Vsync, because in no way are they mutually dependent. The downside on traditional displays is that you still get screen tearing, but on a dynamic refresh rate monitor you won't get tearing; you get the most recent full frame WITHOUT the associated input lag. G-Sync does this; with FreeSync it is less clear, because it appears they are still relying on some form of Vsync (most likely triple buffering for sub-native refresh) which reverts and breaks above scaler-supported refresh rates, forcing them to enable Vsync at the application level.
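A driver- or application-level cap of the kind being described could look like this minimal Python sketch; render_frame() and present_frame() are hypothetical placeholders, and the 144Hz figure is an assumed panel maximum:

```python
# Minimal sketch of a framerate cap that is independent of V-Sync: the
# renderer waits out the remainder of the cap interval before presenting,
# so frames never outrun the panel's maximum refresh. render_frame() and
# present_frame() are hypothetical placeholders for real rendering calls.
import time

CAP_HZ = 144.0                # assumed panel maximum refresh rate
CAP_INTERVAL = 1.0 / CAP_HZ   # minimum time between presented frames

def render_frame():
    pass                      # placeholder: draw the next frame

def present_frame():
    pass                      # placeholder: swap/present the frame

next_present = time.perf_counter()
for _ in range(10):
    render_frame()
    now = time.perf_counter()
    if now < next_present:
        time.sleep(next_present - now)    # throttle to the cap
    present_frame()                       # whole frame: no tear, no V-Sync queue
    # Schedule the next present; if we fell behind, restart from "now".
    next_present = max(next_present + CAP_INTERVAL, time.perf_counter())
```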
Honestly, to me it looks like they've basically implemented Nvidia's older frame rate control technology, Adaptive Vsync, in hardware at the scaler level without the "automatic" switching of Vsync On/Off. As a refresher, Adaptive Vsync was introduced by Nvidia with Kepler; the driver automatically enabled Vsync when the framerate reached the monitor's native refresh rate and disabled it when the framerate fell below, trading a bit of tearing for less stutter. I am sure this work was instrumental in coming out with their ultimate solution, G-Sync, which bypasses Vsync and the problems associated with it altogether.
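A minimal sketch of that Adaptive Vsync policy (a reading of Nvidia's published description, not their driver code; the 60Hz panel is an assumed example):

```python
# Adaptive Vsync policy sketch: Vsync on while the render rate can hold
# the panel's native refresh, off once it drops below (assumed 60Hz panel).
REFRESH_HZ = 60.0

def vsync_enabled(current_fps: float) -> bool:
    """On at or above native refresh; off below it."""
    return current_fps >= REFRESH_HZ

for fps in (75, 60, 58, 45, 30):
    state = "ON (no tearing)" if vsync_enabled(fps) else "OFF (no stutter)"
    print(f"{fps:3.0f} fps -> Vsync {state}")
```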
Creig - Thursday, January 8, 2015
G-sync only works within a certain frequency range as well. It's no different than FreeSync in that respect. And AMD is developing a Dynamic Frame Rate Control feature for their drivers. So G-sync and FreeSync do appear to be fairly closely matched.

Well, except for the fact that G-sync requires a custom FPGA board to be installed (extra cost), requires this FPGA board to be tuned to the panel of the monitor it's going to be installed in (extra cost) and requires a licensing fee to be paid to Nvidia (extra cost). Since the only additional cost for FreeSync is an updated scaler, G-sync will always be more expensive than FreeSync.
Not to mention G-sync locks you into Nvidia GPUs only, while FreeSync is based on an industry standard which any company is free to develop. And Adaptive-Sync (which FreeSync is based on) also has power saving benefits. Since this is an industry standard, I would expect Intel to eventually jump on the bandwagon as well.
Right out of the gate, FreeSync is looking to eclipse G-sync. AMD has reported that there could be as many as eleven different FreeSync capable monitors on the market by the end of March. How many models of G-sync monitors have been released in the 12 months since G-sync came out? Seven?
Yes, we need objective reviews between G-sync and FreeSync before we can make a meaningful comparison. But G-sync seems far from being "the superior solution" as you put it.
invinciblegod - Thursday, January 8, 2015
There's no point in touting that FreeSync is an open standard until others adopt it. There are so many open standards that go unused and are basically proprietary if only one party uses them. MicroUSB type A, for example.

chizow - Thursday, January 8, 2015
G-Sync works in a wider frequency range, and by design it can't go out of this range, never forcing the user to choose between negating the benefit of dynamic refresh by enabling V-Sync or suffering tearing, like FreeSync does. And yes, AMD has recently been touting its DFRC feature as a built-in framecap, but it's curious why they didn't just make this a soft cap enabled globally for FreeSync. It's most likely because FreeSync is still tied to and limited by the monitor refresh, so exceeding the refresh intervals reverts back to tearing, but I am sure we will get more in-depth info on this once they actually hit the market or review samples go out.

The G-Sync module with custom FPGA and onboard DRAM is necessary, but also most likely why Nvidia ends up with the better solution in G-Sync. It's hard to emulate in software what is being done in hardware, which is what I think AMD is running into with their FreeSync solution. Not to mention, we still don't know the price difference between the "expensive" G-Sync module vs. the custom, premium scalers AMD and their partners took 8-12 months to bring to market, but they certainly have changed their tune from the days they were claiming FreeSync would be an essentially free upgrade that might only need a firmware update on existing monitors. :D
And FreeSync locks you in just as much as G-Sync; as long as no one else supports FreeSync, it's in the exact same boat. A worse boat, actually, since Nvidia commands an overwhelming majority of the dGPU market at a 70/30 rate for supported GPUs, and not even all AMD GPUs in the same generational span (again, I hope I don't need to redefine this for you) are capable of supporting FreeSync. Meanwhile, all Nvidia GPUs from Kepler onward support G-Sync. Intel has shown no interest in FreeSync, which is really no surprise given its main benefit is gaming, a market Intel doesn't really address with their IGPs; they got all they needed out of the eDP spec when they developed it for reduced-refresh power savings in their laptop displays years ago.
And right out of the gate FreeSync is looking to eclipse G-Sync? Haha, how so? Nvidia has just as many G-Sync panels on the market TODAY as AMD has ANNOUNCED, and from what we have seen already, FreeSync is inferior to G-Sync in every way but *POSSIBLY* cost. We won't even know if FreeSync addresses the other major problem associated with V-Sync, input lag/latency, until reviewers and interested sites get their hands on them. So yes, I guess a solution that is already demonstrably inferior, is still an unknown commodity in terms of cost, is still not available for purchase, and still has major questions about its compatibility and implementation has somehow eclipsed a solution that is available today and has done everything it claimed to do since Day 1 in G-Sync? Fanboy fantasy at its finest.
But yes, we do need more objective reviews to further dissect and expose the differences between G-Sync and FreeSync. G-Sync has been on the market for about a year and has held up against such scrutiny; all the questions surround FreeSync, and from what we have already seen, it is falling short.
Creig - Thursday, January 8, 2015
The stated compatible refresh rates for FreeSync are from 9Hz to 240Hz. I haven't been able to find any upper or lower limits listed for G-sync, but I fail to see how it could possibly work "in a wider frequency range" than FreeSync as you claim it can.

FreeSync does not require a custom FPGA board, does not require custom tuning for each panel, does not require that royalties be paid back to AMD. Therefore, given two otherwise identical monitors, the FreeSync display will always be less expensive than the G-sync alternative. That's a huge win for FreeSync right there. And you are wrong about AMD claiming that it would only be a free upgrade for existing monitors. They explicitly stated that although their test model monitor was able to utilize FreeSync with only a firmware update, most likely this firmware would NOT be available through the manufacturer.
From the Anandtech "FreeSync Monitor Prototype" article:
"At this point AMD is emphasizing that while they were able to get FreeSync up and running on existing hardware, owners shouldn’t be expecting firmware updates as this is very unlikely to happen (though this is ultimately up to monitor manufacturers. Instead AMD is using it to demonstrate that existing panels and scalers already exist that are capable of variable refresh, and that retail monitors should not require significant/expensive technology upgrades."
You can claim that AMD's FreeSync "locks" you in to their hardware, but only because Nvidia has stated that they refuse to support Adaptive-Sync in favor of their proprietary version. G-sync is not an industry standard. Adaptive-Sync is an industry standard. We don't know yet whether or not Intel plans to support AS in the future, but it would be almost inconceivable for them not to. Adaptive-Sync benefits lower end systems more than dedicated gaming machines. Laptops and business computers (more often than not) fall into the "lower end system" category. And Intel sells a LOT of hardware that ends up in laptops and entry level computers.
Naturally not all AMD video cards will support FreeSync. Just as not all Nvidia cards support G-sync. There will always be a cutoff line for hardware that simply will not be compatible with new technology. For G-sync, it's Kepler and newer. For AMD, I believe it's Tahiti and newer. I don't know where you're trying to go with this one. It's simply the way of things in the technology world that not all old models will support new features.
And I stand by my assertion that FreeSync appears to have a better potential for success than G-sync. It's taken Nvidia an entire year to get as many models out as AMD has announced that will support FreeSync within months. And while Nvidia may have a majority in the dGPU market, it has only a 20% share in the overall GPU market with AMD taking 20% and Intel holding 60%. And as it's the lower end systems that will be typically running at lower frame rates, it's these systems that will be utilizing the benefits of Adaptive-Sync more often than dedicated gaming computers. Intel has not yet weighed in on whether or not they will eventually support Adaptive-Sync, but being an industry standard, they may do so whenever they wish. Nvidia is free to adapt their GPUs for AS as well. That's the beauty of it being an industry standard.
I haven't read any FreeSync reviews yet and I'm almost certain you haven't either. In fact, you have absolutely NO IDEA if there is any visual difference whatsoever between G-sync and FreeSync. To make a claim such as "FreeSync is inferior to G-Sync in every way but *POSSIBLY* cost" without any reviews at all to back yourself up simply makes you look daft. I plan to wait until the professional reviewers get done with their comparisons before I'll make any claims as to the performance of FreeSync vs G-sync. But I'll stand by my previous assertion that FreeSync's future is already looking brighter than G-sync's.
Creig - Thursday, January 8, 2015
Well, HotHardware had a crew at CES who looked at the AMD FreeSync displays and had the following to say:

"AMD expects there to be a total of 11 FreeSync displays available by March at varying refresh rates, resolutions, and panel sizes, including IPS panel options and the aforementioned 144Hz gaming panels. Obviously a full comparison between G-Sync and FreeSync will have to wait for head-to-head hardware, but our team reports that the two standards appeared to perform identically.
Assuming that continues to be true, AMD could have an advantage with this feature -- FreeSync reportedly doesn't add any additional cost to display panels, whereas the ASIC hardware required for G-Sync reportedly increases panel retail cost by ~$150. Of course, it's ultimately up to the manufacturers themselves whether or not to charge a premium for FreeSync monitors -- there's just no baked-in cost increase from specialized hardware."
As I said, FreeSync is already looking pretty good. Just waiting on the official reviews now.