The AMD FreeSync Review
by Jarred Walton on March 19, 2015 12:00 PM EST

FreeSync Displays
There are four FreeSync displays launching today, one each from Acer and BenQ, and two from LG. Besides the displays launching today, seven additional displays should show up in the coming weeks (months?). Here’s the current list of FreeSync compatible displays, with pricing where it has been disclosed.
FreeSync Compatible Displays

| Manufacturer | Model | Diagonal | Resolution | Refresh | Panel | Price |
|----------------|----------|----------|------------|----------|-------|-------|
| Acer | XG270HU | 27" | 2560x1440 | 40-144Hz | TN | $499 |
| BenQ | XL2730Z | 27" | 2560x1440 | 40-144Hz | TN | $599 |
| LG Electronics | 34UM67 | 34" | 2560x1080 | 48-75Hz | IPS | $649 |
| LG Electronics | 29UM67 | 29" | 2560x1080 | 48-75Hz | IPS | $449 |
| Nixeus | NX-VUE24 | 24" | 1920x1080 | 144Hz | TN | ? |
| Samsung | UE590 | 28" | 3840x2160 | 60Hz | TN | ? |
| Samsung | UE590 | 23.6" | 3840x2160 | 60Hz | TN | ? |
| Samsung | UE850 | 31.5" | 3840x2160 | 60Hz | TN? | ? |
| Samsung | UE850 | 28" | 3840x2160 | 60Hz | TN? | ? |
| Samsung | UE850 | 23.6" | 3840x2160 | 60Hz | TN? | ? |
| ViewSonic | VX2701mh | 27" | 1920x1080 | 144Hz | TN | ? |
The four displays launching today cover two primary options. For those who want higher refresh rates, Acer and BenQ have TN-based 40-144Hz displays. Both are 27” WQHD displays, so it’s quite probable that they’re using the same panel, perhaps even the one we’ve seen in the ASUS ROG Swift. The two LG displays meanwhile venture out into new territory as far as adaptive refresh rates are concerned. LG has both a smaller 29” and a larger 34” 2560x1080 (UW-UXGA) display, and both sport IPS panels (technically AU Optronics' AHVA, but it's basically the same as IPS).
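The refresh ranges above matter in practice, because a variable refresh display can only track the GPU one-to-one while the frame rate stays inside the panel's window. As a minimal, hypothetical sketch (not AMD's or NVIDIA's actual driver logic), here is how a given frame rate relates to a window like the 34UM67's advertised 48-75Hz:

```python
def vrr_behavior(fps: float, min_hz: float, max_hz: float) -> str:
    """Classify how a variable-refresh display handles a given frame rate.

    This is a simplified illustration: real drivers add options like
    frame doubling or vsync fallback at the window edges.
    """
    if min_hz <= fps <= max_hz:
        return "matched"        # refresh rate tracks the GPU exactly
    if fps > max_hz:
        return "above-range"    # capped, tearing, or vsync, depending on settings
    return "below-range"        # falls back to fixed refresh (tearing or stutter)

# Using the LG 34UM67's advertised 48-75Hz window:
for fps in (30, 48, 60, 75, 100):
    print(fps, vrr_behavior(fps, 48.0, 75.0))
```

The below-range case is the interesting one for the LG panels: at 47 FPS and under, the display can no longer match the GPU, which is why the width of the window is worth checking before buying.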
The other upcoming displays all appear to be using TN panels, though it's possible Samsung might offer PLS. The UE590 appears to be TN for certain, with 170/160 degree viewing angles according to DigitalTrends. The UE850 on the other hand is targeted more at imaging professionals, so PLS might be present; we'll update if we can get any confirmation of panel type.
One of the big benefits with FreeSync is going to be support for multiple video inputs – the G-SYNC displays so far are all limited to a single DisplayPort connection. The LG displays come with DisplayPort, HDMI, and DVI-D inputs (along with audio in/out), and the Acer is similarly equipped. Neither one has any USB ports, though the BenQ does have a built-in USB hub with ports on the side.
Our testing was conducted on the 34UM67, and let me just say that it’s quite the sight sitting on my desk. I’ve been bouncing between the ASUS ROG Swift and Acer XB280HK for the past several months, and both displays have their pros and cons. I like the high resolution of the Acer at times, but I have to admit that my aging eyes often struggle when running it at 4K and I have to resort to DPI scaling (which introduces other problems). The ASUS on the other hand is great with its high refresh rates, and the resolution is more readable without scaling. The big problem with both displays is that they’re TN panels, and having come from using a 30” IPS display for the past eight years that’s a pretty painful compromise.
Plopping the relatively gigantic 34UM67 on my desk is in many ways like seeing a good friend again after a long hiatus. “Dear IPS (AHVA), I’ve missed having you on my desktop. Please don’t leave me again!” For the old and decrepit folks like me, dropping to 2560x1080 on a 34” display also means reading text at 100% zoom is not a problem. But when you’re only a couple feet away, the relatively low DPI does make the pixels much more visible to the naked eye. It even has built-in speakers (though they’re not going to compete with any standalone speakers in terms of audio quality).
The launch price of $649 is pretty impressive; we’ve looked at a few other 21:9 displays in the past, and while the resolution doesn’t match LG’s 34UM95, the price is actually $50 less than the LG 34UM65’s original $699 MSRP (though it’s now being sold at $599). So at most, it looks like putting in the new technology to make a FreeSync display costs $50, and probably less than that. Anyway, we’ll have a full review of the LG 34UM67 in the coming weeks, but for now let’s return to the FreeSync discussion.
Pricing vs. G-SYNC
It certainly appears that AMD and their partners are serious about pricing FreeSync aggressively, though there aren’t direct comparisons available for some of the models. The least expensive FreeSync displays start at just $449, which matches the least expensive G-SYNC display (AOC G2460PG) on price but with generally better specs (29” 2560x1080 and IPS at 75Hz vs. 24” 1920x1080 TN at 144Hz). Looking at direct comparisons, the Acer XG270HU and BenQ XL2730Z are WQHD 144Hz panels, which pits them against the $759 ASUS ROG Swift that we recently reviewed, giving FreeSync a $160 to $260 advantage. As AMD puts it, that’s almost enough for another GPU (depending on which Radeon you’re using, of course).
Based on pricing alone, FreeSync looks poised to give G-SYNC some much needed competition. And it’s not just about the price, as there are other advantages to FreeSync that we’ll cover more on the next page. But for a moment let’s focus just on the AMD FreeSync vs. NVIDIA G-SYNC ecosystems.
Right now NVIDIA enjoys a performance advantage over AMD in terms of GPUs, and along with that they currently carry a price premium, particularly at the high end. While the R9 290X and GTX 970 are pretty evenly matched, the GTX 980 tends to lead by a decent amount in most games. Any users willing to spend $200 extra per GPU to buy a GTX 980 instead of an R9 290X might also be willing to pay $200 more for a G-SYNC compatible display. After all, it’s the only game in town for NVIDIA users right now.
AMD and other companies can support FreeSync, but until – unless! – NVIDIA supports the standard, users will be forced to choose between AMD + FreeSync or NVIDIA + G-SYNC. That’s unfortunate for any users who routinely switch between AMD and NVIDIA GPUs, though the number of people outside of hardware reviewers that regularly go back and forth is minuscule. Ideally we’d see one standard win out and the other fade away (as with Betamax and HD DVD), but with a one year lead and plenty of money invested it’s unlikely NVIDIA will abandon G-SYNC any time soon.
Prices meanwhile are bound to change, as up to now there has been no competition for NVIDIA’s G-SYNC monitors. With FreeSync finally available, we expect prices for G-SYNC displays will start to come down, and in fact we’re already seeing $40-$125 off the original MSRP for most of the G-SYNC displays. Will that be enough to keep NVIDIA’s proprietary G-SYNC technology viable? Most likely, as both FreeSync and G-SYNC are gamer focused more than anything; if a gamer prefers NVIDIA, FreeSync isn’t likely to get them to switch sides. But if you don’t have any GPU preference, you’re in the market for a new gaming PC, and you’re planning on buying a new monitor to go with it, R9 290X + FreeSync could save a couple hundred dollars compared to GTX 970 + G-SYNC.
There's something else to consider with the above list of monitors as well: four currently shipping FreeSync displays exist on the official day of launch, and Samsung alone has five more FreeSync displays scheduled for release in the near future. Eleven FreeSync displays in the near term might not seem like a huge deal, but compare that with G-SYNC: even with a one year lead (more or less), NVIDIA currently only lists six displays with G-SYNC support, and the upcoming Acer XB270HU makes for seven. AMD also claims there will be 20 FreeSync compatible displays shipping by the end of the year. In terms of numbers, then, DP Adaptive Sync (and by extension FreeSync) look to be winning this war.
350 Comments
chizow - Thursday, March 19, 2015 - link
See link: http://www.pcper.com/image/view/54234?return=node%...

Also: still unaddressed concerns with how and why FreeSync is still tied to Vsync and how this impacts latency.
happycamperjack - Thursday, March 19, 2015 - link
The ghosting problem actually has nothing to do with the G-Sync and FreeSync technologies like the article said, but has more to do with the components in the monitor. So if Asus made a ROG Swift FreeSync version of the same monitor, there would've been no ghosting, just like the G-SYNC version. So your example is invalid.

chizow - Friday, March 20, 2015 - link
@happycamperjack. Again, incorrect. Why is it that panels from the SAME manufacturers, that possibly even use the same panels, using the same prevailing panel technologies of this time, exhibit widely different characteristics under variable refresh? Maybe that magic G-Sync module that AMD claims is pointless is actually doing something... like controlling the drive electronics that control pixel response variably in response to changing framerates. Maybe AMD needs another 18 months to refine those scalers with the various scaler mfgs?

http://www.pcper.com/reviews/Displays/AMD-FreeSync...
"Modern monitors are often tuned to a specific refresh rate – 144 Hz, 120 Hz, 60 Hz, etc. – and the power delivery to pixels is built to reduce ghosting and image defects. But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by changing the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates."
BenQ for example makes a fine G-Sync monitor, and multiple high refresh 3D Vision monitors well known for their lack of ghosting. Are you going to tell me that suddenly they are using inferior panel tech that can't handle ghosting? This is 2015 and TN panels we are talking about here right? This kind of ghosting has not been seen since circa 2007 when PMVA was all the rage.
AnnihilatorX - Thursday, March 19, 2015 - link
chizow, stop your biased preconceptions and actually read the article.

AnnihilatorX - Thursday, March 19, 2015 - link
I will summarize it for you in case your prejudice clouds your comprehension:

1) At no point does the article find any performance advantage from FreeSync or G-Sync (AMD claims a 0.5-1% advantage, but that's too small to detect, so we disregard it)
2) FreeSync has better monitor choices, including IPS and ones with better specs in general
3) FreeSync monitors are about USD 200 cheaper, almost half the cost of a decent graphics card
4) FreeSync monitors have on-screen displays (OSDs) that work; G-Sync monitors don't, due to implementation
5) FreeSync has better potential for future support, especially in laptops, because of zero royalty fees and only minor updates to hardware
6) FreeSync gives users the option to choose whether they want to enable Vsync or not; G-Sync locks Vsync on. This means the user can have better latency if they can stand tearing. The important thing is the option; having the option is always advantageous
7) AMD claims FreeSync works from 9Hz-240Hz, whereas G-Sync only works from 30Hz to 144Hz.
chizow - Thursday, March 19, 2015 - link
@AnnihilatorX

1) You assume the tests conducted here are actually relevant.
2) No, they don't. Nvidia has an IPS in the works that may very well be the best of all, but in the meantime, it is obvious that for whatever reason the FreeSync panels are subpar compared to the G-Sync offerings. Courtesy of PCPer: http://www.pcper.com/image/view/54234?return=node%...
3) Sure they are cheaper, but they also aren't as good, and certainly not "Free", as there is a clear premium compared to non-FreeSync panels, and certainly no firmware flash is going to change that. Also, that $200 is going to have to be spent on a new GCN 1.1+ AMD graphics card anyway, as anyone who doesn't already own a newer AMD card will have to factor that into their decision. Meanwhile, G-Sync supports everything from Nvidia from Kepler on. Nice and tidy (and dominant in terms of installed user base).
4) OSDs, scalers and such add input lag; while having multiple inputs is nice, OSDs are a feature gaming purists can live without (see: all the direct input gaming modes on newer LCDs that bypass the scalers).
5) Not if they're tied to AMD hardware. They can enjoy a minor share of the dGPU graphics market as their TAM.
6) Uh, this is nonsense. FreeSync is still tied to Vsync in ways THIS review certainly doesn't cover in depth, and that's certainly not going to be a positive, since Vsync inherently adds latency. Meanwhile, Vsync is never enabled with G-Sync, and while there is more latency at the capped FPS, it is a driver-side cap, not Vsync.
7) Well, AMD can claim all they like that it goes as low as 9Hz, but as we have seen the implementation is FAR worse, falling apart below 40FPS with blurring and tearing; basically the image falls apart and everything you invested hundreds of dollars in becomes a huge waste. Meanwhile, G-Sync shows none of these issues, and I play some MMOs that regularly dip into the 20s in crowded cities with no sign of any of this.
So yes, as I've shown, there are still many issues with FreeSync that need to be addressed that show it is clearly not as good as G-Sync. But like I said, this is a good introduction to the tech that Nvidia invented some 18 months ago, maybe with another 18 months AMD will make more refinements and close the gap?
lordken - Thursday, March 19, 2015 - link
5) What? Where did you get that Adaptive Sync is tied to AMD HW? That's pretty bullshit; if it were, then it wouldn't be standardized by VESA, right? If today it is only AMD HW that can support it (because they implemented it first), that doesn't validate your claim that it is AMD-tied. Intel/Nvidia/... can implement it in their products if they want.
It is like saying that if, for example, LG released the first monitor to support DP1.3, that would imply DP1.3 is LG-tied, lol.
On other hand Gsync is Nvidia tied. But you know this right?
chizow - Thursday, March 19, 2015 - link
@lordken, who else supports FreeSync? No one but AMD. Those monitor makers can ONLY expect to get business from a minor share of the graphics market, given that is going to be the primary factor in paying the premium for one over a non-FreeSync monitor. This is a fact.

anubis44 - Tuesday, March 24, 2015 - link
VESA supports FreeSync, which means Intel will probably support it too. Intel graphics drive far more computers than AMD or nVidia, which means that if Intel does support it, nVidia is euchred, and even if Intel doesn't support it, many more gamers will choose free over paying an extra $150-$200 for a gaming setup. Between the 390-series coming out shortly and the almost guaranteed certainty that some hacked nVidia drivers will show up on the web to support FreeSync, G-Sync is a doomed technology. Period.

chizow - Tuesday, March 24, 2015 - link
Intel has no reason to support FreeSync, and they have shown no interest either. Hell, they showed more interest in Mantle, but as we all know, AMD denied them (so much for being the open, hands-across-the-globe company).

But yes, I'm hoping Nvidia does support Adaptive Sync as their low-end solution and keeps G-Sync as their premium solution. As we have seen, FreeSync just isn't good enough, but at the very least it means people will have even less reason to buy AMD if Nvidia supports both lower-end Adaptive Sync and premium G-Sync monitors.