I'd really like one of these, but I can't really justify $2000 because I know that in 6 months to a year competition will arrive that severely undercuts this price.
That's just technology in general. But keep an eye out: around that time this monitor is getting a revision that will remove the "gaming" features but still maintain the refresh rate and size.
The big omission to watch out for is the FALD backlight. Without that, HDR cannot be achieved outside of an OLED panel (and even then OLED cannot yet meet the peak luminance levels). You're going to see a lot of monitors that are effectively SDR panels with the brightness turned up, and sold as 'HDR'. If you're old enough to remember when HDTV was rolling out, remember the wave of budget 'HD' TVs that used SD panels but accepted and downsampled HD inputs? Same situation here.
Quantum dots increase the color gamut; HDR is about increasing the luminance range on screen at any time. Edge-lit displays only have a handful of dimming zones at most (no way to get more when your control consists of only 1 configurable value per row/column). You need backlighting where each small chunk of the screen can be controlled independently to get anything approaching a decent result. Per pixel is best, but only doable with OLED or jumbotron-size displays (MicroLED - we can barely make normal LEDs small enough for this scale). OTOH, if costs can be brought down, microLED does have the potential to power a FALD backlight with an order of magnitude or more dimming zones than current FALD LCDs offer; enough to largely make halo effects around bright objects a negligible issue.
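To put rough numbers on the dimming-zone point above: the 384-zone figure matches the FALD array discussed later in this thread, while the edge-lit and mini-LED counts are illustrative assumptions rather than specs of any particular product. A quick sketch of how coarse each option is on a 4K panel:

```python
# Rough granularity of different backlight schemes on a 4K panel.
# The 384-zone count is the FALD array discussed in this thread; the other
# zone counts are illustrative assumptions, not specs of any particular product.
PIXELS = 3840 * 2160

for name, zones in [
    ("edge-lit (assumed ~8 zones)", 8),
    ("FALD, 384 zones", 384),
    ("hypothetical mini-LED FALD", 5000),
    ("per-pixel (OLED / MicroLED)", PIXELS),
]:
    print(f"{name:30s} {zones:>9,} zones, ~{PIXELS / zones:,.0f} pixels per zone")
```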
I think you are exaggerating a bit. HDR is just a transform function. There are several standards that say what the peak luminance should be to be considered HDR10 or Dolby Vision, etc. But that itself is misleading.
Define "(and even then OLED cannot yet meet the peak luminance levels)". Because OLED can definitely reach 600+ nits, which is one of the levels in the standards being proposed for HDR.
Just A transform function? [Laughs in Hybrid Log Gamma]
Joking aside, HDR is also a set of minimum requirements. Claiming that panels that do not even come close to meeting those requirements are also HDR is akin to claiming that 720x480 is HD, because "it's just a resolution". The requirements range far beyond just peak luminance levels, which is why merely slapping a big-ass backlight onto a panel and claiming it is 'HDR' is nonsense.
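For readers wondering what the "transform function" being argued about actually is: HDR10 and Dolby Vision build on the SMPTE ST 2084 "PQ" EOTF, which maps a code value to an absolute luminance in nits (HLG uses a different, relative curve). A minimal sketch using the constants from the public ST 2084 spec:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized code value (0..1) -> absolute luminance in nits.
m1 = 2610 / 16384           # 0.1593017578125
m2 = 2523 / 4096 * 128      # 78.84375
c1 = 3424 / 4096            # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code: float) -> float:
    """Return luminance in cd/m^2 (nits) for a normalized PQ code value."""
    e = code ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(0.5))   # ~92 nits: half the code range is still fairly dim
print(pq_eotf(1.0))   # 10000 nits: the absolute peak the curve can describe
```

The curve itself says nothing about what a given panel can actually reproduce, which is why the minimum-requirements point above matters.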
" Just A transform function? [Laughs in Hybrid Log Gamma],"
And HLG is again just a standard of how to handle HDR and SDR. It is not required or needed to display HDR images.
"HDR is also a set of minimum requirements"
No, there are STANDARDS that attempt to address HDR features across products and in video production. But that in itself does not mean that violating those standards equates to a non-HDR image. Dolby Vision, for example, supports dynamic metadata. HDR10 does not. Does that make HDR10 NOT HDR? Eventually, the market and the industry will converge behind one or two SETS of standards (since it is not only about one number or feature). But we are not there yet. Far from it.
Since you like referencing these standards, you do know that VESA has HDR standards as low as 400 and 600 nits, right?
And I think you are conflating wide gamut with dynamic range. FALD is not needed to achieve wide gamut.
And using HD to illustrate your points shows you don't understand how standards work in broadcast and manufacturing.
"And HLG is again just a standard of how to handle HDR and SDR. It is not required or needed to display HDR images."
The joke was that there are already at least 3 standards of HDR transfer functions, and some (e.g. Dolby Vision) allow for on-the-fly modification of the transfer function.
"And I think you are conflating wide gamut with dynamic range. FALD is not needed to achieve wide gamut."
Nobody mentioned gamut. High Dynamic Range requires, as the name implies, a high dynamic range. LCD panels cannot achieve that high dynamic range on their own; they need a segmented backlight modulator to do so. As much as marketers would want you to believe otherwise, a straight LCD panel with an edge-lit backlight is not going to provide HDR.
"And using HD to illustrate your points shows you don't understand how standards work in broadcast and manufacturing."
Remember how "HD Ready" was brought in to address exactly the same problem of devices marketing capabilities they did not have? And how it brought complaints about allowing 720p devices to also advertise themselves as "HD Ready"? Is this not analogous to the current situation, where HDR is being erroneously applied to panels that cannot achieve it, and where VESA's DisplayHDR draws complaints that anything below DisplayHDR 1000 is basically worthless?
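To attach ballpark numbers to the claim that an LCD panel alone can't deliver HDR's dynamic range (the contrast and dimming figures below are generic assumptions, not measurements of this monitor):

```python
# Why a bare LCD needs a zoned backlight for HDR. All figures are ballpark assumptions.
native_contrast = 1000      # typical IPS panel, roughly 1000:1
peak_nits = 1000            # DisplayHDR 1000-class highlight target

# One uniform backlight bright enough for 1000-nit highlights lifts black to:
black_uniform = peak_nits / native_contrast        # ~1.0 nit -> grey, washed-out blacks

# If a local-dimming zone in a dark region can drop to ~5% backlight:
black_dimmed = 0.05 * peak_nits / native_contrast  # ~0.05 nits
print(black_uniform, black_dimmed, peak_nits / black_dimmed)  # ~20,000:1 between zones
```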
Peak luminance levels are overblown because they're easily quantifiable. In reality, if you've ever seen a recent LG TV, which can hit about 900 nits peak, that is already too much. https://www.rtings.com/tv/reviews/lg/c8
It's actually almost painful.
That said, I agree OLED is the way to go. I wasn't impressed by any LCD (FALD or not) personally. It doesn't matter how bright the display gets if it can't highlight stars in a night sky etc. without significant blooming.
Even 1000 nits is too much for me. The idea of 4000 is absurd. Yes, sunlight is way brighter, but we don't frequently change scenes from night time to day like television shows do. It's extremely jarring. Unless you like the feeling of being woken up repeatedly in the middle of the night by a flood light. It's a hard pass.
I wouldn't be so sure. Not for G-Sync, at least. AU Optronics is the only panel producer for monitor-sized displays that even gives a flip about pushing lots of high refresh rate options on the market. A 2560x1440 144Hz monitor from 3 years ago still costs just as much today (if not more, due to upcoming China-to-US import tariffs, starting with 10% on October 1st 2018, and another 15% (25% total) on January 1st 2019).
High refresh rate G-Sync isn't set to come down anytime soon, not as long as Nvidia has a stranglehold on the GPU market and not as long as AU Optronics is the only panel manufacturer that cares about high refresh rate PC monitor displays.
Japan Display plans to change that in 2019. IIRC Asus is planning to use their displays for a portable Professional OLED monitor.
I would not be surprised if they or LG create OLED gaming monitors based on Japan Display panels in 2020; that would be a win-win for gamers, Japan Display, and monitor manufacturers.
Alternatively, they could surprise us with MLED (MicroLED) monitors, which Japan Display has also invested in, alongside Samsung and LG.
That's way better to me than any Nano-IPS/QLED monitor. They simply cannot compete.
I would GLADLY pay the premium over the $600-1,000 alternatives IF I thought I was really going to take advantage of what the display offers in the next 2 or even 4 years... But that's the issue. I'm trying to move away from SLI/CF (2x R9 290 atm, about to purchase some sort of 2080), not force myself back into it.
You're gonna need SLI RTX 2080s (Ti or not) to really eke out frame rates fast enough for the refresh rate to matter at 4K; chances are it'll be the same with the next gen of cards unless AMD pulls a rabbit out of a hat and quickly gets a lot more competitive. That's 2-3 years easy where SLI would be a requirement.
HDR support seems to be just as much of a mess... I'll probably just end up with a 32" 4K display (because I'm yearning for something larger than my single 16:10 24" and that approaches the 3x 24" setup I've used at times)... But if I wanted to try a fast refresh rate display I'd just plop down a 27" 1440p 165Hz next to it.
Nate's conclusion is exactly the mental calculus I've been doing, those two displays are still less money than one of these and probably more useful in the long run as secondary displays or hand me down options... As awesome as these G-Sync HDR displays may be, the vendor lock in around G-Sync and active cooling makes em seem like poor investments.
Good displays should last 5+ years easy IMO, I'm not sure these would still be the best solution in 3 years.
Grab yourself an inexpensive 32" 4K display; decent ones are ~$400 these days. I have an LG and it's great all around (I'm a gamer btw). It's not quite high end but it's not a low end display either - it compares pretty favorably to my Dell 27" 2K monitor. I just couldn't see bothering with HDR or any of that other $$$ BS at this point, plus I'm not particularly bothered by screen tearing and I don't demand 100+ FPS from games. Not sure why people are all in a tizzy about super high FPS; as long as the game runs smoothly I am happy.
SLI & Crossfire are succeeded by DX12's & Vulkan's explicit multi-GPU mode. Nvidia even deliberately ported NVLink (which succeeds classic SLI) from Quadro+ cards to RTX cards, but without memory pooling, because DX12 & Vulkan already provide that for GPUs.
Devs have to use DX12 or Vulkan and support such features, which is easier for them to consider now that Windows 8 mainstream support is over and ray tracing is available only on DX12 & Vulkan.
My FW-900 is also dead in my closet; I'm hoping to resurrect it one day. Right now I've got another excellent CRT monitor I found to game on, a Sony Multiscan E540. It's not as big, but god damn is it smooth and flawless.
By the time this CRT dies, hopefully LCD tech won't be such garbage, relying on workarounds for a technology that's inherently inferior for gaming.
Yeah, I've moved on to a Sony C520K for the last ~2 years. In use I think it's far better than my FW900 was in terms of contrast/color plus I'm able to push slightly higher refresh rates, but it's not widescreen. I have bought a couple expensive G-Sync displays, hoping for an adequate replacement, but ended up returning them. I'm really hoping that this CRT lasts until MicroLED hits the market and that mLED can truly combine the best attributes of LCD and OLED without any of the drawbacks.
I don't get the point. You don't need these features for desktop work, so they are purely gaming features. Why not get an equally capable OLED/QLED at a much bigger size for less money?
TVs don't support native high refresh rates from external sources the way monitors do (I think LG's do, but only from USB sources or something like that), nor adaptive refresh rates. It's a gaming monitor, so it has gaming-specific features.
You answered your own question: because it's a gaming monitor. You can't find one like this (yet) that offers all the things it does.
You, like many people on this website, are confusing TVs with monitors. A TV of equal size and the same resolution is not the same as a dedicated monitor. An LG OLED 55 inch TV looks pretty bland once you've used a PC monitor for gaming.
Because I use the same system for gaming and general desktop use. My main display is my biggest and best monitor and is thus used for both. At some hypothetical point, if I had a pair of high-end displays as both my center display and one of my side displays, using different ones for gaming and for desktop work might be an option. But because I'd still be using the other as a secondary display rather than switching it off and ignoring it, I'd still probably want my main screen to be the center one for both roles, with secondaries to either side; so I'd probably still want the same screen for both. If I were to end up with both a 4K display and an ultrawide - in which case the best one to game on would vary by title - it might become a moot point. (Or I could go four screens with small ones on each side and two copies of my chat app open, I suppose.)
"Why not get an equally capable OLED/QLED at a much bigger size for less money ?"
Because there are no feature equivalent devices.
TVs do not actually accept an update rate of 120Hz; they will operate at 60Hz and either just do double-pulse backlighting or add their own internal interpolation. QLED 'HDR' desktop monitors lack the FALD backlight, so they are not HDR monitors (just SDR panels that accept and compress an HDR signal).
A small subset of TVs actually do support native 120Hz inputs, but so far I've only seen that supported at 1080p due to HDMI bandwidth limitations.
For a while it was just a few specific Sony models that supported proper 1080p120, but all the 2017/2018 LG OLEDs do now, as do some of the higher-end Vizios and a few others.
LG OLED TVs accept 120Hz (and actually display true 120 FPS), but at a lower 1080p resolution. They also do 4K/60 of course. Not a great substitute though. If I were spending so much on a monitor I would demand it be OLED. Otherwise I'm spending $1500-1600 more than a 1440p monitor just to get 4K. I mean, cool? But why not go $2000-2500 more and get something actually unique, a 4K 144Hz OLED HDR monitor that will be useful for 10 years or so.
This thing will be obsolete the second OLED monitors come out. There simply is no comparison.
Regardless of all the technical refresh rate limitations already pointed out, not everyone wants to go that big. 40" is already kinda huge for a desktop display; anything larger takes over the desk, makes it tough to have any side displays, and forces a lot more window management, which just isn't optimal for people who use their PC for more than gaming.
I'd rather have a 1440p 165Hz 27" and a 4K 32" on monitor arms than a single 4K 50"+ display with a lower DPI than even my old 1920x1200 24"...
To be fair, most would rather have a 4K ultrawide (LG) or a 1440p ultrawide than multiple displays or a TV.
5K is an exception, since the extra room for controls in video work and the like is, for some, a worthwhile trade against the productivity convenience of the extra horizontal real estate an ultrawide provides.
Most enthusiasts are waiting for HDMI 2.1 to upgrade, so this monitor & current TVs this year are DOA.
This is nearly perfect, but still way overpriced for what it is. I'd like to get something similar but at 32'' size. 100Hz would be enough, and I don't need this fancy, useless stand with the holographic projection if the price can be much cheaper. Let it be factory calibrated and good enough for a bit of Photoshop and also games. All at $1200 max. I wonder how long we have to wait for something like that.
Forgot to mention it would obviously be 4K. The closest appears to be the BenQ PD3200U, but it is only a 60Hz monitor and a 2017 model. I would want something newer and with 100Hz.
To be fair, for a mix of Photoshop and games at that screen res, 60Hz would probably be enough, since you can't push modern games that high for the MOST part. I have that Acer Z35P 120Hz monitor, and even with a 1080 Ti it's hard pressed to max it out in lots of games. That is at 3440x1440.
My 2nd "work" monitor next to it is the awesome Dell U3818DW (38 inches) @ 3840x1600 60Hz. I actually prefer the Dell for strategy games because of its size, since FPS is not as huge a concern there.
But playing PUBG on the Acer at 120Hz will get 80-90fps.
The Acer Predator 32" has a similar panel to that BenQ and adds G-Sync, tho still at a max of 60Hz. It's not as well calibrated out of the box (and has a worse stand and controls), but it has dropped in price a couple times to the same as the BenQ... I've been cross-shopping them for a while, because 2 grand for a display whose features I may or may not be able to leverage in the next 3 years seems dubious.
I wanted to go 32" too because the 27" 1440p doesn't seem like enough of a jump from my 24" 1920x1200 (being 16:10 it's nearly as tall as the 16:9 27"ers), and I had three of those which we occasionally used in Eyefinity mode (making a ~40" display). I've looked at 40-43" displays but they're all lacking compared to the smaller stuff (newer ones are all VA too, mostly Philips and one Dell).
I use my PC for photo editing as much as PC gaming but I'm not a pro so a decent IPS screen that I can calibrate reasonably well would satisfy my photo needs.
It is "almost" perfect. It is missing one of the most important things, HDMI 2.1, which has the bandwidth to actually feed the panel with what it is capable of doing (i.e. 4K HDR 4:4:4 120Hz). But we don't have that because this monitor was actually designed 3 years ago and is only now finally coming to market, 6 months after HDMI 2.1 was released.
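A rough link-budget check (blanking overhead ignored, so real requirements run a bit higher) shows why DP1.4 can't carry full 4K 120Hz 10-bit 4:4:4 and why HDMI 2.1 could:

```python
# Back-of-the-envelope bandwidth for 4K HDR gaming, ignoring blanking intervals.
width, height, refresh = 3840, 2160, 120
bpp_444 = 30   # 10 bits per channel, RGB / 4:4:4
bpp_422 = 20   # 4:2:2 halves the two chroma channels

need_444 = width * height * refresh * bpp_444 / 1e9   # ~29.9 Gbit/s
need_422 = width * height * refresh * bpp_422 / 1e9   # ~19.9 Gbit/s

dp14_payload = 25.92    # DP1.3/1.4 HBR3: 32.4 Gbit/s raw minus 8b/10b coding
hdmi21_payload = 42.6   # HDMI 2.1 FRL: 48 Gbit/s raw minus 16b/18b coding (approx.)

print(f"4:4:4 needs ~{need_444:.1f} Gbit/s, 4:2:2 needs ~{need_422:.1f} Gbit/s")
print(f"DP1.4 carries ~{dp14_payload} Gbit/s, HDMI 2.1 ~{hdmi21_payload} Gbit/s")
```

So full-chroma 10-bit at 120Hz overshoots DP1.4 even before blanking is counted, which is why the subsampling compromise comes up further down the thread.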
The 35 inch one has been canceled, FYI. An Asus rep told me when I inquired about it just a week ago, unless something has changed in the past week. The reason given was that the panel isn't yet ready to mass produce.
That said, it's not a big loss, even if it's disappointing, because HDR is silly tech, so you can skip this generation.
I bought one of these just as they came out. Amazing display performance, but the built-in fan to cool the G-Sync HDR module killed it for me.
It's one of those noisy 40mm fans, which were otherwise banned from PC setups over a decade ago. It made more noise than the entirety of the rest of my 1080 Ti-SLI system combined. Like a wasp was loose in my room all the time. Completely unbearable to listen to.
I tried to return the monitor as RMA, as I thought that couldn't be right. But it could, said the retailer. At which point I chose to simply return the unit.
In my case, these things will have to wait till nVidia makes a new G-Sync HDR module which doesn't require active cooling. Plain and simple. I'm sort of guessing that'll fall in line with the availability of micro-LED displays. Which will hopefully also be much cheaper than the ridiculously expensive FALD panels in these monitors.
Aye. The FALD array puts out plenty of heat, but it's distributed, so it can be dissipated over a large area. The FPGA for controlling G-Sync HDR generates much less heat, but it's concentrated. So passive cooling would seem to be non-viable here.
Yeah, nVidia's DP1.4 VRR solution is bafflingly poor/non-competitive, and not just due to the requirement for active cooling.
nVidia's DP1.4 g-sync module is speculated to contribute a lot to the monitor's price (FPGA alone is estimated to be ~ $500). If true, I just don't see how g-sync isn't on a path towards extinction. That simply isn't a price premium over FreeSync that the consumer market will accept.
If g-sync isn't at least somewhat widespread and (via customer lock in) helping nVidia sell more g-sync enabled GPUs, then g-sync also isn't serving any role for nVidia. They might as well drop it and go with VESA's VRR standard.
So, although I'm actually thinking of shelling out $2000 for a monitor, I don't want to invest in technology that seems to have priced itself out of the market and is bound to become irrelevant.
Maybe you could shed some light on where nVidia is going with their latest g-sync solution? At least for now it doesn't seem viable.
How would anyone outside of NV know where they're going with this tho? I imagine it does help sell more hardware to one extent or another (be it GPUs, FPGAs to display makers, or a combination of profits thru the side deals) AND they'll stay the course as long as AMD isn't competitive at the high end...
Just the sad reality. I just bought a G-Sync display, but it wasn't one of these or even $1K, and it's still a nice display regardless of whether it has G-Sync or not. I don't intend to pay this kinda premium without a clear path forward either, but I guess plenty of people are, or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones.
"How would anyone outside of NV know where they're going with this tho?"
Anandtech could talk with their contacts at nVidia, discuss the situation with monitor OEMs, or take any one of a dozen other approaches. Anandtech does a lot of good market research and analysis. There is no reason they can't do that here too. If Anandtech confronted nVidia with the concern of DP1.4 g-sync being priced into irrelevancy, they would surely get some response.
"I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones."
You're mistakenly assuming the DP1.2 g-sync is in any way comparable to DP1.4 g-sync. It's not.
First, nobody sells plenty of g-sync monitors. The $200 price premium over FreeSync has made g-sync monitors (comparatively) low volume niche products. For DP1.4 that premium goes up to over $500. There is no way that will fly in a market where the entire product typically sells for less than $500. This is made worse by the fact that ONLY DP1.4 supports HDR. That means even a measly DisplayHDR 400 monitor, which will soon retail for around $400, will cost at least $900 if you want it with g-sync.
Almost nobody, for whom price is even a little bit of an issue, will pay that.
While DP1.2 g-sync monitors were niche products, DP1.4 g-sync monitors will be irrelevant products (in terms of market penetration). Acer's and Asus' $2000 monitors aren't and will not sell in significant numbers. Nothing using nVidia's DP1.4 g-sync module will.
To be clear, this isn't a rant about price. It's a rant about strategy. The whole point of g-sync is customer lock-in. Nobody, not even nVidia, earns anything selling g-sync hardware. For nVidia, the potential of g-sync is only realized when a person with a g-sync monitor who would otherwise have bought an AMD card upgrades to a new nVidia card instead. If DP1.4 g-sync isn't adopted in at least somewhat meaningful numbers, g-sync loses its purpose. That is when I'd expect nVidia to either trash g-sync and start supporting FreeSync, OR build a better g-sync module without the insanely expensive FPGA.
Neither of those two scenarios motivates me to buy a $2000 g-sync monitor today. That's the problem.
If I'm spending $2000 on a g-sync monitor today, I'd like some reassurance that g-sync will still be relevant and supported three years from now.
For the reasons mentioned, from where I stand, g-sync looks like "dead technology walking". With DP1.4 it's priced itself out of the market. I'm sure many would appreciate some background on where nVidia is going with this...
Nvidia's solution is objectively better, aside from not being open. Similarly, NVLink is better hardware-wise than any other multi-GPU interconnect.
With HDMI 2.1, Nvidia will likely support it unless it's simply underwhelming.
Once standards catch up, Nvidia hasn't been afraid to deprecate its own previous efforts somewhat while continuing to support them for widespread compatibility/loyalty, or to take a balanced approach (i.e. NVLink for GeForce cards, but delegating memory pooling to DX12 & Vulkan).
If NVidia started supporting standard adaptive sync at the same time that would be great... Pipe dream I know. Things like G-Sync vs Freesync, fans inside displays, and dubious HDR support don't inspire much confidence in these new displays. I'd gladly drop the two grand if I *knew* this was the way forward and would easily last me 5+ years, but I dunno if that would really pan out.
I'm pretty disappointed that a gaming monitor at this price still has only 8 bits of native color resolution (plus FRC, I know).
Compare this to the ASUS PA32UC which – while not mainly targeted at gamers – has 10 bits, no fan noise, is 5 inches bigger (32" total) and has many more inputs (including USB-C DP). For about the same price.
Any game that supports HDR uses 10 bpp natively. In fact, many games use 10 bpp internally even if they don't support HDR officially.
That's why an HDR monitor must support the HDR10 video signal (that's the only way to get the 10 bpp frame from the GPU to the monitor).
OTOH, a 10-bit panel for gaming typically won't provide a perceptible improvement. In practice, 8-bit+FRC is just as good. IMHO it's only for editing HDR still imagery that real 10-bit panels provide benefits.
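A toy sketch of what 8-bit+FRC (temporal dithering) actually does, for anyone wondering how it approximates a 10-bit panel; the 4-frame cycle used here is an assumption for illustration:

```python
# Toy FRC model: approximate a 10-bit level on an 8-bit panel by alternating
# between the two nearest 8-bit levels across a short cycle of frames.
def frc_frames(level_10bit: int, frames: int = 4) -> list:
    base, remainder = divmod(level_10bit, 4)   # 10-bit value = 8-bit value * 4 + remainder
    # Drive 'remainder' out of every 4 frames one 8-bit step higher.
    return [min(base + (1 if f < remainder else 0), 255) for f in range(frames)]

seq = frc_frames(513)              # 10-bit level 513 sits between 8-bit 128 and 129
print(seq, sum(seq) / len(seq))    # [129, 128, 128, 128] 128.25 -> the eye averages the flicker
```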
I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this, where the bandwidth is insufficient to have full resolution *and* colour depth *and* refresh rate at once?
You run the risk of banding or flicker, but frankly that's similar to display-side FRC, and I imagine if the screen was aware of what was happening it might be able to smooth it out. It'd essentially improve the refresh rate of the signal at the expense of some precise accuracy. Which some gamers might well be willing to take. Of course that's all moot if the card can't even play the game at the target refresh rate.
By client, of course, I mean card - it would send an 8-bit signal within the HDR colour gamut and the result would be a frequency-interpolated output hopefully similar to that possible now - but by restricting at the graphics card end you use less bandwidth, and hopefully it doesn't take too much power.
"I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this"
It's an interesting idea, but I don't think it can work.
The core problem is that the monitor then has no way of knowing whether, in such an FRC'd image, a bright pixel next to a darker pixel correctly describes the desired content, or whether it's just an FRC artifact.
Two neighboring pixels of varying luminance affect everything from how to control the individual LEDs in a FALD backlight, to when and how strongly to overdrive pixels to reduce motion blur. You can't do these things in the same way (or at all) if the luminance delta is merely an FRC artifact.
As a result, the GPU would have to control everything that is currently handled by the monitor's controller + firmware, because only it has access to the original 10 bpp image. That would be counterproductive, because then you'd also have to transport all the signaling information (for the monitor's backlighting and pixels) from the GPU to the monitor, which would require far more bandwidth than the 2 bpp you set out to save 😕
What you're thinking about is essentially a compression scheme to save bandwidth. Even if it did work, employing FRC in this way is lossy and nets you, at best, a 20% bandwidth reduction.
However, the DP1.4(a) standard already defines a compression scheme. DSC is lossless and nets you about 30%. That would be the way to do what you're thinking of.
Particularly 4k DP1.4 gaming monitors are in dire need of this. That nVidia and Acer/Asus would implement chroma subsampling 4:2:2 (which is also a lossy compression scheme) rather than DSC is shameful. 😳
I wonder if nVidia's newest $500+ g-sync module is even capable of DSC. I suspect it is not.
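For scale, here is the per-pixel bandwidth of the options argued about in this sub-thread, for a 10-bit source frame; the DSC row assumes a 12 bpp target, which is an assumption on my part since the spec allows a range of rates:

```python
# Bits per pixel for a 10-bit source under the schemes discussed above.
options = [
    ("4:4:4, 10-bit (uncompressed)", 30),
    ("4:2:2, 10-bit (chroma subsampled)", 20),
    ("8-bit + FRC (idea floated above)", 24),   # the ~20% saving mentioned above
    ("DSC at an assumed 12 bpp target", 12),
]
for name, bpp in options:
    print(f"{name:36s} {bpp:2d} bpp  ({bpp / 30:.0%} of uncompressed)")
```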
DSC is not lossless; it's "visually lossless", which means that most of the time you shouldn't perceive a difference compared to an uncompressed stream. I'll reserve my judgement until I see some implementations.
That Asus PA32UC wouldn't get you G-Sync or refresh rates over 60Hz and it's still $975 tho... It sucks that the display market is so fractured and people who use their PCs for gaming as well as content creation can't get anything approaching perfect or even ideal at times.
There's a few 4K 32" displays with G-Sync or Freesync, but they don't go past 60-95Hz AFAIK, and then you don't get HDR. It's all a compromise, and has been for years due to competing adaptive sync standards, lagging connection standards, a lagging GPU market, etc etc.
Eventually, but not soon. AUO is the only panel company working on 4K/high refresh/HDR, and they don't have anything with more dimming zones on their public road map (which nominally runs about a year out for panel production; add a few months to that for monitor makers to package the panels and get them to retail once volume production starts).
AUO have stated it lands this fall, so it should be very soon. They made it sound like they will have a shipping monitor by the end of 2018, although who really knows; but I'm sure it's under a year away at this point.
You can Google: "AUO Expects to Launch Mini LED Gaming Monitor in 2H18"
384 zones is just CRAP; you only find that number of zones on low-end cheapo TVs with FALD added just to be a bit more "premium". For this price it should have 1000 AT LEAST.
Seems we will need to wait for LCDs with mini-LEDs to actually start seeing monitors with 5000+ zones.
Consoles started to push that 4K / HDR nonsense and now the monopoly provides a monitor to match for the more-money-than-sense crowd. The obscure but sensible strobing backlight / ULMB got sacrificed for the blasted buzzwords and G-Sync. Is it because the panel is barely fast enough for G-Sync, or is it a general shift in direction, doubling down on proprietary G-stink and the ridiculously superfluous 4K native? Is it because, with VR failing, high frame rates are off the table completely? Is there any mention of how 1920x1080 looks on that monitor (too bad), because the native pixel density is decidedly useless and non-standard. But scaled down to half, it could be 81.5 ppi and this thing could actually be used to read text.
"the most desired and visible aspects of modern gaming monitors: ultra high resolution (4K)" No it's not. At least on Windows where UI scaling still sucks. At least on "slow" graphics card like 2080 Ti where 4K doesn't run 144fps. And 4K monitors can't do 1440p natively, so a huge deal breaker.
No, I don't care about scaling in games, but I do care about 144fps in games. That's only possible at 4K with SLI 2080 Tis and good SLI support in the game, which plenty of games don't have. There are also plenty of 3rd-party apps that don't correct Windows' bullshit, and I gain no extra working space if I do scale. Simply too many downsides and too little benefit.
I do photo editing, and would like to start doing 10-bit HDR video work (as a passionate hobby, not a profession), but I would also like a G-Sync monitor for gaming.
Would this or the upcoming 35" model be a good fit? I can't tell from the review whether the color space support is adequate for the editing work I do.
That's really bad, as there is no product that supports DP 2.0, so we're stuck with DP 1.4 and its fake 144Hz. So all monitors and TVs are obsolete... Only laptops with 144, 240, or 300Hz panels have a real refresh rate, as there is no HDMI or DP involved. But nobody talks about eDP (embedded DisplayPort), which is what laptops use, and which in reality is only supported up to 120Hz. So how on earth are they making laptops with 144Hz, 240Hz, or even 300Hz???
Flunk - Tuesday, October 2, 2018 - link
I'd really like one of these, but I can't really justify $2000 because I know that in 6-months to a year competition will arrive that severely undercuts this price.imaheadcase - Tuesday, October 2, 2018 - link
That's just technology in general. But keep a eye out, around that time this monitor is coming out with a revision that will remove the "gaming" features" but still maintain refresh rate and size.edzieba - Tuesday, October 2, 2018 - link
The big omission to watch out for is the FALD backlight. Without that, HDR cannot be achieved outside of an OLED panel (and even then OLED cannot yet meet the peak luminance levels). You;re going to see a lot of monitors that are effectively SDR panels with the brightness turned up, and sold as 'HDR'. If you're old enough to remember when HDTV was rolling uout, remember the wave of budget 'HD' TVs that used SD panels but accepted and downsampled HD inputs? Same situation here.Hixbot - Tuesday, October 2, 2018 - link
Pretty sure edgelit displays can hit the higher gamut by using a quantom dot filter.DanNeely - Tuesday, October 2, 2018 - link
quantum dots increase the color gamut, HDR is about increasing the luminescence range on screen at any time. Edge lit displays only have a handful of dimming zones at most (no way to get more when your control consists of only 1 configurable value per row/column). You need back lighting where each small chunk of the screen can be controlled independently to get anything approaching a decent result. Per pixel is best, but only doable with OLED or jumbotron size displays. (MicroLED - we can barely make normal LEDs small enough for this scale.) OTOH if costs can be brought down microLED does have the potential to power a FALD backlight with an order of magnitude or more more dimming zones than current models LCD can do; enough to largely make halo effects around bright objects a negligible issue.Lolimaster - Tuesday, October 2, 2018 - link
There is also miniled that will replace regular led for the backlight.Microled = OLED competition
Miniled up to 50,000zones (cheap "premium phones" will come with 48zones).
crimsonson - Tuesday, October 2, 2018 - link
I think you are exaggerating a bit. HDR is just a transform function. There are several standards that say what the peak luminance should be to considered HDR10 or Dolby Vision etc. But that itself is misleading.Define " (and even then OLED cannot yet meet the peak luminance levels)"
Because OLED can def reach 600+ nits, which is one of the standards for HDR being proposed.
edzieba - Tuesday, October 2, 2018 - link
"HDR is just a transform function"Just A transform function? [Laughs in Hybrid Log Gamma],
Joking aside, HDR is also a set of minimum requirements. Claiming panels that do not even come close to meeting those requirements are also HDR is akin to claiming that 720x468 is HD, because "it's just a resolution". The requirements range far beyond just peak luminance levels, which is why merely slapping a big-ass backlight to a panel and claiming it is 'HDR' is nonsense.
crimsonson - Wednesday, October 3, 2018 - link
"Just A transform function? [Laughs in Hybrid Log Gamma],"
And HLG is again just a standard of how to handle HDR and SDR. It is not required or needed to display HDR images.
"HDR is also a set of minimum requirements"
No, there are STANDARDS that attempts to address HDR features across products and in video production. But in itself does not mean violating those standards equate to a non-HDR image. Dolby Vision, for example, supports dynamic metadata. HDR10 does not. Does that make HDR10 NOT HDR?
Eventually, the market and the industry to congregate behind 1 or 2 SET of standards (since it is not only about 1 number or feature). But we are not there yet. Far from it.
Since you like referencing these standards, you do know that Vesa has HDR standards as low as 400 and 600 nits right?
And I think you are conflating wide gamut vs Dynamic Range. FALD is not needed to achieve wide gamut.
And using HD to illustrate your points exemplifies you don't understand how standards work in broadcast and manufacturing.
edzieba - Thursday, October 4, 2018 - link
"And HLG is again just a standard of how to handle HDR and SDR. It is not required or needed to display HDR images."The joke was that there are already at least 3 standards of HDR transfer functions, and some (e.g. Dolby Vision) allow for on the fly modification of the transfer function.
"And I think you are conflating wide gamut vs Dynamic Range. FALD is not needed to achieve wide gamut."
Nobody mentioned gamut. High Dynamic Range requires, as the name implies, a high dynamic range. LCD panels cannot achieve that high dynamic range on their own, they need a segmented backlight modulator to do so.
As much as marketers would want you to believe otherwise, a straight LCD panel with an edge-lit backlight is not going to provide HDR.
"And using HD to illustrate your points exemplifies you don't understand how standards work in broadcast and manufacturing."
Remember how "HD ready" was brought in to address exactly the same problem of devices marketing capabilities they did not have? And how it brought complaints about allowing 720p devices to also advertise themselves as "HD Ready"? Is this not analogous to the current situation where HDR is being erroneously applied to panels that cannot achieve it, and how VESA's DisplayHDR has complaints that anything below Display HDR1000 is basically worthless?
lilkwarrior - Monday, October 8, 2018 - link
OLED isn't covered by VESA HDR standards; it's far superior picture quality & contrast.QLED cannot compete with OLED at all in such things. I would very much get a Dolby Vision OLED monitor than a LED monitor with a HDR 1000 rating.
Lolimaster - Tuesday, October 2, 2018 - link
You can't even call HDR with a pathetic low contrast IPS.resiroth - Monday, October 8, 2018 - link
Peak luminance levels are overblown because they’re easily quantifiable. In reality, if you’ve ever seen a recent LG TV which can hit about 900 nits peak that is too much. https://www.rtings.com/tv/reviews/lg/c8It’s actually almost painful.
That said I agree oled is the way to go. I wasn’t impressed by any LCD (FALD or not) personally. It doesn’t matter how bright the display gets if it can’t highlight stars on a night sky etc. without significant blooming.
Even 1000 bits is too much for me. The idea of 4000 is absurd. Yes, sunlight is way brighter, but we don’t frequently change scenes from night time to day like television shows do. It’s extremely jarring. Unless you like the feeling of being woken up repeatedly in the middle of the night by a flood light. It’s a hard pass.
Hxx - Saturday, October 6, 2018 - link
the only competition is Acer which costs the same. If you want Gsync you have to pony up otherwise yeah there are much cheaper alternatives.Hixbot - Tuesday, October 2, 2018 - link
Careful with this one, the "whistles" in the article title is referring to the built-in fan whine. Seriously, look at the newegg reviews.JoeyJoJo123 - Tuesday, October 2, 2018 - link
"because I know"I wouldn't be so sure. Not for Gsync, at least. AU Optronics is the only panel producer for monitor sized displays that even gives a flip about pushing lots of high refresh rate options on the market. A 2560x1440 144hz monitor 3 years ago still costs just as much today (if not more, due to upcoming China-to-US import tariffs, starting with 10% on October 1st 2018, and another 15% (total 25%) in January 1st 2019.
High refresh rate GSync isn't set to come down anytime soon, not as long as Nvidia has a stranglehold on GPU market and not as long as AU Optronics is the only panel manufacturer that cares about high refresh rate PC monitor displays.
lilkwarrior - Monday, October 8, 2018 - link
Japan Display plans to change that in 2019. IIRC Asus is planning to use their displays for a portable Professional OLED monitor.I would not be surprised they or LG created OLED gaming monitors from Japan Display that's a win-win for gamers, Japan Display, & monitor manufacturers in 2020.
Alternatively they surprise us with MLED monitors that Japan Display also invested in + Samsung & LG.
That's way better to me than any Nano-IPS/QLED monitor. They simply cannot compete.
Impulses - Tuesday, October 2, 2018 - link
I would GLADLY pay the premium over the $600-1,000 alternatives IF I thought I was really going to take advantage of what the display offers in the next 2 or even 4 years... But that's the issue. I'm trying to move away from SLI/CF (2x R9 290 atm, about to purchase some sort of 2080), not force myself back into it.You're gonna need SLI RTX 2080s (Ti or not) to really eke out frame rates fast enough for the refresh rate to matter at 4K, chances are it'll be the same with the next gen of cards unless AMD pulls a rabbit out of a hat and quickly gets a lot more competitive. That's 2-3 years easy where SLI would be a requirement.
HDR support seems to be just as much of a mess... I'll probably just end up with a 32" 4K display (because I'm yearning for something larger than my single 16:10 24" and that approaches the 3x 24" setup I've used at times)... But if I wanted to try a fast refresh rate display I'd just plop down a 27" 1440p 165Hz next to it.
Nate's conclusion is exactly the mental calculus I've been doing, those two displays are still less money than one of these and probably more useful in the long run as secondary displays or hand me down options... As awesome as these G-Sync HDR displays may be, the vendor lock in around G-Sync and active cooling makes em seem like poor investments.
Good displays should last 5+ years easy IMO, I'm not sure these would still be the best solution in 3 years.
Icehawk - Wednesday, October 3, 2018 - link
Grab yourself an inexpensive 32" 4k display, decent ones are ~$400 these days. I have an LG and it's great all around (I'm a gamer btw), it's not quite high end but it's not a low end display either - it compares pretty favorably to my Dell 27" 2k monitor. I just couldn't see bothering with HDR or any of that other $$$ BS at this point, plus I'm not particularly bothered by screen tearing and I don't demand 100+ FPS from games. Not sure why people are all in a tizzy about super high FPS, as long as the game runs smoothly I am happy.WasHopingForAnHonestReview - Saturday, October 6, 2018 - link
You dont belong here, plebian.lilkwarrior - Monday, October 8, 2018 - link
SLI & Crossfire are succeeded by DX12's & Vulkan's explicit multi-GPU mode. Nvidia deleiberately even ported NVLINK (succeeds classic SLI) to RTX cards from Quadro+ cards but without memory pooling because DX12 & Vulkan already provides that for GPUs.Devs have to use DX12 or Vulkan and support such features that is easier for them to consider now that Windows 8 mainstream support is over + ray-tracing that's available only on DX12 & Vulkan.
nathanddrews - Wednesday, October 3, 2018 - link
Still cheaper than my Sony FW-900 CRT was when it was brand new! LOLHixbot - Wednesday, October 3, 2018 - link
Still not better than a fw-900 in many ways. This LCD doesn't have a strobing feature to reduce eye tracking motion blur.nathanddrews - Wednesday, October 3, 2018 - link
No argument there, but my FW900 died, so the options are few...Crazyeyeskillah - Wednesday, October 3, 2018 - link
my fw-900 is also dead in my closet, hoping of resurrecting it one day. Right now I got another excellent crt monitor I found to game on: Sony Multiscan E540 : It's not as big, but god damn is it smooth and flawless.By the time this crt dies, hopefully LCD tech won't be such garbage trying to make workaround for its inferior tech for gaming.
nathanddrews - Thursday, October 4, 2018 - link
Yeah, I've moved on to a Sony C520K for the last ~2 years. In use I think it's far better than my FW900 was in terms of contrast/color plus I'm able to push slightly higher refresh rates, but it's not widescreen. I have bought a couple expensive G-Sync displays, hoping for an adequate replacement, but ended up returning them. I'm really hoping that this CRT lasts until MicroLED hits the market and that mLED can truly combine the best attributes of LCD and OLED without any of the drawbacks.Ironchef3500 - Wednesday, October 3, 2018 - link
100%Tunnah - Tuesday, October 2, 2018 - link
I don't get the point. You don't need the features for desktop work, so this is purely a gaming feature. Why not get an equally capable OLED/QLED at a much bigger size for less money ?Inteli - Tuesday, October 2, 2018 - link
TVs don't support native high refresh rates from sources like monitors do (I think LG's does, but only from USB sources or something like that) or adaptive refresh rates. It's a gaming monitor, so it has gaming-specific features.imaheadcase - Tuesday, October 2, 2018 - link
You answered own question, because its a gaming monitor. You can't find one like this (yet) that offers all the things it does.You like many people are confused on this website about TV vs monitors. A TV equal size, same resolution, is not the same as a dedicated monitor. A LG OLED 55 inch TV looks pretty bland when you use a PC monitor for gaming.
DanNeely - Tuesday, October 2, 2018 - link
Because I use the same system for gaming and general desktop use. My main display is my biggest and best monitor and thus used for both. At some hypothetical point in time if I had a pair of high end displays as my both my center and as one of my side displays having different ones as my gaming and desktop use might be an option. But because I'd still be using the other as a secondary display not switching off/absolutely ignoring it, I'd still probably want my main screen to be the center one for both roles so I'd have secondaries to either side; so I'd probably still want the same for both. If I were to end up with both a 4k display and an ultrawide - in which case the best one to game on would vary by title it might become a moot point. (Or I could go 4 screens with small ones on each side and 2 copies of my chat app open I suppose.)Impulses - Tuesday, October 2, 2018 - link
Still using the 32" Predator?edzieba - Tuesday, October 2, 2018 - link
"Why not get an equally capable OLED/QLED at a much bigger size for less money ?"Because there are no feature equivalent devices.
TVs do not actually accept an update rate of 120Hz, they will operate at 60Hz and either just do double-pulse backlighting or add their own internal interpolation. QLED 'HDR' desktop monitors lack the FALD backlight, so are not HDR monitors (just SDR panels that accept and compress a HDR signal).
wolrah - Tuesday, October 2, 2018 - link
A small subset of TVs actually do support native 120Hz inputs, but so far I've only seen that supported at 1080p due to HDMI bandwidth limitations.For a while it was just a few specific Sony models that supported proper 1080p120 but all the 2017/2018 LG OLEDs do as well as some of the higher end Vizios and a few others.
resiroth - Monday, October 8, 2018 - link
LG OLED TVs accept 120hz (and actually display true 120 FPS) but at a lower 1080p resolution. They also do 4K/60 of course. Not a great substitute though. If I were spending so much on a monitor I would demand it be oled though. Otherwise I’m spending 1500-1600 more than a 1440p monitor just to get 4K. I mean, cool? But why not go 2000-2500 more and get something actually unique, a 4K 144hz OLED HDR monitor that will be useful for 10 years or so.This thing will be obsolete the second oled monitors come out. There simply is no comparison.
Impulses - Tuesday, October 2, 2018 - link
Regardless of all the technical refresh rate limitations already pointed out, not everyone wants to go that big. 40" is already kinda huge for a desktop display; anything larger takes over the desk, makes it tough to have any side displays, and forces a lot more window management that's just not optimal for people that use their PC for anything but gaming.I'd rather have a 1440p 165Hz 27" & 4K 32" on moving arms even than a single 4K 50"+ display with a lower DPI than even my old 1920x1200 24"...
lilkwarrior - Monday, October 8, 2018 - link
To be fair, most would rather have a 4K ultra-wide (LG) or 1440p Ultrawide rather than multiple displays or a TV.5K is an exception since more room for controls for video work & etc is a good compromise for some to the productive convenience of more horizontal real estate an ultrawide provides.
Most enthusiasts are waiting for HDMI 2.1 to upgrade, so this monitor & current TVs this year are DOA.
milkod2001 - Tuesday, October 2, 2018 - link
This is nearly perfect. Still way overpriced what it is. I'd like to get similar but at 32'' size, 100Hz would be enough, don't need this fancy useless stand with holographic if price can be much cheaper, let it be factory calibrated, good enough for bit of Photoshop and also games. All at $1200 max. Wonder how long we have to wait for something like that.milkod2001 - Tuesday, October 2, 2018 - link
Forgot to mention it would obviously be 4k. The closest appears to be: BenQ PD3200U but it is only 60Hz monitor and 2017 model. Would want something newer and with 100Hz.imaheadcase - Tuesday, October 2, 2018 - link
To be fair, a mix of photoshop an also games at that screen res 60Hz would prob be enough since can't push modern games that high for MOST part. I have that Acer Z35P 120Hz monitor, and even with 1080Ti its hard pressed to get lots of games max it out. That is at 3440x1440.My 2nd "work" monitor next to it is the awesome Dell U3818DW (38 inches) @ 3840x1500 60Hz I actually prefer the dell to strategy games because of size, because FPS is not as huge concern.
But playing Pubg on the Acer 120Hz will get 80-90fps
imaheadcase - Tuesday, October 2, 2018 - link
3840x1600 is the dell i mean.Impulses - Tuesday, October 2, 2018 - link
The Acer Predator 32" has a similar panel as that BenQ and adds G-Sync tho still at a max 60Hz, not as well calibrated out of the box (and with a worse stand and controls) but it has dropped in price a couple times to the same as the BenQ... I've been cross shopping them for a while because 2 grand for a display whose features I may or may not be able to leverage in the next 3 years seems dubious.I wanted to go 32" too because the 27" 1440p doesn't seem like enough of a jump from my 24" 1920x1200 (being 16:10 it's nearly as tall as the 16:9 27"erd), and I had three of those which we occasionally used in Eyefinity mode (making a ~40" display). I've looked at 40-43" displays but they're all lacking compared to the smaller stuff (newer ones are all VA too, mostly Phillips and one Dell).
I use my PC for photo editing as much as PC gaming but I'm not a pro so a decent IPS screen that I can calibrate reasonably well would satisfy my photo needs.
Fallen Kell - Tuesday, October 2, 2018 - link
It is "almost" perfect. It is missing one of the most important things, HDMI 2.1, which has the bandwidth to actually feed the panel with what it is capable of doing (i.e. 4k HDR 4:4:4 120Hz). But we don't have that because this monitor was actually designed 3 years ago and only now finally coming to market, 6 months after HDMI 2.1 was released.lilkwarrior - Monday, October 8, 2018 - link
HDMI 2.1 certification is still not done; it would not have been able to call itself a HDMI 2.1 till probably late this year or next year.imaheadcase - Tuesday, October 2, 2018 - link
The 35 inch one has been canceled fyi. Asus rep told me when inquired about it just a week ago, unless in a week something has changed. Reason being panel is not perfect yet to mass produce.That said, its not a big loss, even if disappointing. Because HDR is silly tech so you can skip this generation
EAlbaek - Tuesday, October 2, 2018 - link
I bought one of these, just as they came out. Amazing display performance, but the in-built fan to cool the G-Sync HDR-module killed it for me.It's one of those noisy 40mm fans, which were otherwise banned from PC setups over a decade ago. It made more noise than the entirety of the rest of my 1080 Ti-SLI system combined. Like a wasp was loose in my room all the time. Completely unbearable to listen to.
I tried to return the monitor as RMA, as I thought that couldn't be right. But it could, said the retailer. At which point I chose to simply return the unit.
In my case, these things will have to wait, till nVidia makes a new G-Sync HDR module, which doesn't require active cooling. Plain and simple. I'm sort of guessing that'll fall in line with the availability of micro-LED displays. Which will hopefully also be much cheaper, than the ridiculously expensive FALD-panels in these monitors.
imaheadcase - Tuesday, October 2, 2018 - link
Can't you just replace the fan yourself? I read around the time of release someone simply removed fan and put own silent version on it.EAlbaek - Tuesday, October 2, 2018 - link
No idea - I shouldn't have to void the warranty on my $2000 monitor, to replace a 40mm fan.madwolfa - Tuesday, October 2, 2018 - link
Is that G-Sync HDR that requires active cooling or FALD array?EAlbaek - Tuesday, October 2, 2018 - link
It's the G-Sync HDR chip, apparantly.Ryan Smith - Wednesday, October 3, 2018 - link
Aye. The FALD array puts out plenty of heat, but it's distributed, so it can be dissipated over a large area. The FPGA for controlling G-Sync HDR is generates much less heat, but it's concentrated. So passive cooling would seem to be non-viable here.a5cent - Wednesday, October 3, 2018 - link
Yeah, nVidia's DP1.4 VRR solution is baffelingly poor/non-competitive, not just due to the requirement for active cooling.nVidia's DP1.4 g-sync module is speculated to contribute a lot to the monitor's price (FPGA alone is estimated to be ~ $500). If true, I just don't see how g-sync isn't on a path towards extinction. That simply isn't a price premium over FreeSync that the consumer market will accept.
If g-sync isn't at least somewhat widespread and (via customer lock in) helping nVidia sell more g-sync enabled GPUs, then g-sync also isn't serving any role for nVidia. They might as well drop it and go with VESA's VRR standard.
So, although I'm actually thinking of shelling out $2000 for a monitor, I don't want to invest in technology it seems has priced itself out of the market and is bound to become irrelevant.
Maybe you could shed some light on where nVidia is going with their latest g-sync solution? At least for now it doesn't seem viable.
Impulses - Wednesday, October 3, 2018 - link
How would anyone outside of NV know where they're going with this tho? I imagine it does help sell more hardware to one extent or another (be it GPUs, FPGAs to display makers, or a combination of profits thru the side deals) AND they'll stay the course as long as AMD isn't competitive at the high end...Just the sad reality. I just bought a G-Sync display but it wasn't one of these or even $1K, and it's still a nice display regardless of whether it has G-Sync or not. I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones.
a5cent - Wednesday, October 3, 2018 - link
"How would anyone outside of NV know where they're going with this tho?"Anandtech could talk with their contacts at nVidia, discuss the situation with monitor OEMs, or take any one of a dozen other approaches. Anandtech does a lot of good market research and analysis. There is no reason they can't do that here too. If Anandtech confronted nVidia with the concern of DP1.4 g-sync being priced into irrelevancy, they would surely get some response.
"I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones."
You're mistakenly assuming the DP1.2 g-sync is in any way comparable to DP1.4 g-sync. It's not.
First, nobody sells plenty of g-sync monitors. The $200 price premium over FreeSync has made g-sync monitors (comparatively) low volume niche products. For DP1.4 that premium goes up to over $500. There is no way that will fly in a market where the entire product typically sells for less than $500. This is made worse by the fact that ONLY DP1.4 supports HDR. That means even a measly DisplayHDR 400 monitor, which will soon retail for around $400, will cost at least $900 if you want it with g-sync.
Almost nobody, for whom price is even a little bit of an issue, will pay that.
While DP1.2 g-sync monitors were niche products, DP1.4 g-sync monitors will be irrelevant products (in terms of market penetration). Acer's and Asus' $2000 monitors aren't and will not sell in significant numbers. Nothing using nVidia's DP1.4 g-sync module will.
To be clear, this isn't a rant about price. It's a rant about strategy. The whole point of g-sync is customer lock-in. Nobody, not even nVidia, earns anything selling g-sync hardware. For nVidia, the potential of g-sync is only realized when a person with a g-sync monitor upgrades to a new nVidia card who would otherwise have bought an AMD card. If DP1.4 g-sync isn't adopted in at least somewhat meaningful numbers, g-sync loses its purpose. That is when I'd expect nVidia to either trash g-sync and start supporting FreeSync, OR build a better g-sync module without the insanely expensive FPGA.
Neither of those two scenarios motivates me to buy a $2000 g-sync monitor today. That's the problem.
a5cent - Wednesday, October 3, 2018 - link
To clarify the above...If I'm spending $2000 on a g-sync monitor today, I'd like some reassurance that g-sync will still be relevant and supported three years from now.
For the reasons mentioned, from where I stand, g-sync looks like "dead technology walking". With DP1.4 it's priced itself out of the market. I'm sure many would appreciate some background on where nVidia is going with this...
lilkwarrior - Monday, October 8, 2018 - link
Nvidia's solution is objectively better besides not being open. Similarly NVLINK is better than any other multi-GPU hardware wise.With HDMI 2.1, Nvidia will likely support it unless it's simply underwhelming.
Once standards catch up, Nvidia hasn't been afraid to deprecate their own previous effort somewhat besides continuing to support it for wide-spread support / loyalty or a balanced approach (i.e. NVLINK for Geforce cards but delegate memory pooling to DX12 & Vulkan)
Impulses - Tuesday, October 2, 2018 - link
If NVidia started supporting standard adaptive sync at the same time that would be great... Pipe dream I know. Things like G-Sync vs Freesync, fans inside displays, and dubious HDR support don't inspire much confidence in these new displays. I'd gladly drop the two grand if I *knew* this was the way forward and would easily last me 5+ years, but I dunno if that would really pan out.DanNeely - Tuesday, October 2, 2018 - link
Thank you for including the explanation on why DSC hasn't shown up in any products to date.
Heavenly71 - Tuesday, October 2, 2018 - link
I'm pretty disappointed that a gaming monitor at this price still has only 8 bits of native color resolution (plus FRC, I know). Compare this to the ASUS PA32UC which – while not mainly targeted at gamers – has 10 bits, no fan noise, is 5 inches bigger (32" total) and many more inputs (including USB-C DP). For about the same price.
milkod2001 - Tuesday, October 2, 2018 - link
Wonder if they make native 10-bit monitors. Would you be able to output 10-bit colours from a gaming GPU, or only a professional GPU?
crimsonson - Tuesday, October 2, 2018 - link
Someone can correct me, but AFAIK there is no native 10-bit RGB support in games. A 10-bit panel would at least improve its HDR capabilities.
FreckledTrout - Tuesday, October 2, 2018 - link
The games that say they are HDR should be using 10-bit color as well.
a5cent - Wednesday, October 3, 2018 - link
Any game that supports HDR uses 10 bpp natively. In fact, many games use 10 bpp internally even if they don't support HDR officially. That's why an HDR monitor must support the HDR10 video signal (that's the only way to get the 10 bpp frame from the GPU to the monitor).
OTOH, a 10 bit panel for gaming typically won't provide a perceptible improvement. In practice, 8bit+FRC is just as good. IMHO it's only for editing HDR still imagery where real 10bit panels provide benefits.
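To make the 8-bit+FRC point a bit more concrete, here's a minimal sketch (plain Python, not any panel vendor's actual algorithm) of temporal dithering: the panel flips between two adjacent 8-bit levels across consecutive refreshes so that the time-averaged output lands on the 10-bit target.

```python
# Minimal FRC sketch: approximate a 10-bit code with 8-bit frames by
# alternating between two adjacent 8-bit levels over a short frame sequence.

def frc_sequence(value_10bit, frames=4):
    """Return the 8-bit levels shown over `frames` frames for one 10-bit value."""
    base, remainder = divmod(value_10bit, 4)   # 10-bit = 8-bit * 4 + remainder (0..3)
    # Show the higher level in `remainder` of the frames, the lower level otherwise.
    return [min(base + 1, 255) if f < remainder else base for f in range(frames)]

target = 513                       # a 10-bit code with no exact 8-bit equivalent
seq = frc_sequence(target)         # -> [129, 128, 128, 128]
average_10bit = sum(seq) * 4 / len(seq)
print(seq, average_10bit)          # time-average works out to the 10-bit target
```

At 144Hz that per-frame flip between adjacent levels is far too fast to see, which is why 8-bit+FRC and native 10-bit are usually indistinguishable for moving content.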
GreenReaper - Thursday, October 4, 2018 - link
I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this, where the bandwidth is insufficient to have full resolution *and* colour depth *and* refresh rate at once? You run the risk of banding or flicker, but frankly that's similar for display FRC, and I imagine if the screen was aware of what was happening it might be able to smooth it out. It'd essentially improve the refresh rate at the expense of some precision, which some gamers might well be willing to take. Of course, that's all moot if the card can't even play the game at the target refresh rate.
GreenReaper - Thursday, October 4, 2018 - link
By client, of course, I mean card - it would send an 8-bit signal within the HDR colour gamut and the result would be a frequency-interpolated output hopefully similar to what's possible now - but by restricting at the graphics card end you use less bandwidth, and hopefully it doesn't take too much power.
a5cent - Thursday, October 4, 2018 - link
"I have to wonder if 8-bit+FRC makes sense on the client side for niche situations like this"It's an interesting idea, but I don't think it can work.
The core problem is that the monitor then has no way of knowing if in such an FRC'ed image, a bright pixel next to a darker pixel correctly describes the desired content, or if it's just an FRC artifact.
Two neighboring pixels of varying luminance affect everything from how to control the individual LEDs in a FALD backlight, to when and how strongly to overdrive pixels to reduce motion blur. You can't do these things in the same way (or at all) if the luminance delta is merely an FRC artifact.
As a result, the GPU would have to control everything that is currently handled by the monitor's controller + firmware, because only it has access to the original 10 bpp image. That would be counterproductive, because then you'd also have to transport all the signaling information (for the monitor's backlighting and pixels) from the GPU to the monitor, which would require far more bandwidth than the 2 bpp you set out to save 😕
What you're thinking about is essentially a compression scheme to save bandwidth. Even if it did work, employing FRC in this way is lossy and nets you, at best, a 20% bandwidth reduction.
However, the DP1.4(a) standard already defines a compression scheme. DSC is lossless and nets you about 30%. That would be the way to do what you're thinking of.
4K DP1.4 gaming monitors in particular are in dire need of this. That nVidia and Acer/Asus would implement chroma subsampling 4:2:2 (which is also a lossy compression scheme) rather than DSC is shameful. 😳
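For anyone who wants to sanity-check the numbers being thrown around in this thread, here's a rough back-of-the-envelope sketch (Python; active-pixel payload only, no blanking overhead, and the DP1.4 figure assumes HBR3 with 8b/10b encoding):

```python
# Rough bandwidth arithmetic for 4K @ 144Hz over DisplayPort 1.4 (illustrative only).

def gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

DP14_EFFECTIVE = 25.92   # Gbit/s payload on 4 lanes of HBR3 after 8b/10b encoding

cases = [
    ("4:4:4 10bpc", gbps(3840, 2160, 144, 30)),  # ~35.8 Gbit/s, nowhere near fitting
    ("4:4:4  8bpc", gbps(3840, 2160, 144, 24)),  # ~28.7 Gbit/s, the ~20% FRC-style saving
    ("4:2:2 10bpc", gbps(3840, 2160, 144, 20)),  # ~23.9 Gbit/s, squeezes under the cap
]

for label, need in cases:
    print(f"{label}: {need:.1f} Gbit/s, fits DP1.4? {need <= DP14_EFFECTIVE}")
```

Which is consistent with why these first DP1.4 HDR monitors drop to 4:2:2 at their top refresh rates rather than shipping with DSC.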
I wonder if nVidia's newest $500+ g-sync module is even capable of DSC. I suspect it is not.
Zoolook - Friday, October 5, 2018 - link
DSC is not lossless, it's "visually lossless", which means that most of the time you shouldn't perceive a difference compared to an uncompressed stream. I'll reserve my judgement until I see some implementations.
Impulses - Tuesday, October 2, 2018 - link
That Asus PA32UC wouldn't get you G-Sync or refresh rates over 60Hz and it's still $975 tho... It sucks that the display market is so fractured and people who use their PCs for gaming as well as content creation can't get anything approaching perfect or even ideal at times. There are a few 4K 32" displays with G-Sync or Freesync but they don't go past 60-95Hz AFAIK, and then you don't get HDR; it's all a compromise, and has been for years due to competing adaptive sync standards, lagging connection standards, a lagging GPU market, etc etc.
TristanSDX - Tuesday, October 2, 2018 - link
Soon there will be a new PG27UC, with a mini LED backlight (10,000 diodes vs 384) and with DSC.
DanNeely - Tuesday, October 2, 2018 - link
Eventually, but not soon. AUO is the only panel company working on 4K/high refresh/HDR, and they don't have anything with more dimming zones on their public road map (which is nominally about a year out for their production; add a few months for monitor makers to package the panels and get them to retail once volume production starts).
FreckledTrout - Tuesday, October 2, 2018 - link
AUO has stated it lands this fall, so it should be very soon. They made it sound like they will have a shipping monitor by the end of 2018, albeit who really knows, but I'm sure it's under a year away at this point. Can Google: "AUO Expects to Launch Mini LED Gaming Monitor in 2H18"
imaheadcase - Wednesday, October 3, 2018 - link
Don't keep your hopes up, remember this monitor in this very review was delayed 6+ months.
Lolimaster - Tuesday, October 2, 2018 - link
384 zones is just CRAP, you only find that number of zones on low-end cheapo TVs with FALD tacked on just to be a bit more "premium". For that price it should have 1000 AT LEAST. Seems we will need to wait for LCDs with mini LEDs to actually start seeing monitors with 5000+ zones.
know of fence - Tuesday, October 2, 2018 - link
Consoles started to push that 4K / HDR nonsense and now the monopoly provides a monitor to match for the more-money-than-sense crowd. The obscure but sensible strobing backlight / ULMB got sacrificed for the blasted buzzwords and G-Sync. Is it because the panel is barely fast enough for G-Sync, or is it a general shift in direction, doubling down on proprietary G-stink and the ridiculously superfluous 4K native? Is it because, with VR failing, high frame rates are off the table completely?
Is there any mention of how 1920x1080 looks on this monitor (too bad if not), because the pixel density is decidedly useless and non-standard. But scaled down to half it could be 81.5 ppi and this thing could actually be used to read text.
godrilla - Tuesday, October 2, 2018 - link
$1799 at micr1 fyi!
godrilla - Tuesday, October 2, 2018 - link
Microcenter*
Hectandan - Tuesday, October 2, 2018 - link
"the most desired and visible aspects of modern gaming monitors: ultra high resolution (4K)"No it's not. At least on Windows where UI scaling still sucks. At least on "slow" graphics card like 2080 Ti where 4K doesn't run 144fps. And 4K monitors can't do 1440p natively, so a huge deal breaker.
Zan Lynx - Wednesday, October 3, 2018 - link
If you had a graphics card that could always run 144 Hz then you would have no need for GSync.
imaheadcase - Wednesday, October 3, 2018 - link
Why would you care about scaling for gaming? Besides, plenty of 3rd party apps to correct windows bullshit.
Hectandan - Thursday, October 4, 2018 - link
No, I don't care about scaling in games, but I do care about 144fps in games. That's only possible at 4K with SLI 2080 Tis and good SLI support in the game, which plenty of games don't have. Also, plenty of 3rd party apps don't correct Windows' bullshit, and I gain no extra working space if I do scale.
Simply too many downsides and too little benefit.
HollyDOL - Wednesday, October 3, 2018 - link
When there is a 27" 4k HDR Eizo with G-Sync and 'CX' line+ picture quality, I'll be thinking of upgrade.Got only space for one screen and use cases are quite wide. Pro (job) usage has to get the priority.
FreckledTrout - Wednesday, October 3, 2018 - link
Couple that with a Mini LED backlight array for the HDR part and that would be one hell of a monitor.
HollyDOL - Wednesday, October 3, 2018 - link
Hmm... the dilemma: new screen or new car :-)
Impulses - Wednesday, October 3, 2018 - link
You probably spend more time in front of the screen than in the car! ;P
HollyDOL - Thursday, October 4, 2018 - link
Is it okay to quote you when I negotiate the home budget :-) ?
Kamus - Wednesday, October 3, 2018 - link
Article should be updated. The latest Windows update finally fixes HDR support for Windows (you now get a slider for SDR content, and it works fine).
Lau_Tech - Wednesday, October 3, 2018 - link
I find the Brightness and Contrast charts confusing; they would probably be easier to read if done as a simple table with min/max luminance. As for those who want to talk about mini-LED vaporware, I don't want whatever it is you're smoking.
Samus - Thursday, October 4, 2018 - link
This monitor consumes as much power as my entire PC while gaming. That's insane.
CoryS - Friday, October 5, 2018 - link
I do photo editing, and would like to start doing 10-bit HDR video work (as a passionate hobby, not a profession), but I would also like a G-Sync monitor for gaming. Would this or the upcoming 35-inch model be a good fit? I can't tell from the review if the color space support is adequate for the editing work I do.
Glenwing - Sunday, June 16, 2019 - link
Good article :) Not to be nitpicky, but just a few typo corrections:
"Notably, this isn’t enough bandwidth for any higher refresh rates, particularly not 144MHz"
I think you mean "Hz", not "MHz" :P
"I’ve seen it referred to >>as<< compression at some points"
Missing word added ^
"None the less, "
This is actually one word :P
Cheers ^.^
Ethos Evoss - Monday, December 28, 2020 - link
That's really bad, as there is no product that supports DP 2.0, so we're stuck with DP 1.4 and fake 144Hz. So all monitors and TVs are obsolete... Only laptops with 144, 240 or 300Hz panels have real refresh rates, as there is no HDMI or DP in the path, but nobody talks about eDP (embedded DisplayPort), which is what laptops use and which in reality only supports up to 120Hz, so how on earth are they making laptops with 144Hz, 240Hz or even 300Hz???