85 Comments
nathanddrews - Tuesday, April 22, 2014 - link
NOoooooooooooooooooo!!!!!
http://youtu.be/WWaLxFIVX1s
nathanddrews - Tuesday, April 22, 2014 - link
Sorry, I just had to get that out. We already have to deal with bad compression on our media, I don't want it on my display.
/humbug
kpb321 - Tuesday, April 22, 2014 - link
Lossy compression was unavoidable. It's impossible to even guarantee any compression with a mathematically lossless compression system let alone a constant bit rate. Hopefully the compression will be controllable and only required at the highest resolutions and refresh rates and uncompressed will be the default any time it is possible.
mavere - Tuesday, April 22, 2014 - link
At 8 bpp, the encoder would need to be hilariously terrible for anyone to even have a chance of seeing a picture artifact.
Seriously, the buffer zone is massive. The main engineering challenge here is latency, not coding "quality".
nathanddrews - Wednesday, April 23, 2014 - link
Who wants 8bpp? I thought we were moving to 10bpp...
psychobriggsy - Wednesday, April 23, 2014 - link
I believe that the compression will compress up to 36 bits per pixel (12 bits per pixel colour component) down to 8 bits (or maybe the 8 bits is for the 24-bit source, and 36-bit sources will require 12 bits, this detail isn't available in the article). In effect, bandwidth requirements go down to 1/3 or even 1/4 of what they were before.
Other (slower) lossy compression algorithms actually get down to under 1 bit per pixel, so having 8x the data to work with will mean that any loss in the resulting image will be imperceptible (hopefully). This is what @mavere was getting at.
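A quick back-of-the-envelope check of those ratios. The 8K@60 Hz figures below are example numbers of my own choosing, and blanking intervals plus link overhead are ignored, so treat them as only indicative:

```python
# Rough uncompressed-vs-compressed bandwidth estimate for an 8K, 60 Hz stream.
# Blanking and protocol overhead are ignored, so these are ballpark figures only.
width, height, refresh = 7680, 4320, 60
pixels_per_second = width * height * refresh

for source_bpp, compressed_bpp in [(24, 8), (36, 12)]:
    raw = pixels_per_second * source_bpp       # bits per second, uncompressed
    dsc = pixels_per_second * compressed_bpp   # bits per second, compressed
    print(f"{source_bpp} bpp source: {raw / 1e9:5.1f} Gbps -> "
          f"{dsc / 1e9:5.1f} Gbps at {compressed_bpp} bpp ({raw / dsc:.0f}:1)")
```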
nathanddrews - Wednesday, April 23, 2014 - link
Got it, thanks for clarifying.
sully213 - Friday, April 25, 2014 - link
....you bastard!
https://www.youtube.com/watch?v=BWUP5QxdwPg
MikeMurphy - Tuesday, April 22, 2014 - link
I find it amazing that an old-school VGA cable can drive a 1920x1200 display at 60hz. Analogue, but still crystal clear.
I hope the industry develops a really good cable technology (ie. fiber) that can drive such displays. Compression should be optional.
Gigaplex - Tuesday, April 22, 2014 - link
I've seen a VGA cable drive a 1920x1200 display at 60Hz. It is not crystal clear.
nathanddrews - Wednesday, April 23, 2014 - link
VGA is far more capable than that.
1920x1200@96Hz is sharp on my FW900. The FW900 also accepts 2560x1600@75Hz. At 1600p, uniformity buckles a bit as it's noticeably crisper in the center than it is near the edges, but all the text is still legible. I only use that resolution for certain games and applications.
Of course the additional benefits of being on a CRT still apply - unbeatable contrast, color accuracy, motion blur, and latency.
psychobriggsy - Wednesday, April 23, 2014 - link
You need to buy a Monster VGA cable :p
nevertell - Wednesday, April 23, 2014 - link
It's as crystal clear as your eyesight after a litre of vodka drank from a crystal bottle.
boozed - Tuesday, April 22, 2014 - link
What's the difference between "visually lossless" and actually lossless? Other than the weasel words.
Ryan Smith - Tuesday, April 22, 2014 - link
Mathematically (actually) lossless: completely lossless. The decoded output is bit-for-bit identical to the input.
Visually lossless: lossy with little-to-no image degradation. The decoded output is not bit-for-bit identical to the input, but should still appear undegraded to the viewer.
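A minimal illustration of the two terms, using zlib as a stand-in for a mathematically lossless codec and a crude 2-bit truncation as a stand-in for a lossy one (neither has anything to do with how DSC itself works):

```python
import os
import zlib

data = os.urandom(4096)  # stand-in for raw pixel data

# Mathematically lossless: the round trip is bit-for-bit identical.
assert zlib.decompress(zlib.compress(data)) == data

# Lossy: throw away the two least-significant bits of every byte.
lossy = bytes(b & 0b11111100 for b in data)
print("bit-for-bit identical:", lossy == data)   # False
print("max per-byte error:   ", max(abs(a - b) for a, b in zip(data, lossy)))  # 3
```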
boozed - Tuesday, April 22, 2014 - link
Apologies for my sarcasm not being so obvious over the internet! My clumsily made point is that it either is or isn't lossless. Saying something is partially lossless is a bit like saying someone is partially pregnant. It's not a continuous scale when the word "lossless" is used.
In addition, a major problem I have with the "visually lossless" claim is: who decides what is or isn't perceptible? Different observers will have different thresholds. You can also bet they're going to push as far as they can with the compression. While I'm sure it will still, in large part, depend on the content (much like the actually lossless compression used in some image formats), I have a feeling that a compression level that is imperceptible isn't going to give huge savings. It'll be interesting to see where it goes.
Ryan Smith - Tuesday, April 22, 2014 - link
"who decides what is or isn't perceptible?"The algorithm and specifications were decided by the VESA working with MIPI. So the short answer is those two organizations. These are the VESA's results published in their whitepaper:
---
From a picture quality standpoint, the DSC algorithm outperforms many proprietary algorithms. The algorithm was rigorously tested by experts on a variety of mobile and large-panel display types. All types of test content were included:
White noise, zone plates, multiburst, and other test patterns
High-density subpixel rendered text (for example, Microsoft ClearType)
Computer, phone, and tablet screen captures
Photos and video
Several methods were used to confirm the visually lossless performance of the algorithm. Most commonly, the original image and uncompressed image were flipped back and forth in place on the same screen to determine if the user could see a difference. The images were selected for their difficulty. One company analyzed thousands of images gathered from the internet and selected those that were most likely to have artifacts based on a mathematical analysis of the compressed images. The selected images were subjected to the flipping test. All of the analyses showed that the DSC algorithm outperformed five other proprietary algorithms on these picture quality tests, and was either visually lossless or very nearly so for all tested images at 8 bits/pixel.
3DoubleD - Wednesday, April 23, 2014 - link
You should include that in the article, that was a key discussion that should have been in there in the first place.
ggathagan - Wednesday, April 23, 2014 - link
Ryan link to that whitepaper in the article.
ggathagan - Wednesday, April 23, 2014 - link
Ryan linked to that whitepaper in the article, I meant.
retroneo - Tuesday, April 29, 2014 - link
With "thousands of images gathered from the internet" the "original image and uncompressed image were flipped back and forth in place".
Unfortunately, thousands of high-res images from the internet are likely to be already lossy compressed using schemes such as JPG.
I would have been more assured if they explicitly said they tested with uncompressed RAW images.
boozed - Wednesday, April 23, 2014 - link
Thanks for the response Ryan.
I think I'd like to reserve my scepticism, but the proof will be in the eating, as they say.
CSMR - Tuesday, April 22, 2014 - link
This should be made an optional part of the standard for use in cinemas or for people daisy-chaining three or four monitors. 99.9999% of users will not exceed 32Gbps and there is no need for this technology for them.
SergeC - Tuesday, April 22, 2014 - link
Latency added?
Ryan Smith - Tuesday, April 22, 2014 - link
Yes. DSC would add latency.
psychobriggsy - Wednesday, April 23, 2014 - link
Would it add 1 frame of latency (i.e., the frame can only be rebuilt with all the data present), or is the algorithm capable of being decompressed on the fly (or, e.g., every 8 lines if the compression works in 8x8 blocks), resulting in a sub 1 frame latency?
psychobriggsy - Wednesday, April 23, 2014 - link
Ah @madmilk below says it's line-based compression, so should be imperceptible latency.
Alexey291 - Thursday, April 24, 2014 - link
right so what we have so far (not a dig at you personally) is
"Image degradation should be imperceptible"
and
"Latency should be imperceptible"
Yup because that usually works out very well doesn't it?
iwod - Tuesday, April 22, 2014 - link
1. What about latency?
2. What is the point of 8K Display if we are getting lossy image from it? Surely we should move to Optical Cable instead and tackle the problem as intended instead of hacking around it?
madmilk - Tuesday, April 22, 2014 - link
1. Looking at the whitepaper, latency should be zero as lines are basically separately compressed. Codecs like H.264 add a lot of latency because of inter frame prediction which DSC does not have.
2. There's plenty of reason to compress: copper is still much cheaper, and there really isn't that much data in video. Between 4K uncompressed and 8K compressed at the same bandwidth, 8K should still look much better.
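Some rough numbers on what buffering "a few lines" costs in time, assuming 60 Hz panels and ignoring blanking (real line times are slightly shorter than this):

```python
# Time cost of buffering a handful of scanlines versus a whole frame.
for name, lines in [("4K (2160 lines)", 2160), ("8K (4320 lines)", 4320)]:
    refresh = 60                        # Hz
    frame_time = 1 / refresh            # seconds per frame
    line_time = frame_time / lines      # seconds per scanline, blanking ignored
    for buffered in (1, 8, 16):
        print(f"{name} @ {refresh} Hz: {buffered:>2} line(s) buffered = "
              f"{buffered * line_time * 1e6:6.1f} us  "
              f"(full frame = {frame_time * 1e3:.1f} ms)")
```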
ivan256 - Wednesday, April 23, 2014 - link
Copper is generally higher bandwidth and lower latency over short distances than digital-optical. Mostly because you have to convert from and back to electrical signaling.
p1esk - Wednesday, April 23, 2014 - link
Any links to back that up?
Multimode fiber can do 100Gbps at distances up to 150m [1]. Good luck trying to do that with copper, even at 1m.
[1] http://www.fols.org/fols_library/white_papers/docu...
madwolfa - Wednesday, April 23, 2014 - link
He said - short distances.
p1esk - Wednesday, April 23, 2014 - link
Yes, and I have shown that at short distances, plastic fiber beats copper by at least a factor of 10.
Gnarr - Thursday, February 19, 2015 - link
Latency and bandwidth are two separate things. You haven't shown anything.
Gnarr - Thursday, February 19, 2015 - link
Optical can surely be slower:
http://images.anandtech.com/graphs/graph7170/56389...
http://www.anandtech.com/show/7170/impact-disrupti...
madmilk - Tuesday, April 22, 2014 - link
A 66% reduction in bitrate seems pretty conservative. Even an intraframe algorithm (which has zero latency) such as MJPEG should appear lossless with that much bandwidth available.
Guspaz - Wednesday, April 23, 2014 - link
An intraframe algorithm doesn't have zero latency, because if you're working on entire frames, you have to buffer the whole thing before you can start sending. Of course, if you use MJPEG and use restart markers at the end of every row of macroblocks, you're only adding 8 to 16 scanlines of latency.
psychobriggsy - Wednesday, April 23, 2014 - link
It's a stream compression protocol, not a frame compression protocol. It has to be low cost to implement - frame compression requires RAM to decompress into, compare with N previous frames, etc. And MJPEG is expensive to encode on the fly compared to this stream compression algorithm.
afree - Tuesday, April 22, 2014 - link
Well as the table in the article indicates 4K shouldn't be affected. 8K won't be mainstream for 5 years at least (IMO) and by then we should have a displayport connector that is twice the bandwidth of displayport 1.3 (which is being standardised in a few months) if displayport technology keeps progressing as it is. Therefore this won't affect most people, however those who plan to buy 8K displays before they get to consumer available prices will likely be in a bit of trouble.
Of course 8K at the distance most people sit from a screen (2 inches) is according to http://wolfcrow.com/blog/notes-by-dr-optoglass-the... slightly greater "resolution" than the healthiest possible adult eye can see for a 22 inch screen. However as the compression algorithm reduces 24 bits per pixel down to 8 that suggests the amount of colours to be displayed is lowered which could be very bad if that is even the slightest bit discernible by artists (who are probably the only ones who are dedicated enough to buy early 8K monitors AND notice the difference).
I wonder if leading 2 displayport cables into a single monitor is possible, because 2 DisplayPort 1.3 cables are capable of 8K 60 Hz (that seems easier to me to implement than a compression algorithm). Although perhaps higher screen refresh rates will be of more benefit than 4K to 8K to a huge portion of people.
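For what it's worth, a rough check of that claim, assuming uncompressed 24-bit 8K at 60 Hz against roughly 25.9 Gbps of usable payload per DisplayPort 1.3 cable (HBR3 after 8b/10b coding). Blanking is ignored here, so the real margin is thinner than this suggests:

```python
# Does uncompressed 8K@60 fit into one or two DisplayPort 1.3 links?
width, height, refresh, bpp = 7680, 4320, 60, 24
needed = width * height * refresh * bpp        # bits per second, blanking ignored

dp13_payload = 25.92e9                         # usable bits/s per DP 1.3 cable (after 8b/10b)
for cables in (1, 2):
    capacity = cables * dp13_payload
    verdict = "fits" if needed <= capacity else "does not fit"
    print(f"{cables} cable(s): {capacity / 1e9:.1f} Gbps vs "
          f"{needed / 1e9:.1f} Gbps needed -> {verdict}")
```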
afree - Tuesday, April 22, 2014 - link
Let me clarify a few things with my above post:
1. If the manufacturers are concerned they won't be able to keep up, perhaps that indicates they would be able to approximately double displayport 1.4 bandwidth in something like 5 years.
2. When I mention 8K at 2 inches, I meant a slight reduction in compressive resolution wouldn't be much of a problem.
3. I am not 100% certain that 2 displayport 1.3 equals 8K @ 60 hertz
4. Perhaps I was hasty in judging 5 years for 8K, could be 2 years could be 10.
5. LCD to LED screens are imo a significant improvement which is associated with bit depth, while there are certain colours that humans have trouble discerning and LED screens obviously improve in critical areas; 24 bit to 8 bit bit depth seems a bit much to me.
p1esk - Wednesday, April 23, 2014 - link
I really don't understand why no one introduced an optical cable for video. We have had optical audio cables for audio for many years - you can buy one for a few bucks.
On the other hand, I can see the value of a stream compression standard, because of the desire to stream content to a monitor wirelessly. It's likely that in 5 years most of us will want to stream high resolution content to our displays from our mobile devices, and that's when standards like this will become useful.
The Von Matrices - Wednesday, April 23, 2014 - link
That optical audio cable transmits at most ~5 Mb/s (24bit/96kHz/2ch). The display standard is 50 Gb/s, 10,000 times faster.
The reason optical audio cables are cheap is that they don't need to carry a high bit rate of data so plastic fibers with poor optical properties are more than adequate. You won't get 50 Gb/s through a plastic fiber used for TOSLINK.
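The audio figure is easy to sanity-check (24-bit, 96 kHz, 2 channels of payload, ignoring S/PDIF framing):

```python
# Payload bit rate of a 24-bit / 96 kHz stereo stream vs. a ~50 Gb/s display link.
audio_bps = 24 * 96_000 * 2       # ~4.6 Mb/s
display_bps = 50e9                # the ballpark display figure quoted above
print(f"audio:   {audio_bps / 1e6:.1f} Mb/s")
print(f"display: {display_bps / 1e9:.0f} Gb/s  (~{display_bps / audio_bps:,.0f}x more)")
```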
p1esk - Wednesday, April 23, 2014 - link
Multimode fiber can do 100Gbps at distances up to 150m [1]. Good luck trying to do that with copper.
You have to keep in mind that Average Joe is perfectly happy paying $20+ for a cable at BestBuy. Especially when he's buying a brand new 4k TV. And from the marketing point of view, a lossy video link is just not very appealing to someone who's ready to dump a sh!tload of money for the latest, highest res TV.
[1] http://www.fols.org/fols_library/white_papers/docu...
willis936 - Wednesday, April 23, 2014 - link
Fiber TxRx costs more than copper in pretty much all cases. Sure it's a better medium and at these bitrates it'd even be lower power but do you really think they'll switch to a more expensive medium when they're actually damaging data in order to save money?
p1esk - Wednesday, April 23, 2014 - link
They are damaging data because some exec said "we gotta do this over copper".
It's not obvious that fiber is more expensive. With fiber you don't need a decoding chip in a display, only the optical to electrical converter. Let's see what costs more - a new, very high data rate decoder chip, or a converter that has been mass produced for the last 30 years?
willis936 - Thursday, April 24, 2014 - link
Undoubtedly the optics. Take a look at the numbers. Fiber is expensive. Really expensive. Tens of thousands of dollars expensive. 40g is expensive. 100g is expensive. Pushing 20g over copper is a miracle. There's a barrier they're pushing against here and they took the cheaper route because that's the world we live in today.
sheh - Wednesday, April 23, 2014 - link
I don't think 8K will be mainstream for at least another 10 years. Probably more.
People don't watch 2 inches away from the screen, of course.
8 bpp (bits per pixel) isn't about color depth, it's a measure of average data rate. For example, high quality Blu-ray movies are around 1 bpp, or less, though it's a compression format with different goals and requirements than DSC.
What do you mean LCD to LED screens? LED screen is a misnomer, unless you mean OLED. "LED screens" (LCD screens with LED backlighting) generally use the same panel technology as "LCD screens" (LCD screens with CCFL backlighting), just the backlight is different. Arguably LED backlights were inferior at introduction and for a while afterwards. Even now they might not be better.
One thing that is getting slightly more common is 10-bit and extended range panels. This should be a real improvement in quality, but it's still far from mainstream. In particular, the software and graphics content have to catch up. On the other hand, many TN panels, and other types as well, are still actually 6-bit per color component.
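A quick check of the "~1 bpp or less" Blu-ray figure mentioned above, assuming a 1080p24 film at typical high video bitrates (actual discs vary):

```python
# Average bits per pixel of a compressed 1080p24 video stream vs. DSC's 8 bpp target.
width, height, fps = 1920, 1080, 24
for label, bitrate in [("~30 Mbps Blu-ray video", 30e6), ("~40 Mbps Blu-ray video", 40e6)]:
    bpp = bitrate / (width * height * fps)
    print(f"{label}: {bpp:.2f} bits per pixel")
print("DSC target:             8.00 bits per pixel")
```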
extide - Wednesday, April 23, 2014 - link
The effective bit depth will still be 24, 30, whatever it was in the first place, so the amount of colors is NOT reduced. It's (very mild) compression. People need to realize that codecs like H264 usually get like 1:15 - 1:30 compression ratios. Yeah that means the output file is 15 - 30 times smaller. The lower end of that is Blu-ray quality.
THIS compression is LESS THAN 1:2. The output size on this is 66% of what it was in the first place. (Compared to 3-6% in typical H264)
It will be extremely difficult to notice any compression artifacts!
psychobriggsy - Wednesday, April 23, 2014 - link
It'll allow you to daisy chain three 4K monitors off a single port at 60Hz, instead of one 4K monitor, even with DP1.2.
sunbear - Tuesday, April 22, 2014 - link
Given that much of the video content being sent from a computer or set top box to a display is anyway probably coming from a compressed source (e.g. Mpeg 2, H.264, etc) why even bother with adding the latency of decoding the H.264 on the computer, then re-encoding with DSC, only to have to decode again on the display? It would seem smarter to me if they came up with a standard whereby the computer could auto-negotiate with the display and if the display responds that it is capable of decoding the existing stream, then skip the decoding on the computer entirely (no need for DSC) and pass the compressed stream directly through to the monitor.
p1esk - Wednesday, April 23, 2014 - link
My guess is, it's because decoding H.264 or H.265 is computationally intensive, and they don't want to ask display manufacturers to use expensive decoder chips in displays.
Guspaz - Wednesday, April 23, 2014 - link
h.264 decoders are commodity hardware, they cost next to nothing. That's not true of h.265, obviously.
The problem with that approach is that not all source content is h.264 encoded. Television might be, but it might also be MPEG-2 (OTA, many digital cable/satellite services) or VC-1 (many IPTV services). Videogames and PC output isn't encoded at all.
p1esk - Wednesday, April 23, 2014 - link
We are talking about decoders that can handle 50Gbps streams in real time. Regardless of the codec used, the chips that can do that will not be cheap.
sheh - Wednesday, April 23, 2014 - link
Another thing that might be relevant is that there are many ways to decode MPEG, with different resultant pixels, and all these outputs may be technically valid. For a monitor-oriented format you'd probably want the output to be the same regardless of decoder, or at least be more stringent in what's considered valid decoded output.
willis936 - Wednesday, April 23, 2014 - link
So what happens when you play games or just sit and stare at the desktop? All of a sudden you're wasting a ton of power for real time compression when it's unnecessary and adding a lot of latency (in the order of ms, even into the 10s on a bad design).
What would be better is a fresh standard that lets the display driver tell the monitor "on the next frame I'll send you updates for these pixels" and then only update the pixels that change. Ooo ahhh we save power but peak bandwidth requirements are still the same. If you want a big resolution you need a big bandwidth. This is just a money band-aid.
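A minimal sketch of the "only send what changed" idea, using numpy and hypothetical 16x16 tiles (this illustrates the concept only; it is not how DSC or eDP Panel Self Refresh are specified). It also shows why the worst case, where every tile changes, still needs the full link bandwidth:

```python
import numpy as np

TILE = 16  # hypothetical tile size in pixels

def dirty_tiles(prev, curr):
    """Return (row, col) indices of TILE x TILE blocks that differ between frames."""
    h, w = prev.shape[:2]
    changed = []
    for y in range(0, h, TILE):
        for x in range(0, w, TILE):
            if not np.array_equal(prev[y:y + TILE, x:x + TILE],
                                  curr[y:y + TILE, x:x + TILE]):
                changed.append((y // TILE, x // TILE))
    return changed

prev = np.zeros((1080, 1920, 3), dtype=np.uint8)
curr = prev.copy()
curr[100:116, 200:216] = 255  # a small on-screen change, e.g. a cursor blink

dirty = dirty_tiles(prev, curr)
total = -(-1080 // TILE) * -(-1920 // TILE)   # ceil division: total tile count
print(f"{len(dirty)} of {total} tiles would need to be resent")
# A static desktop resends almost nothing; a fast-moving game can dirty every
# tile, so the peak bandwidth requirement is unchanged.
```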
danbob999 - Wednesday, April 23, 2014 - link
Not again. Let's not repeat the hell of compressed audio and its ever changing codecs (DOLBY, DTS, DTS TRUE FULL PRO HD 2 ULTRA SUPER MEGA III).
I don't want to replace my TV just because it doesn't support the new compression codec.
Guspaz - Wednesday, April 23, 2014 - link
There are only two relevant sets of audio compression standards, and the only top-end one with any significant use (DTS-HD) is backwards compatible. That is to say that if you take a DTS-HD stream and try to play it back on a regular DTS decoder that has no idea what DTS-HD is, it will work perfectly fine, because DTS-HD contains a regular DTS stream inside it.
piroroadkill - Wednesday, April 23, 2014 - link
There are no real ever changing codecs. In fact, that's one of the fantastic things about DTS-HD Master Audio.
The core 1.5Mbps stream is decodable by any old gear that supports DTS Coherent Acoustics. Backwards compatible.
Newer gear also decodes the lossless difference to produce a lossless stream.
DTS Coherent Acoustics is already a great sounding codec (not because of complexity, but mainly because of the high bitrate), so if you have to fall back to it, it's no great shakes. With the lossless stream on top, you're getting mathematically lossless output.
It's also the clear winner in terms of blu-ray soundtracks right now, so it's the one you need to worry about.
BMNify - Wednesday, April 23, 2014 - link
it should be noted that the real rec2020 3840x2160@60Hz UHD-1 (4K) and 7680x4320@60Hz UHD-2 (8K) requirement is in fact currently up to 120Hz for both, and if NHK/BBC R&D are right then it may go up to 240Hz or even 300Hz by the expected official 7680x4320 UHD-2 release for the Tokyo 2020 Olympic games.
so we have a problem, as that's only 6 years away, so even this VESA/MIPI spec is already underpowered before it's available.
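To put rough numbers on that concern (uncompressed UHD-2 at 10-bit and 12-bit per colour component, blanking ignored, so these are approximate):

```python
# Uncompressed bandwidth for UHD-2 (7680x4320) at high refresh rates.
width, height = 7680, 4320
for refresh in (60, 120):
    for bits_per_component in (10, 12):
        bps = width * height * refresh * bits_per_component * 3  # 3 colour components
        print(f"8K @ {refresh:3d} Hz, {bits_per_component}-bit: "
              f"{bps / 1e9:6.1f} Gbps uncompressed")
```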
p1esk - Wednesday, April 23, 2014 - link
In 6 years, you will be lucky if average Joe will finally stop buying 720p@60Hz TVs at BestBuy.
Also, in 6 years all I hope for is 4k@120Hz, at a reasonable price ($2k).
BMNify - Wednesday, April 23, 2014 - link
it's been stated many times by NHK (the inventors) and BBC R&D that 2020 was the goal for a full broadcasting service, the most recent statement is linked here; there's a reason the spec's called "rec 2020" 10-bit/12-bit real colour when it was ratified.
http://advanced-television.com/2014/03/18/japan-co...
"A statement from Japanese public broadcaster NHK on March 17 stated again that it was readying for test transmissions in 8K to start in 2016, “and full broadcasting service in 2020, the year of Tokyo Olympics and Paralympics.” NHK says there is an air of heightened anticipation for the evolution of a broadcasting style that gives an elevated sense of reality in various genres, such as sport, live music, film and drama. “8K is the next step in this evolution,” says NHK."
p1esk - Wednesday, April 23, 2014 - link
Sure, it's possible - in Japan.
Average internet speed in the US right now is 10Mbps. Let's be very optimistic and imagine in 6 years it will be 100Mbps. That's still barely enough for streaming 4k@60Hz. In fact, streaming Netflix at 1080p is pretty much impossible right now regardless of your home internet connection speed (even ignoring the fact that their 1080p is horribly compressed).
I will be impressed if by 2020 in the US we will be able to stream movies in present Blu-ray quality.
BMNify - Wednesday, April 23, 2014 - link
"Average internet speed in US right now is 10Mbps" i can see how that might be a problem for you in the us , (didnt i read somewhere your govt payed a new subsidies plan for more fiber again), but everywhere else in the main land EU/UK/Ireland etc can get fast 50/100+ Megabits and theres also the latest DVB-T2 over the air broadcasting too that we know NHK/BBC r&d demoed again successfullyp1esk - Wednesday, April 23, 2014 - link
Actually, according to all the sources I could find with a quick google search, the avg internet speeds in UK are roughly the same as in US (which is surprising to me actually, comparing the size of the area to cover).BMNify - Wednesday, April 23, 2014 - link
well don't believe everything you Google on the internet :)i can tell you for sure i run 100+ Megabits from uk on virgin and also 50 Megabits in Ireland and depending where i was at the time anything from 30 megabits to 1Gb/s , anyway we are going off topic , UHD is coming very soon and 120Hz is the spec we need in any real UHD-1 rec 2020
simonpschmitt - Friday, May 9, 2014 - link
I actually have to interject here. I'm sitting here in Germany but it is much the same in many EU countries including the UK:
Yes, we can get 50MBit/100MBit Internet in many locales (as I have 50MBit in a metropolitan area).
But many areas are lucky if they get 16MBit (ADSL, as my sister has with ~2MBit).
Additionally, many people who can get 50MBit/100MBit opt for 16MBit because they don't want to pay more, even though it is just a 5€-10€ premium.
So the average speed in the EU is, depending on the country, in the 5MBit to perhaps 15MBit range and on par with other industrial nations.
Silma - Wednesday, April 23, 2014 - link
You should be less forgiving with marketing bullshit and call out a lie when you see one.
Visually lossless is a contradiction in terms and is highly deceptive.
Even the sound industry didn't dare to go so far in bullshit.
DSC is lossy. Even if it is 100% indistinguishable from a non-compressed image by the average Joe, it is lossy.
VESA loses all credibility spreading this BS. VESA, just tell the truth.
Guspaz - Wednesday, April 23, 2014 - link
The term "visually lossless" is not a contradiction, it's a useful descriptor. It has a very specific meaning: the loss is not visible to humans.Murloc - Wednesday, April 23, 2014 - link
it means that it's not visible to humans, not that it's lossless. Is that so hard to understand? The image goes to the monitor and then gets in your eyes and your brain processes it.
What is important is the final result, what's in the brain, not the outside world.
dabotsonline - Wednesday, April 23, 2014 - link
"... while DSC enabled devices are still some time off – the fact that the standard was just ratified means new display controllers still need to be designed and built... "Why will this be a concern given that DisplayPort 1.3 hasn't yet been announced?
"... we also expect DSC’s inclusion in the forthcoming DisplayPort 1.3."
This suggests that DSC will be included from the start. Unless DisplayPort 1.3 controllers are being worked on already?
What's interesting is that the recently-leaked Thunderbolt 3, scheduled for release alongside Skylake in autumn/fall 2015, only has support for DP 1.2. Therefore, we will probably have to wait for Thunderbolt 4, which will presumably be released in autumn/fall 2017, for DP 1.3 and DSC support.
willis936 - Wednesday, April 23, 2014 - link
What it means is that your brand new Samsung Galaxy S6 will likely implement DSC.
piroroadkill - Wednesday, April 23, 2014 - link
The diagram is totally awful.
It's a diagram with vectors, rendered as a JPEG.
Looks horrible.
Not to mention the final output image looks like it has been run through a JPEG compressor on low settings about fifty times.
I don't want compression from someone who thinks this is a reasonable slide on a presentation.
TristanSDX - Wednesday, April 23, 2014 - link
This compression is total nonsense. High res displays (4K, 8K) show their biggest advantage while displaying edges (fonts, lines, sharp borders of regions etc). And these elements are ALWAYS blurred by ANY non-lossless compression algorithms. This way 4/8K will be visually downgraded to FHD (or less) with blurred edges.
savagemike - Wednesday, April 23, 2014 - link
So we'll have a really clear high res look at crappy compressed information. I just hope it has a bypass with zero compression if using a display below a given factor. Then that factor will be my upward bounding limit for displays I buy.
watersb - Wednesday, April 23, 2014 - link
Are gamers really all that different from audiophiles? I hope so.
There are obvious things that can be done to result in dramatic compression of the frame buffer. You don't need 32 bits of color information per pixel on a line. Worst case, each pixel is a different color. That's 8192 colors on a line. 13 bits...
(Reminds me of Amiga HAM mode.)
Is there anything at all in the spec that addresses panel self-refresh?
NikAwesome - Saturday, April 26, 2014 - link
Excuse me, and how many bits do you employ to code the dictionary for addressing 8192 colors? Bear in mind that each line can have a different dictionary. So, you are missing lots of bits in your math.
Hmm... Yes, you are right, I need a look-up-table for each line. But I won't need 32 bits for each pixel on the line. I still think I can get at least one order of magnitude reduction of data (factor of 10) with a per-line LUT... I will try it. If I'm wrong, I will post a reply!
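For the curious, here is the back-and-forth above put into rough numbers for a single 7680-pixel line with 24-bit colour (this is just the commenters' idea quantified, nothing to do with the actual DSC algorithm):

```python
import math

line_pixels = 7680       # pixels per line on an 8K panel
bits_per_pixel = 24      # raw colour depth

raw_bits = line_pixels * bits_per_pixel

# Worst case: every pixel on the line has a unique colour.
index_bits = math.ceil(math.log2(line_pixels))                   # 13-bit palette index
worst = line_pixels * bits_per_pixel + line_pixels * index_bits  # LUT + indices
print(f"raw line:        {raw_bits:7d} bits")
print(f"worst-case LUT:  {worst:7d} bits ({worst / raw_bits:.2f}x raw -> expansion)")

# Friendlier case: only 256 distinct colours appear on the line.
friendly = 256 * bits_per_pixel + line_pixels * 8
print(f"256-colour line: {friendly:7d} bits ({friendly / raw_bits:.2f}x raw)")
```

In the worst case the per-line dictionary pushes the line past its raw size, which is the point of the reply above; typical lines with few distinct colours do compress well.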
Communism - Thursday, April 24, 2014 - link
Sounds like the 1440p @ 120hz ASUS ROG Swift will be the last monitor upgrade I will get before the world devolves into full retard mode.
androticus - Friday, April 25, 2014 - link
Don't use "the" with acronyms, it is clunky and makes the text difficult to read. If the reader is going to just pronounce the acronym and not try to unpack it into its components, then "the" is incorrect.
iAPX - Tuesday, April 29, 2014 - link
DSC uses DPCM, and it's hilarious to use this kind of "technology" for high-end monitors. Ghosting, vertical lines that will change depending on the pixels preceding them on each line... it just makes no sense on 8K displays for professionals!
And in the mobile world, it is nonsense too to put full HD displays on a smartphone or maybe this year Ultra HD/4K displays on a tablet. Drop the DSC and stay with non-marketing screen resolutions, this is the way to go!
All of this just to please marketing departments, it's just weird!
HisDivineOrder - Wednesday, April 30, 2014 - link
Wouldn't it be easier and better just to do the obvious and use a new spec to stop transmitting all that "make the same screen" over and over the cord instead?
Just tell the monitor to keep displaying the same image until told otherwise. Boom, tons of savings right there. Imagine the savings on a tablet or a PC. Instead, what we have is a whole series of products built around VGA that continue to send data continuously for the entire screen when it'd be far better just to send only the part of the screen that changed.
Compression's not needed for that. Better monitors (or converters for existing monitors) are needed for that.
G-sync/Freesync shows us the way. But that'd involve no big news, right? So instead, VESA makes a mountain out of a molehill and makes far more of it than it should, pushing into lossy codecs instead of doing the obvious things first.
pr1mal0ne - Wednesday, April 30, 2014 - link
As a FLAC snob, I am sad I will now have to be a mathematically lossless video snob as well, only here, it looks like I will not get the option. This is understandable for mobile and such, but on a desktop in my home where power and number of cords is not a limitation, I will always want the option of lossless. I hope VESA understands this.
sonicmerlin - Saturday, May 3, 2014 - link
I'm curious how much compression they could achieve with a mathematically lossless codec?
simonpschmitt - Friday, May 9, 2014 - link
@HisDivineOrder: What you are proposing is essentially a rudimentary, lossless compression. (and it would be interframe so: lag, RAM, ... see previous posts.)
@sonicmerlin: The problem with lossless is that it has to be adaptive, which means it could probably get really good compression rates most of the time. But every lossless compression has "perfect storm" data patterns where it can get no compression at all. Or even worse, it adds data.
When you are bandwidth-constrained you have to guarantee a compression factor to always stay inside the limit. You can't do that with a "perfect storm" pattern, so you can't do that with lossless.
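The "perfect storm" point is easy to demonstrate with any general-purpose lossless compressor; zlib below stands in for an arbitrary lossless codec (it is not what DSC uses):

```python
import os
import zlib

samples = {
    "all-black frame chunk": bytes(1_000_000),
    "noise-like chunk":      os.urandom(1_000_000),
}
for name, data in samples.items():
    out = zlib.compress(data, 9)
    print(f"{name:>22}: {len(data):,} -> {len(out):,} bytes "
          f"({len(out) / len(data):.4f}x)")
# The noise-like input stays at (or slightly above) its original size, which is
# why a fixed-rate link cannot rely on lossless compression alone.
```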
I just hope that for graphical work you can choose a lower refresh rate (e.g. 30 frames/sec) without compression.