This week the Alliance for Open Media (AOMedia) published the first complete version of the bitstream and decoding process specification for its royalty-free AV1 video codec, a release that has been eagerly awaited across the tech industry. The availability of the AV1 1.0 spec lets AOMedia's backers add support for the technology to their products and services, including taking the all-important step of finalizing designs for the low-power hardware decoders critical to driving the codec's adoption. At least initially, AV1 will be used primarily for streaming video and user-generated content as an alternative to HEVC and its ongoing royalty disputes, but adoption may eventually expand to other applications.

The AV1 open-source video codec was developed with 4K+ ultra-high-definition resolutions, HDR, and wide color gamuts in mind. Among the key features of the new codec, AOMedia cites roughly 30% more efficient compression than existing methods, predictable computational requirements for hardware, and maximum flexibility and scalability. AV1's backers want the codec to be ubiquitous across devices and platforms, and therefore expect it to be supported not only by major chipmakers, software developers, and service providers, but also by leading makers of consumer electronics.

AOMedia does not summarize the key technological aspects of the AV1 codec in a short whitepaper, and parsing the roughly 600-page bitstream and decoding spec aimed at developers does not necessarily explain the technology in general terms. Therefore, I am going to keep the technical details about AV1 to a necessary minimum here.

At a high level, AV1 is conceptually similar to existing codecs such as H.264 or H.265. It uses the same basic elements that various codecs have relied on for well over a decade: block-based coding, variable block sizes (up to 128×128 pixels), block motion compensation, intra-frame compression, a forward integer transform, and so on. At the same time, since we are talking about compression that is more efficient than existing methods, AV1 naturally has a number of advantages over contemporary codecs.
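To make the block-based idea concrete, below is a minimal Python/NumPy sketch of generic motion-compensated block coding. It is illustrative only: the 16×16 block size, search range, and exhaustive SAD search are my simplifications, and it does not reflect AV1's actual partitioning, prediction modes, transforms, or entropy coding.

```python
# Minimal sketch of generic block-based, motion-compensated coding.
# NOT AV1's actual prediction, transform, or entropy-coding machinery.
import numpy as np

BLOCK = 16  # illustrative block size; AV1 blocks range from 4x4 up to 128x128

def encode_block(cur_block, ref_frame, y, x, search=8):
    """Find the best-matching block in the reference frame (motion estimation),
    then return the motion vector and the residual that a real codec would
    transform, quantize, and entropy-code."""
    h, w = ref_frame.shape
    best_mv, best_sad = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ry, rx = y + dy, x + dx
            if 0 <= ry <= h - BLOCK and 0 <= rx <= w - BLOCK:
                cand = ref_frame[ry:ry + BLOCK, rx:rx + BLOCK]
                sad = np.abs(cur_block.astype(int) - cand.astype(int)).sum()
                if sad < best_sad:
                    best_sad, best_mv = sad, (dy, dx)
    dy, dx = best_mv
    pred = ref_frame[y + dy:y + dy + BLOCK, x + dx:x + dx + BLOCK]
    residual = cur_block.astype(int) - pred.astype(int)
    return best_mv, residual

# Toy usage: one 16x16 block from a "current" frame that is a shifted copy of the reference.
ref = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))
mv, res = encode_block(cur[16:32, 16:32], ref, 16, 16)
print("motion vector:", mv, "residual energy:", int((res ** 2).sum()))
```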

AV1 performs internal processing at 8, 10, or 12 bits per sample. It also supports all three widespread types of chroma subsampling (4:2:0, 4:2:2, 4:4:4) and virtually all major color gamuts and formats (sRGB, BT.2020 in both 10-bit and 12-bit, BT.2100, etc.). The BT.2020 and BT.2100 recommendations cover not only 3840×2160 but also 7680×4320 (8K) resolution, so AV1 is technically ready for next-gen monitors and TVs.
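As a back-of-the-envelope illustration of what those sample precisions and chroma formats mean for raw data rates, here is a short Python sketch (simple arithmetic, not anything taken from the spec):

```python
# Uncompressed frame size for the bit depths and chroma subsampling schemes AV1 supports.
def raw_frame_mbytes(width, height, bit_depth, subsampling):
    # chroma samples per luma sample: two quarter-size planes (4:2:0),
    # two half-size planes (4:2:2), or two full-size planes (4:4:4)
    chroma_per_luma = {"4:2:0": 0.5, "4:2:2": 1.0, "4:4:4": 2.0}[subsampling]
    samples = width * height * (1 + chroma_per_luma)
    return samples * bit_depth / 8 / 1e6

for depth in (8, 10, 12):
    for sub in ("4:2:0", "4:2:2", "4:4:4"):
        print(f"3840x2160 {depth}-bit {sub}: "
              f"{raw_frame_mbytes(3840, 2160, depth, sub):.1f} MB/frame")
```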

AV1 Profiles

seq_profile | Bit Depth | sRGB Gamut Support | Chroma Subsampling
0           | 8 or 10   | No                 | YUV 4:2:0
1           | 8 or 10   | Yes                | YUV 4:4:4
2           | 8 or 10   | No                 | YUV 4:2:2
2           | 12        | Yes                | YUV 4:2:0, YUV 4:2:2, YUV 4:4:4
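For illustration, the profile table above can be expressed as a small lookup. The allowed_subsampling helper below is hypothetical (it is not part of any AV1 API) and encodes only what the table shows, not every constraint in the full specification:

```python
# Hypothetical helper mirroring the profile table above.
def allowed_subsampling(seq_profile, bit_depth):
    if seq_profile == 0 and bit_depth in (8, 10):
        return ["YUV 4:2:0"]
    if seq_profile == 1 and bit_depth in (8, 10):
        return ["YUV 4:4:4"]
    if seq_profile == 2 and bit_depth in (8, 10):
        return ["YUV 4:2:2"]
    if seq_profile == 2 and bit_depth == 12:
        return ["YUV 4:2:0", "YUV 4:2:2", "YUV 4:4:4"]
    raise ValueError("combination not listed in the table")

print(allowed_subsampling(2, 12))  # ['YUV 4:2:0', 'YUV 4:2:2', 'YUV 4:4:4']
```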

Speaking of displays, it is worth noting that AV1 was designed to be compatible with existing interconnects such as DisplayPort, eDP, and HDMI. The technology should also be compatible with contemporary content protection schemes.

The publication of the AV1 1.0 spec is merely the first step towards market adoption of the technology. AOMedia expects content creation tools and desktop browsers to begin rolling out support for AV1 later this year; to that end, AOMedia has released an unoptimized, experimental AV1 software encoder and decoder for use in software applications. Then, sometime in 2019, the consortium anticipates that select chips and programs will support the tech. More widespread hardware support, along with broader software adoption, is projected for 2020.
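As an aside, one practical way to experiment with AV1 encoding today is through FFmpeg's libaom-av1 wrapper around that reference encoder, rather than through AOMedia's own tools directly. The sketch below assumes an FFmpeg build compiled with libaom support; the input and output file names are placeholders, and encodes with the unoptimized encoder will be very slow:

```python
# Illustrative AV1 encode via FFmpeg's libaom-av1 wrapper (assumes FFmpeg was
# built with libaom support; file names are placeholders).
import subprocess

cmd = [
    "ffmpeg",
    "-i", "input.mp4",              # placeholder source clip
    "-strict", "experimental",      # needed on older builds where the AV1 encoder is experimental
    "-c:v", "libaom-av1",           # AV1 reference encoder exposed through FFmpeg
    "-crf", "30", "-b:v", "0",      # constant-quality mode
    "-cpu-used", "4",               # speed/quality trade-off: higher is faster
    "output.mkv",
]
subprocess.run(cmd, check=True)     # expect very long encode times with the unoptimized encoder
```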

Speaking of adoption, the list of AOMedia members includes a variety of influential companies, including Apple, Amazon, AMD, Arm, Broadcom, Facebook, Google, Hulu, Intel, IBM, Microsoft, Netflix, NVIDIA, Realtek, Sigma, and many others. These companies either control huge ecosystems themselves or develop chips used by hundreds of millions of customers worldwide, so their support should ensure widespread adoption of AV1 over the next decade. In the meantime, AOMedia has already started R&D on AV2, the codec set to succeed AV1.


Source: AOMedia

Comments

  • tuxRoller - Saturday, March 31, 2018 - link

    Other than what others have mentioned, I'm not sure what you're saying.
    Is your main concern that the org isn't interested in ITU status, and thus OTT adoption is uncertain?
  • GreenReaper - Sunday, April 1, 2018 - link

    They managed to get VP9 into PC hardware - I'm sure they can manage AV1 more generally.
  • iwod - Friday, March 30, 2018 - link

    Maybe I will have to go through Twitter to have this corrected, but AV1 1.0 has not been released yet. Even if you click on the spec PDF link, it still says it is a draft document. Apparently the PR and marketing team went ahead without even asking the engineering team. They wanted to report this about 10 days before the NAB Show.
  • tuxRoller - Saturday, March 31, 2018 - link

    Hey! I recognize you from d9:)
  • Lolimaster - Saturday, March 31, 2018 - link

    We are at a time where future video cameras (from smartphones and drones to pro cameras) should have a certain minimal quality certification for day/dark shooting plus HDR and 10-12 bit color.

    Consumers win by having actual quality video delivered to their new OLED TVs.
  • ಬುಲ್ವಿಂಕಲ್ ಜೆ ಮೂಸ್ - Saturday, March 31, 2018 - link

    What I really want is NVidia support to compress and decompress to whatever quality the video card can handle for user generated content

    Blu-ray DRM can be recognized to block commercial content and yet allow me to record whatever is on my screen

    Recording 4-8K with ZERO compression should be allowed if the hardware can handle it

    Max compression for 4K content should be set to whatever the user wants

    For example 100% usable compression in software might actually be set to 64% actual compression in hardware and be displayed as 100/64 compression

    Setting the max usable compression would prevent me from going too far in a video where it's not initially noticeable and then ruining the remainder of the video where it is noticeable

    I'd like to generate my own content without the DRM Nazis ruining my day

    I'd like to EASILY record whatever I want at any usable resolution and compression level the hardware can handle

    4-8K camcorders should output directly to a Videocard input (yes, I said INPUT) without any compression whatsoever and let the videocard do all of the heavy lifting

    whether recording desktop output or camera input, WE NEED MORE OPTIONS!
  • Santoval - Sunday, April 1, 2018 - link

    "Recording 4-8K with ZERO compression should be allowed if the hardware can handle it"
    Among your many wild dreams/demands, that one caught my eye the most. Although much or most consumer (and *all* prosumer) video recording hardware is capable of zero compression, that will never happen due to a thing called "market differentiation". In plain words, that means the camera manufacturers love the very fat profit margins they get from professional video cameras, so they would never risk them by moving zero compression down the food chain.

    It looks like an unwritten rule that is somehow obeyed even by non-professional camera companies. Video/movie professionals require uncompressed video because they can manipulate it, color grade it, edit it, insert special effects into it, etc., much more gracefully. In recent years uncompressed video started being introduced to mid-range prosumer DSLRs (which record video but are not video focused), but that is the current lowest limit I am aware of.
  • ಬುಲ್ವಿಂಕಲ್ ಜೆ ಮೂಸ್ - Sunday, April 1, 2018 - link


    Using the calc at https://www.extron.com/product/videotools.aspx:

    4K @ 60fps × 16-bit 4:4:4 = 4.45 GB/s
    8K @ 60fps × 16-bit 4:4:4 = 17.82 GB/s

    Can you show me any consumer video camera outputting raw video at more than 4 gigabytes per second?
  • ZeDestructor - Tuesday, April 3, 2018 - link

    That calculator's math is not correct for videocamera raw footage.

    1. That calculator assumes 3 subpixels per pixel. In almost every camera on the market, from a basic potato you'd have to pay people to take all the way to the $70k RED Monstro, you have a Bayer-pattern sensor with 1 subpixel per pixel, unlike monitors. This significantly reduces the amount of data you actually need to record when shooting raw.

    2. It uses math that produces the bandwidth needed to drive a monitor at 4K with worse-than-CVT timings, and that adds a ton (around 20-24%) of overhead that you don't have in a camera raw. In a camera raw, you have the raw frame, some metadata about exposure, lenses, shutter speed, location, etc., and timecodes. Besides, anything past 2560x1600 uses CVT-R2 timings, which have much lower overhead (less than 5% for 4K and up).

    Here are the actual numbers for raw 4K/8K capture, frames only, so add some overhead for metadata (think <10 Mbit/s, since you don't need blanking intervals here):

    3840 * 2160 * 60fps * 16bit * 1 subpixel @ RGB/4:4:4 chroma: 7.96Gbit/s = ~1GB/s
    7680 * 4320 * 60fps * 16bit * 1 subpixel @ RGB/4:4:4 chroma: 31.85Gbit/s = ~4GB/s

    This all fits just fine using a few SDI cables (remember, those only go up to 12Gbit/s and that's what they use in cinema, TV and live broadcast) to get raw footage off the camera and into some external capture device.

    You can also do that over HDMI or DP just fine (after de-Bayering). In fact, this is done right now: the higher-end DSLRs and video cameras all have HDMI output, so you can just feed the live signal into an ordinary HDMI capture card and cap it raw that way. But you'll only see that on higher-end kit, because it's expensive to have fast IO in a space as small and oddly-shaped as a camera.
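
    For reference, those figures can be reproduced with a few lines of Python, using the same 1-subpixel Bayer assumption described above and ignoring metadata overhead (a rough sanity check, not camera-specific math):

    ```python
    # Rough check of the raw-capture figures above: sensor raw (1 subpixel per
    # pixel, Bayer) versus display-style math (3 subpixels per pixel), with no
    # blanking or metadata overhead included.
    def raw_gbit_per_s(width, height, fps=60, bits=16, subpixels=1):
        return width * height * fps * bits * subpixels / 1e9

    for name, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
        bayer = raw_gbit_per_s(w, h, subpixels=1)    # what the sensor actually produces
        display = raw_gbit_per_s(w, h, subpixels=3)  # what a monitor-oriented calculator assumes
        print(f"{name}: sensor raw {bayer:.2f} Gbit/s (~{bayer / 8:.1f} GB/s), "
              f"display-style {display:.2f} Gbit/s")
    ```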
  • bill44 - Saturday, March 31, 2018 - link

    Dynamic Metadata support? HFR supported?
