The Age of Ultra HD
Ultra HD/4K televisions have been on sale since 2012, yet it took roughly four years before the first UHD Blu-ray player reached the market. In the meantime, streaming devices such as Roku and the Amazon Fire TV Stick added 4K models, enabling streamed 4K content, most commonly found on Netflix and YouTube. More and more UHD content has become available as cable and satellite providers have increased their offerings, and broadcasters are on the verge of transmitting Ultra HD via a new standard called ATSC 3.0. So we are finally seeing the promise of Ultra HD realized, which is great news.
Now that Ultra HD content is finally reaching its potential, the questions turn to features and bandwidth, and what will be needed to ensure it all works. How will developments such as HDMI 2.1, HDCP 2.2, High Dynamic Range, wide color gamut, high frame rate and even 8K change things, and what does the future hold for HDMI and HDBaseT?
We live in the connected age, but connectivity takes many forms. What works now may not work next year, and escalating formats such as Ultra HD bring rapidly increasing data loads. It is crucial for AV integrators to stay informed and prepared in order to ride the crest of this coming Ultra HD wave.
Ultra HD Evolution
Early Ultra HD was a little confusing, as it arrived without source devices or content, and there was even some uncertainty about what to call it. The naming convention has settled on Ultra HD, although some still refer to it as 4K. In this article we will stick with UHD, which has a native resolution of 3840 x 2160 pixels. Most manufacturers had mastered the pixel count of UHD displays by 2012, so the last half decade has been spent improving the quality of those pixels, be it their color, refresh rate or dynamic range.
These improvements combine in pursuit of "immersion", a more realistic visual experience than was previously possible. This is consistent with, and complemented by, the immersive 'object-based' audio formats such as Dolby Atmos, DTS:X and Auro-3D. On the video side there is a new set of acronyms: WCG (wide color gamut), based on the Rec. 2020 standard; HFR (high frame rate); and HDR (high dynamic range), augmented by terms such as HEVC (high efficiency video coding), all protected by HDCP 2.2, and displayed with quantum dots, nanocrystals and even gigantic microLED screens.
The key to installing UHD displays is a proper understanding of the new features and technologies, how they co-exist, and what effect (if any) each will have on the data payload. Understanding what will be going down the proverbial pipe will enable you to know the size, type and compatibility of pipe to install from end to end in the AV system.
See the Kordz HDMI Cable Guide for specifics.
Understanding the technical specifications of HDMI 2.1 is paramount. The areas to which systems integrators must pay attention are:
Resolution – The number of pixels in an image at one point in time. Resolutions supported by HDMI 2.1 include 1920 x 1080 (FHD), 3840 x 2160 (4K UHD), 5120 x 2160 (5K Wide UHD), 7680 x 4320 (8K UHD) and 10240 x 4320 (10K Wide UHD).
Frame Rate – The number of times an image refreshes per second, measured in FPS (frames per second). HDMI 2.1 enables refresh rates of up to 120 FPS.
Color Depth (bits) – In digital television, simply put, pictures are built from the three primary colors: red, green and blue. Each color can reproduce a certain number of shades, determined by its bit depth. Supported bit depths are 8, 10, 12, 14 and 16 bits per color, which equates to the trillions of colors HDMI 2.1-capable displays can reproduce.
Data Rate – Since HDMI's inception in 2002, each major iteration has increased the amount of data the HDMI cable must be able to pass, with the maximum data rate going from 4.95 Gbps to 10.2 Gbps, to 18 Gbps, and now to the 48 Gbps requirement introduced with HDMI 2.1.
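To see how resolution, frame rate and color depth combine into those data rates, the back-of-the-envelope arithmetic can be sketched as below. This is a simplified estimate of the active-pixel payload only; a real HDMI link also carries blanking intervals, audio and link-level encoding overhead, so actual requirements run somewhat higher than these figures.

```python
def colors(bits_per_channel):
    """Total displayable colors for an RGB signal at a given per-channel bit depth."""
    return (2 ** bits_per_channel) ** 3

def data_rate_gbps(h, v, fps, bits_per_channel, channels=3):
    """Active-pixel video data rate in Gbps (ignores blanking and link overhead)."""
    return h * v * fps * bits_per_channel * channels / 1e9

# 10-bit color already yields over a billion colors; 14-bit reaches trillions.
print(f"{colors(10):,}")   # 1,073,741,824
print(f"{colors(14):,}")   # 4,398,046,511,104

# 4K at 60 FPS with 10-bit color sits under HDMI 2.0's 18 Gbps ceiling...
print(round(data_rate_gbps(3840, 2160, 60, 10), 1))  # 14.9
# ...but 8K at 60 FPS with 10-bit color exceeds even 48 Gbps, which is why
# HDMI 2.1 also adds Display Stream Compression (DSC) for the highest formats.
print(round(data_rate_gbps(7680, 4320, 60, 10), 1))  # 59.7
```

The takeaway for integrators: each step up in resolution, frame rate or bit depth multiplies the payload, so the cable and distribution infrastructure must be specified for the most demanding format the system will ever carry, not the one in use today.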