WTF is HDR (High Dynamic Range) for Video, TVs and Projectors?


Over the years, TVs, video formats and broadcast standards have progressed from black and white to color, from analog to digital, and from low-resolution to highly detailed images. The transition from analog television to digital HDTV (High Definition Television) represented a huge improvement, especially with full HD 1080p content. We went from roughly 480 visible scan lines to over 1,000. The difference was immediately apparent.

Now there are TVs, video content, projectors and live broadcasts of "Ultra HD TV" that can include 4K or 8K resolution images. A 4K Ultra HD image contains four times as much detail as a 1080p HD image, with more than 8,000,000 pixels (picture elements). An 8K image contains more than 33 million pixels. Yet this improvement in resolution, on its own, is not as dramatic as the transition from analog TV to HDTV was. On an average-size TV screen, from an average viewing distance, the difference between 1080p HD and 2160p 4K is actually quite subtle. But while pixels began to proliferate, another development emerged that would help bring television and display performance to the next level: HDR.
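As a quick sanity check on those pixel counts, here is the arithmetic behind them, using the standard consumer resolutions:

```python
# Pixel counts for the common consumer video resolutions.
resolutions = {
    "1080p Full HD": (1920, 1080),
    "4K Ultra HD": (3840, 2160),
    "8K Ultra HD": (7680, 4320),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels")

# Prints:
#   1080p Full HD: 1920 x 1080 = 2,073,600 pixels
#   4K Ultra HD: 3840 x 2160 = 8,294,400 pixels    <- 4x 1080p
#   8K Ultra HD: 7680 x 4320 = 33,177,600 pixels   <- 16x 1080p
```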

Not Just More Pixels: Better Pixels

HDR (High Dynamic Range) refers to the ability to capture and reproduce an image with a wide range between the deepest blacks and the brightest highlights. The "capture" part happens in film and video cameras and in the post-production houses where TV programs and films are made. The "reproduction" part happens on a display device: a projector, monitor or television. For HDR to work, it must be present in both the content and the display. It's a package deal.

Along with the increase in dynamic range, HDR content and displays also usually (but not always) include something called WCG (Wide Color Gamut). This is an increase in the number of distinct colors a display can reproduce. What HDR does for brightness, WCG does for color. With Wide Color Gamut support, displays are better able to reproduce skin tones and other colors for a more realistic picture. By combining a greater range of brightness with a greater range of colors, display devices come closer to reproducing what we see in nature.
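To make "more colors" concrete: a display's gamut is usually drawn as a triangle on a chromaticity diagram, with one corner for each primary color. The sketch below compares the HD-era Rec. 709 primaries with the Rec. 2020 primaries specified for Ultra HD wide-gamut content, using the (x, y) coordinates published in the respective ITU-R recommendations:

```python
# (x, y) chromaticity coordinates of the red, green and blue primaries,
# from ITU-R BT.709 (HD/SDR) and ITU-R BT.2020 (Ultra HD / WCG).
REC_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(primaries):
    """Area of the gamut triangle, via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = triangle_area(REC_2020) / triangle_area(REC_709)
print(f"Rec. 2020 covers {ratio:.2f}x the chart area of Rec. 709")
# Prints: Rec. 2020 covers 1.89x the chart area of Rec. 709
```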

When HDR (High Dynamic Range) and WCG (Wide Color Gamut) are combined, viewers see a vivid, realistic image on their TV and projector screens. Image courtesy of projectorscreen.com.

To get an idea of what HDR is, look at the clouds on a bright day. You'll notice the shifts in color and light within the clouds, and the many shades of white and gray that blend together seamlessly. But look at that same sky on your TV screen while watching a DVD and you'll see just a few flat patches of gray. There aren't enough gradations of "white" available for the television to show the variety. Details in the bright parts of an image are often called "highlights."

It's just as easy to see the limits of dynamic range in dark scenes. Non-HDR TVs (and content) have trouble reproducing detail in shadows. This is often referred to as "shadow detail." Once the screen gets dark enough, the details in that darkness get crushed into black. But HDR isn't limited to the darkest and brightest parts of the picture. With HDR, tropical fish look spectacular and crowded bazaars come to life, with a rich variety of color for patrons and products alike.

HDR allows subtle details in nature to be enhanced for home viewing. Image courtesy of ViewSonic.

What came before HDR is now referred to as SDR (Standard Dynamic Range). SDR content, such as older TV shows, DVDs and Blu-ray Discs, was designed for older display technologies like CRT tube televisions, and was limited to about 100 nits. A "nit" is a standard measure of brightness, equal to one candela (roughly the light of one candle) per square meter. Today's LED/LCD and OLED TVs can reach peak brightness of 1,000 nits, 1,500 nits, sometimes even more. But all those nits are useless if there is no content to take advantage of them. So HDR began making its way into new cameras and mastering equipment, and new standards were developed so that content creators and display makers could speak the same language.
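That shared language for brightness in HDR10 is the PQ (Perceptual Quantizer) curve standardized as SMPTE ST 2084. Unlike SDR gamma, it maps each code value in the signal to an absolute luminance in nits, with a ceiling of 10,000 nits. A minimal sketch of the published formula:

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: normalized code value [0, 1] -> nits.

    This is how an HDR10 display turns a signal level into an
    absolute brightness, topping out at 10,000 nits.
    """
    m1 = 2610 / 16384        # ~0.1593
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875

    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(f"{pq_eotf(0.5):.1f} nits")  # ~92.2 (half signal is nowhere near half brightness)
print(f"{pq_eotf(1.0):.1f} nits")  # 10000.0
```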

Today HDR content is available on Ultra HD Blu-ray Discs and on many popular streaming services, including Netflix, Amazon Prime Video, HBO Max, Apple TV+, Disney+ and others. Also, almost all TVs labeled "4K" or "Ultra HD" include some form of HDR on board. But just because a TV includes HDR doesn't mean it will look amazing. There are different flavors of HDR, some better than others. And no display device is perfect; some "HDR TVs" perform much worse than others.

Sony’s BRAVIA XR MASTER Series A95K QD-OLED TV includes HDR10, HLG and Dolby Vision HDR for superior performance when viewing 4K Ultra HD content. It received our top rating for 4K TVs in 2022.

Static vs. Dynamic HDR

There are two basic types of HDR available: static HDR and dynamic HDR. Static HDR uses a single fixed range of brightness values for an entire TV show or movie. Examples of static HDR are HDR10 and HLG (Hybrid Log Gamma). HDR10 is the most common form of HDR. HLG was developed for live broadcasting and is not yet widely used. Dynamic HDR adjusts the range of brightness values on the fly (usually scene by scene), based on the specific content on the screen at any given time. Examples of dynamic HDR include Dolby Vision and HDR10+. Dolby Vision is widely used today on Ultra HD Blu-ray Discs and streaming services.
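The practical difference shows up in the metadata that travels with the video. HDR10 carries one static block for the whole program (mastering-display limits per SMPTE ST 2086, plus MaxCLL and MaxFALL light levels), while HDR10+ and Dolby Vision add per-scene values. The sketch below is a simplified illustration of those two shapes; the field names are descriptive stand-ins, not the actual bitstream syntax:

```python
from dataclasses import dataclass

@dataclass
class StaticHDRMetadata:
    """One set of values for an entire movie (HDR10-style)."""
    mastering_peak_nits: float   # brightest the mastering display could go
    mastering_min_nits: float    # its black level
    max_cll: int                 # Maximum Content Light Level (brightest pixel)
    max_fall: int                # Maximum Frame-Average Light Level

@dataclass
class SceneHDRMetadata:
    """Per-scene values (HDR10+ / Dolby Vision-style dynamic metadata)."""
    start_frame: int
    scene_peak_nits: float       # brightest highlight in *this* scene
    scene_average_nits: float    # average light level of *this* scene

# Static: the display must plan for the whole film's worst case at once.
movie = StaticHDRMetadata(1000.0, 0.0001, max_cll=950, max_fall=180)

# Dynamic: a dark alley can be tone-mapped differently from a sunny beach.
scenes = [
    SceneHDRMetadata(start_frame=0, scene_peak_nits=950.0, scene_average_nits=400.0),
    SceneHDRMetadata(start_frame=14400, scene_peak_nits=120.0, scene_average_nits=8.0),
]
```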

For a "perfect" display capable of true blacks and a maximum brightness of 1.6 billion nits (the luminance of direct sunlight), static HDR would be perfectly fine. But, again, no display is perfect. And on these imperfect displays, dynamic HDR can produce a more, well… "dynamic" picture.

Take, for example, a movie with a bright scene of, say, a couple on the beach. In that situation, the content creators need to preserve detail and gradation across a wide range of highlight levels, so the viewer can make out details in sunlit faces without the highlights blowing out (the highlights, not the faces). But later in the film, that couple may be walking down a dark alley, and the director wants the audience to see the eyes of the beast lurking in the shadows. Dynamic HDR metadata allows filmmakers and mastering engineers to assign the nits (and the bits) where they need to go to get the most out of every scene.


Dolby Vision HDR technology is also used in video games to create a dynamic and realistic gaming experience. Image courtesy of Dolby.

Another aspect of HDR on a display device that can make or break its handling of HDR content is called "tone mapping." A piece of content may be mastered with a peak brightness of 2,000 nits, but an OLED TV might top out at 850 nits, and an LED/LCD set might reach its limit at 1,200 nits. To create the best, most accurate representation of the content, each TV must "map" the brightness and color values it sees in the content onto its actual capabilities. If it doesn't, a lot of picture detail can be lost: on that OLED TV, everything in the content from 850 nits up to 2,000 would be "clipped" to a single shade of white. This is bad. A good HDR tone map compresses content that exceeds the capabilities of a TV or projector, preserving detail in bright highlights, in dark shadows, and everywhere in between.
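To make that concrete, here is a toy soft-knee tone-mapping curve (an illustrative sketch, not any manufacturer's actual algorithm): brightness passes through unchanged up to a knee point, and everything between the knee and the content's peak is smoothly compressed into the display's remaining headroom instead of being clipped.

```python
def tone_map(nits_in: float, content_peak: float, display_peak: float,
             knee: float = 0.75) -> float:
    """Map content luminance (nits) onto a less-capable display.

    Below `knee * display_peak`, brightness passes through 1:1. Above it,
    the range up to `content_peak` is eased into the display's remaining
    headroom, preserving highlight gradation instead of clipping to white.
    """
    knee_nits = knee * display_peak
    if nits_in <= knee_nits or content_peak <= display_peak:
        return min(nits_in, display_peak)
    # Fraction of the way from the knee to the content's peak (0..1)...
    t = (nits_in - knee_nits) / (content_peak - knee_nits)
    # ...eased so the curve rolls off gently toward the display's peak.
    eased = 1 - (1 - t) ** 2
    return knee_nits + eased * (display_peak - knee_nits)

# 2,000-nit content on an 850-nit OLED: highlights keep their gradation.
for nits in (500, 850, 1200, 2000):
    print(f"{nits:>5} nits in -> {tone_map(nits, 2000, 850):6.1f} nits out")
```

Run against 2,000-nit content on an 850-nit display, values below the knee pass through untouched, while 850-, 1,200- and 2,000-nit highlights land at distinct output levels rather than collapsing into one shade of white.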

There are a number of displays, both flat-panel TVs and projectors, that perform very well with HDR content. Here are our recommendations for the best HDR displays available right now:

