What element technologies go into next-generation video?

Separately, the movie industry has been quick to adopt similar advances, and video in fields such as medicine, security, and education now incorporates CG (Computer Graphics), VR (Virtual Reality), AR (Augmented Reality), MR (Mixed Reality), and other innovations. Another characteristic of the current move to next-generation video in these fields is that, unlike the earlier migration from SD to HD, the standards have been updated in four individual areas: (1) resolution, (2) frame rate, (3) color gamut, and (4) dynamic range, as discussed below. It is easy to describe ideal video as "more natural and lifelike," but in practice this depends on striking a balance among all of these elements.

Let's see how video has evolved through these updated standards.

(1) Higher Resolution

UHD doubles the horizontal and vertical resolution of full HD, offering 3840 x 2160 pixels instead of 1920 x 1080, while 8K quadruples it, at 7680 x 4320. (Full 4K resolution according to the DCI specification is 4096 x 2160.) At these higher resolutions, video can be displayed with smooth gradation on larger screens without visible pixelation.
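
Since each doubling of horizontal and vertical resolution quadruples the total pixel count, the differences add up quickly. The short Python sketch below (my own illustration, not from the source) multiplies out the standard resolutions for comparison.

```python
# Compare total pixel counts of common video resolutions.
resolutions = {
    "Full HD": (1920, 1080),
    "UHD": (3840, 2160),
    "DCI 4K": (4096, 2160),
    "8K": (7680, 4320),
}

full_hd_pixels = 1920 * 1080

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name:7s}: {width} x {height} = {pixels:10,d} pixels "
          f"({pixels / full_hd_pixels:4.1f}x full HD)")
```
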
Minimum viewing distance is defined as the closest distance at which viewers with 20/20 vision can watch a screen without discerning individual pixels; it represents the ideal distance for watching TV. For full HD content, this distance is estimated to be about three times the screen height, and at this distance the viewing angle across the screen width is about 33°. Because UHD doubles this resolution, its minimum viewing distance shrinks to 1.5 times the screen height, yielding a wider viewing angle of about 60°. With the resolution of full HD quadrupled for 8K, the minimum viewing distance falls to 0.75 times the screen height, and the viewing angle across the screen width broadens to roughly 100°. This means that at the ideal viewing distance for 8K, the screen can fill the viewer's field of vision while pixels still cannot be discerned.
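
The viewing angles quoted above follow from simple trigonometry. The Python sketch below (my own illustration, assuming a 16:9 screen) computes the horizontal viewing angle for a screen watched from a given multiple of its height.

```python
import math

ASPECT = 16 / 9  # assumed 16:9 screen

def viewing_angle(distance_in_screen_heights: float) -> float:
    """Horizontal viewing angle (degrees) for a 16:9 screen
    watched from the given multiple of the screen height."""
    half_width = ASPECT / 2  # half the screen width, in units of screen height
    half_angle = math.atan(half_width / distance_in_screen_heights)
    return math.degrees(2 * half_angle)

for label, distance in [("Full HD", 3.0), ("UHD", 1.5), ("8K", 0.75)]:
    print(f"{label:7s}: {distance:4.2f}H -> {viewing_angle(distance):5.1f} deg")
# Prints roughly 33, 61, and 100 degrees respectively.
```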

(2) Higher Frame Rate

Next-generation broadcasting has abandoned interlaced video in favor of progressive scanning, at frame rates of up to 120 fps. Progressive scanning eliminates interlacing artifacts and makes each frame look clearer and sharper, while higher frame rates reduce blurring in scenes with fast movement and make motion appear more fluid overall. In this way, the new standards yield benefits not only in higher resolution but also in smoother movement.
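
As a rough illustration (my own hypothetical example, not from the source), the Python sketch below computes how far a fast-moving object travels during a single frame at different frame rates; the smaller per-frame displacement at 120 fps is what makes motion look less blurred and more fluid.

```python
# Hypothetical scenario: an object crosses a 3840-pixel-wide screen in one second.
SCREEN_WIDTH_PX = 3840
CROSSING_TIME_S = 1.0
speed_px_per_s = SCREEN_WIDTH_PX / CROSSING_TIME_S

for fps in (30, 60, 120):
    frame_interval_ms = 1000 / fps
    displacement_px = speed_px_per_s / fps
    print(f"{fps:3d} fps: {frame_interval_ms:5.2f} ms per frame, "
          f"{displacement_px:5.1f} px of motion per frame")
# At 120 fps the object moves only 32 px per frame versus 128 px at 30 fps,
# so each frame captures a much smaller slice of the motion.
```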

(3) Wider Color Gamut

The broader range of the new color gamut aligns it more closely with the range of human vision. Image areas that could only be reproduced as a single color in the previous gamut can now show subtle tonal differences. In conjunction with high dynamic range, this enables subtle shading to be reproduced well, even in shadows or highlights. Coverage of colors found in nature reached only about 75% with the conventional color gamut of ITU-R BT.709, but with ITU-R BT.2020, as used in next-generation broadcasting, this coverage has expanded to nearly 99.9%.
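
One way to make the difference concrete is to compare the areas of the two gamut triangles on the CIE 1931 xy chromaticity diagram. The Python sketch below (my own rough comparison using the published BT.709 and BT.2020 primaries; triangle area in xy is only a crude proxy for perceptual coverage) applies the shoelace formula to both.

```python
# CIE 1931 xy chromaticity coordinates of the R, G, B primaries.
BT709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(points):
    """Area of a triangle in the xy plane via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

area_709 = triangle_area(BT709)
area_2020 = triangle_area(BT2020)
print(f"BT.709 gamut area : {area_709:.4f}")
print(f"BT.2020 gamut area: {area_2020:.4f}")
print(f"BT.2020 covers about {area_2020 / area_709:.1f}x the xy area of BT.709")
# Roughly 1.9x; note this is chart area, not the ~75% vs ~99.9% coverage
# of natural colors quoted above.
```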

(4) High Dynamic Range

Dynamic range refers to the extent of tonal gradation between the darkest and brightest image areas captured by image sensors, or the difference in luminance between the blacks and whites produced by displays. Moving to high dynamic range (HDR) preserves gradation in shadows and highlights, which helps avoid scenes that look under- or overexposed. Even scenes with significant contrast look more lifelike.
Because HDR was not included in the ITU-R BT.2020 specification, ITU-R went on to develop international technical standards for it. Companies had been enhancing imaging and viewing by adding HDR on their own, but in July 2016 ITU-R published Recommendation ITU-R BT.2100. In any case, because accurate color reproduction depends on both color gamut and dynamic range, these concepts must be managed collectively as color volumes, which add the dimension of luminance to the range of a color space.
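
BT.2100 specifies two HDR transfer functions, PQ (SMPTE ST 2084) and HLG. As an illustration of how an HDR signal maps to a peak luminance of up to 10,000 nits, the Python sketch below implements the PQ EOTF from its published constants (my own sketch, shown only for illustration).

```python
# PQ (SMPTE ST 2084) EOTF constants as used in ITU-R BT.2100.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a non-linear PQ signal value in [0, 1] to luminance in cd/m2 (nits)."""
    e = signal ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return 10000.0 * y  # PQ is defined up to a peak of 10,000 nits

for code_value in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"PQ signal {code_value:.2f} -> {pq_eotf(code_value):8.2f} nits")
```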

As specifications improve video quality through higher standards in these elements, they also require more bits in quantization, that is, greater bit depth. Bit depth determines the number of shades, or levels of tonal gradation, available when calculating colors. Conventional HD broadcasts use an 8-bit palette of 256 shades per channel, with 8-bit or 10-bit internal signal processing. For many years, display color consisting of 256 levels each of R, G, and B (yielding about 16,770,000 colors in all) has been described as "24-bit full color." Next-generation broadcasting updates this to a 10-bit display palette of 1,024 shades per channel, with 10-bit or 12-bit internal signal processing. With 1,024 levels each of R, G, and B, the total number of displayable colors reaches about 1,073,740,000, representing 30-bit color. Processing with more bits enables reproduction of faint colors and finer gradation, which helps avoid banding.
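
These shade and color counts follow directly from the bit depth. The Python sketch below (a simple illustration, not from the source) computes them for 8-bit, 10-bit, and 12-bit video.

```python
# Number of shades per channel and total RGB colors for a given bit depth.
for bits in (8, 10, 12):
    shades = 2 ** bits
    total_colors = shades ** 3
    print(f"{bits:2d}-bit: {shades:5,d} shades per channel, "
          f"{total_colors:,} RGB colors ({bits * 3}-bit color)")
# 8-bit  ->   256 shades, ~16.78 million colors (24-bit color)
# 10-bit -> 1,024 shades, ~1.07 billion colors (30-bit color)
```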
