It seemed that nearly everyone at NAB was settled on using HEVC (High-Efficiency Video Coding), aka H.265, as the codec for UHD. It’s often been said that HEVC offers 50 percent greater efficiency than the previous-generation H.264, aka AVC (Advanced Video Coding), resulting in files that are half the size of comparable AVC files and require half the transmission bandwidth. More importantly for UHD, files with twice as much data can be compressed and transmitted with the same specs as current HD files.
To be fair, UHD involves far more than twice as much data as HD: there are four times as many pixels, and higher bit depths and frame rates increase the payload even more. And at several seminars during the show, it was noted that current implementations of HEVC yield roughly a 30 percent increase in efficiency, with 50 percent being a theoretical maximum. Obviously, UHD is going to require a lot more bandwidth than HD no matter how you slice it.
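Back-of-the-envelope arithmetic makes the gap concrete. As a sketch (assuming 1080p60 at 8 bits versus 2160p60 at 10 bits, both 4:2:0 chroma subsampling; these are illustrative figures of mine, not any particular broadcast spec):

```python
def raw_rate_bps(width, height, fps, bit_depth, chroma_factor=1.5):
    # 4:2:0 subsampling carries 1.5 samples per pixel on average
    # (one luma sample plus half a sample of chroma).
    return width * height * fps * bit_depth * chroma_factor

hd = raw_rate_bps(1920, 1080, 60, 8)    # 1080p60, 8-bit
uhd = raw_rate_bps(3840, 2160, 60, 10)  # 2160p60, 10-bit
print(uhd / hd)  # 5.0 -- four times the pixels times 10/8 the bit depth
```

So even before frame-rate differences, the uncompressed payload is roughly five times larger, which is why a mere 30 percent encoder improvement doesn't close the gap on its own.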
A number of demos showed UHD content at 60 fps encoded with HEVC at a bitrate of 15 Mbps, which is pretty remarkable, considering that Blu-ray goes up to 40 Mbps or so. Still, most homes in the US have much lower downstream bandwidth—according to Netflix, the downstream bandwidth provided to its customers by their ISPs averages no more than 3 Mbps. Other studies put the average downstream bandwidth to American homes in the 6-7 Mbps range, but either way, only a minority of people in this country can receive a sustained bitstream at anywhere near 15 Mbps.
I saw two companies at NAB demonstrating bitrate-reduction technology that promises to bring relatively high-quality streamed UHD to many more homes. One is an Israeli company called Beamr, which has developed an objective measurement to detect visible artifacts from block-based encoders such as all forms of MPEG, including HEVC. The idea is to encode a video file at a high bitrate and compare the result with the original; if no visible artifacts are detected, the original is encoded again at a lower bitrate and compared again, and so on until visible artifacts do appear. The final encode is then performed at the lowest bitrate at which no artifacts were detected.
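The iterative search just described can be sketched in a few lines. This is a toy illustration, not Beamr's actual software: the encoder and the artifact detector here are stand-ins (the toy detector simply flags any encode below 6 Mbps as visibly degraded).

```python
def encode(source, bitrate_mbps):
    # Stand-in for a real HEVC/AVC encode at the given bitrate.
    return {"source": source, "bitrate": bitrate_mbps}

def artifacts_visible(candidate, source):
    # Stand-in for Beamr-style perceptual comparison against the original;
    # this toy version just uses a fixed 6 Mbps threshold.
    return candidate["bitrate"] < 6

def find_lowest_clean_bitrate(source, start_mbps, step_mbps=1.0):
    bitrate = start_mbps
    lowest_clean = None
    while bitrate > 0:
        candidate = encode(source, bitrate)
        if artifacts_visible(candidate, source):
            break  # artifacts appeared, so stop lowering the bitrate
        lowest_clean = bitrate
        bitrate -= step_mbps
    return lowest_clean  # lowest bitrate with no visible artifacts

print(find_lowest_clean_bitrate("clip.yuv", 15))  # 6.0 with this toy detector
```

In practice each iteration is a full encode-and-compare pass, which is why the value of the approach rests on the artifact measurement being both fast and reliable.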
The demo consisted of several split-screen clips on a 65″ Seiki UHDTV: one side was encoded at a “normal” bitrate, and the other at the bitrate determined by Beamr’s iterative process. (The clips were played from a computer as raw YUV 4K files; the TV was not doing any upscaling.) Each clip started with no indication of which side was which (the two versions weren’t always on the same side), and only at the end of the clip was the bitrate on each side revealed. All but one of the clips were in HD using AVC; the remaining clip was an animation in UHD encoded with HEVC.
On the clip from the animated short film Sintel (the lone UHD/HEVC example), Beamr achieved a 40 percent reduction in bitrate compared with a “conventional” HEVC encode. The two live-action HD encodes showed a 25 percent reduction, while the animated Big Buck Bunny also came in at 40 percent. Before the bitrates appeared on the screen, it was very difficult to tell the two sides apart; I saw many people misidentify which side was which.
Beamr says its software is installed at three major movie studios, and at NAB, Mark Romberg of Amazon Web Services (AWS) announced that the Crackle online-content provider is using Beamr’s technology on its AWS cloud service.
The other company demonstrating bitrate reduction was Faroudja Enterprises. Anyone who’s been following home theater for a while knows that name—Yves Faroudja is a legend in our world, having created one of the best video processors ever made. Remember “line doubling” and “line quadrupling”? That was Faroudja back in the standard-def era, and it was the gold standard for video quality at that time.
These days, Faroudja is working on reducing the bitrate of encoded video with a system he calls VBR100, but he’s taking a different approach than Beamr. The input file passes through a pre-processor that cleans up the image as needed on a per-pixel basis, in the context of the surrounding pixels, after which 50 to 75 percent of the data are removed. That information isn’t discarded; it’s fairly heavily compressed and sent along what’s called a support channel. Amazingly, it’s a real-time process, with most of the latency coming from the conventional compression and decompression.
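To illustrate only the data flow described above (Faroudja hasn't published VBR100's internals, so everything here is a hypothetical placeholder), imagine a frame as a list of samples, half of which are set aside for the support channel:

```python
# Toy illustration of splitting a frame between a conventionally encoded
# main stream and a "support channel"; not Faroudja's actual design.

def vbr100_split(frame, keep_fraction=0.5):
    # Keep a fraction of the samples for conventional encoding and
    # route the rest to the heavily compressed support channel.
    keep_every = round(1 / keep_fraction)
    main = frame[::keep_every]
    support = [s for i, s in enumerate(frame) if i % keep_every != 0]
    return main, support

frame = list(range(12))          # a "frame" of 12 sample values
main, support = vbr100_split(frame, keep_fraction=0.5)
print(len(main), len(support))   # 6 6
```

The interesting engineering question, of course, is how the removed data is chosen and compressed so that the reconstruction looks clean; that's the part the demo didn't reveal.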
If UHD is going to be successful, it must provide a noticeably superior experience compared with HD, which will be extremely difficult if the primary delivery medium is online streaming. With companies like Beamr and Faroudja working on reducing the bitrate without losing image quality, many consumers might just be able to appreciate the difference after all.