Confused about the difference between 4K and Ultra High Definition (UHD) TV? You’re not the only one. Despite attempts to standardize the terminology, 4K and UHD are often used interchangeably and incorrectly. Let’s look at where that confusion comes from and what different image types mean for the AV industry.
First, it’s important to understand the difference between the standards for digital television (set by the Consumer Electronics Association) and for digital cinema (set by Digital Cinema Initiatives, or DCI). While the two have tracked each other through every “level” of quality improvement, they are not the same, because TVs and theater screens use different aspect ratios. For example, consumer HDTV is 1920×1080, while DCI 2K is slightly wider, at 2048×1080.
This difference continues into 4K. Under the DCI 4K standard, the native resolution of a digital cinema projector is 4096×2160. In practice, the image is cropped from that container to match the film’s aspect ratio: 3996×2160 for “flat” (1.85:1) presentations or 4096×1716 for “scope” (2.39:1) presentations.
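To see how those crop figures follow from the container, it helps to compare the aspect ratios involved. The short Python sketch below is purely illustrative; the resolution values are simply the published figures quoted above, not anything defined in code by the CEA or DCI.

```python
# Illustrative sketch: compare the aspect ratios of the consumer HDTV frame
# with the DCI containers and their common theatrical crops.
resolutions = {
    "Consumer HDTV (1080p)": (1920, 1080),
    "DCI 2K container":      (2048, 1080),
    "DCI 4K container":      (4096, 2160),
    "DCI 4K 'flat' crop":    (3996, 2160),  # full height, trimmed width
    "DCI 4K 'scope' crop":   (4096, 1716),  # full width, trimmed height
}

for name, (width, height) in resolutions.items():
    print(f"{name:24s} {width}x{height}  aspect ratio {width / height:.2f}:1")
```

Running it shows consumer HDTV at roughly 1.78:1 (16:9), the DCI containers at about 1.90:1, and the flat and scope crops at 1.85:1 and 2.39:1, which is why the same “level” of resolution carries different pixel dimensions on a TV and on a cinema screen.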
What’s often referred to as “consumer 4K” or “Quad Full HD,” and is officially known as Ultra HD, comes in at a slightly lower resolution: 3840×2160. Like cinematic 4K, it packs four times the pixels of its 2K sibling (1920×1080 Full HD), which is why early proponents called it 4K.
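If you want to sanity-check that quadrupling, a quick back-of-the-envelope calculation does it. Again, this is just an illustrative sketch using the figures above.

```python
# Illustrative check: both "4K" formats hold four times the pixels of their
# 2K counterparts, because width and height each double.
pairs = [
    ("Full HD -> UHD",   (1920, 1080), (3840, 2160)),
    ("DCI 2K -> DCI 4K", (2048, 1080), (4096, 2160)),
]

for label, (w2, h2), (w4, h4) in pairs:
    factor = (w4 * h4) / (w2 * h2)
    print(f"{label}: {w2 * h2:,} -> {w4 * h4:,} pixels ({factor:.0f}x)")
```

Full HD’s roughly 2.07 million pixels become UHD’s 8.29 million, and DCI 2K’s 2.21 million become DCI 4K’s 8.85 million; in both cases the factor is exactly four.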
The 4K name has proven sticky, despite CEA’s 2013 efforts to move the industry to “Ultra HD” (technically an umbrella term covering both the 4K and 8K consumer TV standards). That’s why you’ll see TVs marketed as “4K Ultra HD,” or even just “4K,” even though such a TV doesn’t technically meet the DCI 4K standard.