
Why 4K UHD Television is nothing but a CES wet dream

Forget the costs of the 4K UHD displays themselves. The infrastructure challenges to deliver the content would be massive.
Written by Jason Perlow, Senior Contributing Writer

God Bless the Consumer Electronics Industry.

Without them, we'd have no innovation in gadgetry and tech toys. Without the companies showing their wares at CES, guys like me, James Kendrick, and Matthew Miller would have nothing to ogle at.

The technology making the biggest splash at CES, and making guys like us salivate like rabid dogs right now, is the 4K UHD TV set.

What's a 4K UHD TV set, you ask? And why do you need one? Very good question, I'm so glad you asked.

4K is an extremely high video resolution currently used in theatrical movie releases. When you go to see a new film in digital projection, you might be lucky enough to see it in 4K.

2011's "The Girl With the Dragon Tattoo" was one of the first films to be produced and distributed in this format. Approximately 17,000 theater screens on the entire planet have the proper equipment to display 4K films at their native resolution.

Digital films are still, for the most part, produced in 2K, which is roughly one quarter the pixel count of 4K, as the graphic below illustrates.

A lot of the problem has to do with the fact that many popular feature films are released in 3D, and the frame rates and bit rates required to show 3D films in 4K would outstrip the projection and digital storage/playback capabilities of just about every 4K-equipped theater.

So if this stuff is hard enough to do in a movie theater, why is the consumer electronics industry rushing to put it in your living room?

At CES, Sony introduced a new line of 55-inch and 65-inch Bravia XBR 900 LED 4K UHD TV sets, with prices TBA. Last year's 84-inch model is currently priced at $25,000. Yep, these things are pretty pricey.

But boy, technologically, are they shweet. Just to get a better understanding of how cool these new UHD sets are, let's talk a bit about the existing HD standard and how much higher resolution 4K really is.

Broadcast HD digital television is transmitted at 1080i (interlaced) or 720p (progressive) resolution. That is, over-the-air, non-subscriber DTV is capable of 1,080 or 720 lines of vertical resolution.

DirecTV, FiOS, and other subscriber services, along with Internet services like iTunes and Netflix, can transmit or push (via Internet-connected on-demand) selected premium pay-per-view content at a much higher-quality 1080p (progressive).

1080p is 1920x1080 pixels per frame. Storing a 1080p feature film at high quality requires anywhere between 25 and 50 gigabytes per movie on a Blu-ray disc, depending on the bit rate and the encoding method used.
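
If you want to sanity-check that range yourself, here's a quick back-of-the-envelope sketch in Python; the average bitrates are my own illustrative assumptions, not official Blu-ray figures:

```python
# Rough sketch: disc space for a 1080p film at a given average video
# bitrate. These bitrates are illustrative, not official figures.
def movie_size_gb(bitrate_mbps, runtime_minutes):
    return bitrate_mbps * runtime_minutes * 60 / 8 / 1000  # Mb -> MB -> GB

for mbps in (30, 45):  # plausible Blu-ray average video bitrates
    print(f"{mbps} Mbps for 120 min -> {movie_size_gb(mbps, 120):.0f} GB")
# 27 GB at 30 Mbps, 40 GB at 45 Mbps -- squarely in the 25-50 GB range
```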

[Graphic: 2K vs. 4K frame-size comparison]

And even then, these films are still compressed down from their original theatrical 2K format, albeit with a "less lossy" encoding method, so you still achieve very high video quality.

Are you starting to understand the scale of the problem yet? No? Well, keep following.

Naturally, services like DirecTV, Netflix, and iTunes cannot stream "less lossy," high-quality films like the ones you can store on a Blu-ray, because nobody has the kind of home broadband needed to sustain that data rate consistently and reliably.

Instead, those companies send highly compressed data streams and you end up seeing "artifacts" and pixelation in the films.

You need to sustain data rates of 5 to 10 megabits per second for high-quality "lossy" 1080p video, whereas a 720p film requires less than half of that.

Never mind that your cable connection is rated at 20 megabits; your actual throughput to Amazon's cloud (which Netflix streams from) or to Apple's iCloud will be less than that, thanks to protocol overhead, traffic routed between the links, and contention with other people accessing the service.
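
Here's a rough sketch of that headroom problem; the "efficiency" factors are purely illustrative assumptions on my part, not measured values:

```python
# Sketch: advertised link speed vs. what a stream actually needs.
# The "efficiency" factors for overhead, routing, and contention are
# my own illustrative assumptions, not measured values.
LINK_MBPS = 20.0     # advertised cable downlink
STREAM_MBPS = 10.0   # high end of the 5-10 Mbps range for 1080p

for efficiency in (0.8, 0.4):
    effective = LINK_MBPS * efficiency
    verdict = "sustains it" if effective >= STREAM_MBPS else "buffers"
    print(f"effective {effective:.0f} Mbps -> 10 Mbps 1080p stream: {verdict}")
# 16 Mbps clears the bar; 8 Mbps does not -- and 4K needs far more
```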

So what precisely would you need to store a 4K movie so you could play it back in native, uncompressed format on your UHD set, assuming the cost of these things drops tremendously in the next 5 or 10 years?

If we use the digital workprint of The Girl With the Dragon Tattoo as a benchmark, every single frame of the film is 45 megabytes in size, at an approximate resolution of 4352x2176.

The two-hour-and-38-minute film has approximately 230,000 frames at 24 frames per second. If we do the somewhat inexact back-of-the-envelope math, 45 megabytes per frame multiplied by 230,000 frames equals 10,350,000 megabytes, or about 10,107 gigabytes: just under 10TB.

That is what is needed to store a single 4K feature film shot at 24 frames per second, at full uncompressed workprint quality.

Oh you want to watch The Hobbit at the director's original 48 frames per second? Double it.
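
If you want to check (or scale) that arithmetic yourself, here's the same back-of-the-envelope calculation as a Python sketch, using only the workprint figures cited above:

```python
# Back-of-the-envelope math for an uncompressed 4K workprint, using
# the figures cited above: 45 MB per frame, 2 hours 38 minutes.
def workprint_size_tb(mb_per_frame, runtime_minutes, fps):
    frames = runtime_minutes * 60 * fps
    return mb_per_frame * frames / 1024 / 1024  # MB -> GB -> TB

RUNTIME = 158  # 2 hours 38 minutes

print(f"24 fps: {workprint_size_tb(45, RUNTIME, 24):.1f} TB")  # ~9.8 TB
print(f"48 fps: {workprint_size_tb(45, RUNTIME, 48):.1f} TB")  # ~19.5 TB for The Hobbit
```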

The only way to make distributing films in this format practical would be the industry's existing approach: lossy compression, trading computational complexity for reduced data rates, as with Sony's new XAVC recording technology, or the High Efficiency Video Coding (HEVC) compression standard that is currently in draft by the MPEG group.

Even with these advanced compression techniques, we're still talking about severe storage requirements (128GB to 256GB per movie), and probably something on the order of 100 megabits per second of sustained data rate to stream it over the Internet and play it at 30 frames per second, so it doesn't look like complete crap on that expensive 4K set.
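
Where does that 100-megabit figure come from? It falls straight out of the storage estimate; here's a quick sketch of the implied average bitrate, using the same 158-minute film as the benchmark:

```python
# Sketch: the average bitrate implied by a compressed 4K film of a
# given size. Sizes are the rough estimates above; runtime is the
# same 158-minute film used as a benchmark earlier.
def implied_bitrate_mbps(size_gb, runtime_minutes):
    return size_gb * 1000 * 8 / (runtime_minutes * 60)  # GB -> Mb, then per second

for size_gb in (128, 256):
    print(f"{size_gb} GB over 158 min -> {implied_bitrate_mbps(size_gb, 158):.0f} Mbps average")
# ~108 and ~216 Mbps -- hence the ~100 megabit sustained-rate estimate
```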

Oh, did I mention that 4K is eventually going to be replaced by 8K, which has four times the resolution of 4K? Yup, the maxi-zoom dweebies in the labs are already figuring out how to get that into our theaters and living rooms next. That's assuming CES even lives to see 2025, anyway.

Most people have enough trouble getting anything faster than 25 megabits into the home, let alone 100 megabits or even a gigabit.

In my ordinance-crazy, permit-loony, bureaucratic town in South Florida, nobody wants to break up the sidewalk to bring fiber optics to the home (FTTH), and even that would at best give me 100-150 megabits of sustained throughput for that 4K video stream, assuming the content was cached at the edge of my ISP's network by a CDN like Limelight or Akamai.

So instead I have to live with VDSL, an 18-megabit downlink technology. That's way, way short of what 4K on-demand would need.

Unless you live in South Korea, where gigabit to the home is fairly commonplace, we're talking Star Trek-level stuff, people.

So you're probably wondering: if the native data storage and the broadband required to pull it off aren't ready for the home, what exactly are you supposed to do with that $25,000 TV set?

Well, for the time being, you play existing Blu-ray movies on a player that can upscale; in other words, a pixel quadrupler. To quote my contact at Sony:

"At the moment, there’s no physical 4K format for the movies. That’s why we’ve launched the 4K server, and announced the new distribution service + media player to bring 4K to consumers. It’s up to the industry to standardize the format now."

What is this media player, you ask? Well, Sony hasn't released a ton of details about it yet, but it's essentially a computer, pre-loaded with 10 feature films, that is given to you on loan.

Presumably, if you want new movies loaded onto it, somebody has to come visit your house and service it, or else a single film download might take days. Sony hasn't really detailed how the distribution service for this on-premises equipment will work.
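
For a sense of why, here's a rough sketch of what a single compressed 4K film (using my earlier 128GB-256GB estimate) would take to pull down over an 18-megabit VDSL link like mine, generously assuming the full link rate:

```python
# Sketch: download time for a single compressed 4K film over an
# 18 Mbps VDSL downlink, generously assuming the full link rate.
def download_hours(size_gb, link_mbps):
    return size_gb * 1000 * 8 / link_mbps / 3600  # GB -> megabits -> hours

for size_gb in (128, 256):
    print(f"{size_gb} GB at 18 Mbps -> {download_hours(size_gb, 18):.0f} hours")
# ~16 and ~32 hours per film, before any overhead -- refreshing a
# ten-film library over a link like mine really would take days
```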

Will the US broadband infrastructure ever be ready for 4K on-demand video? Talk Back and Let Me Know.
