4K UHD TV needs big pipes, not a pipe dream

Summary: The challenge in delivering ultra-high-definition content over the Internet is not the cost of the displays; it's the limitations of our existing broadband infrastructure.

Unless you've been living under a rock--or have been blissfully unaware of the goings-on at this last CES--you know that the products front and center at every major consumer electronics manufacturer (save Apple) were the latest and greatest 4K, ultra-high-definition (UHD) TV sets. And right now, they are crazy expensive, starting at around $25,000 each.


I'm not here to argue that people won't own 4K UHD sets, or that we won't see 4K being used in computer displays or even tablets in the near future. In fact, I'm absolutely certain that the prices of these things are going to come down and that they will become commodity, low-margin products--just like the current generation of 1080p HD displays is today.

In three years or less, I wouldn't even be surprised to see 4K screens on a full-sized iPad, a 10-inch Android device, or even ultrabooks and Windows RT tablets.

My issue is not the price of the screens themselves; it's how content is actually going to be delivered to them. And right now, if you examine the state of consumer broadband in the United States, most households can barely stream 720p movies reliably, let alone 1080p, which is Blu-ray quality.

To move 4K movies across the Internet, we're going to need to move bureaucratic mountains at the state and municipal government level to get gigabit connectivity to the last mile in every major metropolitan area, unless we are prepared to distribute content on 128GB high-speed flash drives at Walmart or figure out how to free up broadcast spectrum that doesn't exist.
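To put rough numbers on that, here is a minimal back-of-envelope sketch; the bitrates below are assumptions of my own for illustration, not figures from any standard or streaming service:

```python
# Rough back-of-envelope numbers -- the bitrates below are illustrative
# assumptions, not official figures for any codec or service.

def movie_size_gb(bitrate_mbps, runtime_minutes=120):
    """Size of a movie in gigabytes at a given average video bitrate."""
    bits = bitrate_mbps * 1_000_000 * runtime_minutes * 60
    return bits / 8 / 1_000_000_000  # bits -> bytes -> gigabytes

def transfer_hours(size_gb, link_mbps):
    """Hours to move that file over a link of a given speed, ignoring overhead."""
    return size_gb * 8 * 1_000 / link_mbps / 3600

for label, mbps in [("heavily compressed 4K stream (assumed)", 25),
                    ("Blu-ray-quality 4K (assumed)", 100)]:
    size = movie_size_gb(mbps)
    print(f"{label}: ~{size:.0f} GB per two-hour movie; "
          f"~{transfer_hours(size, 10):.1f} h on a 10 Mbps line, "
          f"~{transfer_hours(size, 1000) * 60:.0f} min on gigabit")
```

Even the heavily compressed case works out to tens of gigabytes per film and hours of download time on a typical connection; the higher-quality case nearly fills that 128GB flash drive. That is why last-mile gigabit keeps coming up.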

But the 4K technology is so much better than what we have now, right? Well, let's think about that for a bit.

Fundamentally, I do not believe there are huge problems with today's TVs themselves, at least at the entry-level and middle-market end of the scale, where the majority of units are being sold.

Since the digital transition of 2008-09, consumers have enjoyed a much higher level of content quality overall than what existed before, and I do not think most of us would choose to go back to analog SD.

Now that being said, we are severely under-utilizing the capabilities of the current installed base of HDTVs.

All of this comes down to the fact that the majority of that content has to be distributed over an existing broadband Internet infrastructure that lags far behind the capabilities of our content-playback devices.

Very few people have been using the 1080p capabilities of their TV sets and set-top boxes, because content suppliers restrict much of it to pay-per-view on demand. And to really take advantage of quality 1080p content, you have to use Blu-ray discs, because nothing is broadcast over the air at that resolution.

Also read: Why 4K UHD Television is nothing but a CES wet dream

Companies like DirecTV are just starting to think about dedicating channels for 1080p broadcasts. It's not even on the radar for the broadcast networks in the United States.

Many households still do not have Blu-ray players to take advantage of their TV sets' 1080p capabilities, and discs are not as convenient a medium as Internet streaming from a device like an Apple TV, a Roku, or the embedded Netflix client in a "smart" TV.

And while we are on the issue of smart TV, I'd like to point out that nobody cares about smart TV. Yes, people will expect that these features will be embedded in their TV sets, but nobody wants to pay extra for them and there is no standardized interactive TV content that anyone cares about. Everyone uses different content providers to get their subscriber material and they all have different UIs.

People just want to watch their shows, period, not interact with them. If any of that activity is going to occur it will be on mobile devices like tablets and smartphones that will simply replace the remote controls for DVRs and other set-top boxes that are in use now.

Let's get back to the issue of content delivery and image quality on existing sets using broadband-based content distribution.

Internet-distributed 1080p and even 720p content has to be heavily compressed in order to be pushed out by on-demand services such as Netflix, Amazon Video, and Apple TV, so visible artifacts during playback are significant, and the picture quality is simply not comparable to Blu-ray discs.
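To see how wide that gap is, here is a small sketch comparing ballpark 1080p and 720p bitrates; the figures are rough assumptions (real numbers vary by title, codec, and service), but the ratio is what matters:

```python
# Ballpark video bitrates -- rough assumptions for illustration; actual values
# vary by title, codec settings, and network conditions.
SOURCES_MBPS = {
    "Blu-ray 1080p (upper bound)": 40.0,
    "Typical streamed 1080p": 6.0,
    "Typical streamed 720p": 3.5,
}

bluray = SOURCES_MBPS["Blu-ray 1080p (upper bound)"]
for name, mbps in SOURCES_MBPS.items():
    print(f"{name}: {mbps:>5.1f} Mbps ({mbps / bluray:.0%} of the Blu-ray ceiling)")
```

A stream that carries roughly a sixth of the bits per second of a disc has to throw detail away somewhere, and that is where the blocking and banding come from.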

Most of these problems can be attributed to the broadband connection to the last mile and network congestion when attempting to access streamed video from the content-delivery networks (CDNs) that services like Netflix and Amazon use.

We will need gigabit or higher broadband to the home to make 4K content transport over the Internet viable. And since the electromagnetic spectrum obviously cannot be expanded, we are going to need massive improvements in digital multiplexing on existing DTV channels, freed-up spectrum, and considerably better compression technology to even think about moving 4K over the air.

But it is not just the home broadband that needs to be beefed up in order to accommodate the much larger data streams. How will higher resolution impact content creators and infrastructure providers?

The content creators are going to need extremely powerful workstations and server farms to process the data. Take a look at what Weta Digital, the visual-effects studio behind The Hobbit, uses. That should give you some idea.

Every uncompressed frame is going to run around 45 megabytes, and you are going to need serious compute power to do the compression and create the digital work prints--never mind having 100-gigabit networks in your datacenter and 10-gigabit links to the workstations to move the data around.
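For a sense of where a per-frame number like that comes from, here is a quick sketch of the arithmetic; the resolution, bit depth, and frame rate are illustrative assumptions, since real production formats vary:

```python
# Uncompressed frame size and sustained data rate for 4K material.
# Resolution, bit depth, and frame rate here are assumptions for illustration.
width, height = 4096, 2160      # DCI 4K
channels = 3                    # RGB
bits_per_channel = 16
fps = 24

bytes_per_frame = width * height * channels * bits_per_channel // 8
mb_per_frame = bytes_per_frame / 1_000_000
gb_per_second = bytes_per_frame * fps / 1_000_000_000

print(f"~{mb_per_frame:.0f} MB per uncompressed frame")
print(f"~{gb_per_second:.2f} GB/s sustained, or "
      f"~{gb_per_second * 3600 / 1000:.1f} TB per hour of footage")
```

Depending on which resolution and bit depth you assume, that lands in the same neighborhood as the 45-megabyte figure above; and at better than a gigabyte per second of footage, it's clear why 10-gigabit links to the workstation and 100-gigabit datacenter backbones come into the picture.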

That's the kind of infrastructure TV studios are going to have to buy if network television and premium cable content providers want to get into this game. Storage and networking companies like Cisco, EMC, NetApp, IBM, and HP are also going to get rich beyond their wildest dreams if this technology reaches wide adoption.

I don't see this happening quickly, though, as the studios have all just spent big money on 1080p production facilities and would have to at least quadruple their storage capacities, if not more.

Topics: Networking, Broadband, Fiber

