
4K UHD TV needs big pipes, not a pipe dream

The challenge in delivering extremely high-definition content over the Internet isn't display cost; it's the limitations of our existing broadband infrastructure.
Written by Jason Perlow, Senior Contributing Writer

Unless you've been living under a rock--or were blissfully unaware of the goings-on at this last CES--you know that the products front and center at every major consumer electronics manufacturer (save Apple) have been the latest and greatest 4K, ultra-high-definition (UHD) TV sets. And right now, they are crazy expensive, starting at around $25,000 each.


I'm not here to argue that people won't own 4K UHD sets, or that we won't see 4K in computer displays or even tablets in the near future. In fact, I'm absolutely certain the prices of these things are going to come down and that they will become commodity, low-margin products--just like the current generation of 1080p HD displays are today.

In three years or less, I wouldn't even be surprised to see 4K screens on a full-sized iPad, a 10-inch Android device, or even ultrabooks and Windows RT tablets.

My issue is not the price of the screens themselves; it's how content will actually be delivered to those displays. And right now, if you examine the state of consumer broadband in the United States, most households are barely able to stream 720p movies reliably, let alone 1080p, which is Blu-ray quality.

To move 4K movies across the Internet, we're going to need to move bureaucratic mountains at the state and municipal government level to get gigabit connectivity to the last mile in every major metropolitan area, unless we are prepared to distribute content on 128GB high-speed flash drives at Walmart or figure out how to free up broadcast spectrum that doesn't exist.

But the 4K technology is so much better than what we have now, right? Well, let's think about that for a bit.

Fundamentally, I do not believe there are huge problems with today's TVs themselves, if you look at the entry-level and middle-market end of the scale, where the majority of units are being sold.

Since the digital transition of 2008-09, consumers have enjoyed a very high level of content quality compared to what existed before, and I do not think most of us would choose to go back to analog SD.

Now that being said, we are severely under-utilizing the capabilities of the current installed base of HDTVs.

All of this has to do with the fact that to distribute the majority of that content, we have to work within the limitations of an existing broadband Internet infrastructure that lags far behind the capabilities of our content-playback devices.

Very few people have been using the 1080p capabilities of their TV sets and set-top boxes, because content suppliers restrict much of that material to pay-per-view on demand. To really take advantage of quality 1080p content you have to use Blu-ray discs, because nothing is broadcast over the air at that resolution.

Also read: Why 4K UHD Television is nothing but a CES wet dream

Companies like DirecTV are just starting to think about dedicating channels for 1080p broadcasts. It's not even on the radar for the broadcast networks in the United States.

Many households still do not have Blu-ray players to take advantage of their TVs' 1080p capabilities, nor is Blu-ray as convenient a medium as Internet streaming from a device like an Apple TV or a Roku, or from the Netflix clients embedded in "smart" TVs.

And while we are on the issue of smart TV, I'd like to point out that nobody cares about smart TV. Yes, people will expect that these features will be embedded in their TV sets, but nobody wants to pay extra for them and there is no standardized interactive TV content that anyone cares about. Everyone uses different content providers to get their subscriber material and they all have different UIs.

People just want to watch their shows, period, not interact with them. If any of that activity is going to occur it will be on mobile devices like tablets and smartphones that will simply replace the remote controls for DVRs and other set-top boxes that are in use now.

Let's get back to the issue of content delivery and image quality on existing sets using broadband-based content distribution.

Internet-distributed 1080p and even 720p content has to be heavily compressed in order to be pushed by on-demand services such as Netflix, Amazon Video, and Apple TV, so visible compression artifacts during playback are significant, and the result is qualitatively not comparable to Blu-ray discs.
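
To put "heavily compressed" in perspective, here's a back-of-the-envelope sketch in Python comparing the per-pixel bit budget of Blu-ray against a typical streamed 1080p feed. The bitrates are my own ballpark assumptions (Blu-ray video peaks near 40 megabits per second; streamed 1080p often arrives at around 5), not measurements of any particular service:

```python
# Rough comparison of per-pixel bit budgets: Blu-ray vs. a typical
# 1080p streaming service. Bitrates below are ballpark assumptions,
# not measurements of any specific service.

PIXELS_1080P = 1920 * 1080
FPS = 24  # typical film frame rate

def bits_per_pixel(bitrate_mbps: float) -> float:
    """Average compressed bits available per pixel per frame."""
    return (bitrate_mbps * 1_000_000) / (PIXELS_1080P * FPS)

for label, mbps in [("Blu-ray (~40 Mbps)", 40), ("Streaming (~5 Mbps)", 5)]:
    print(f"{label}: {bits_per_pixel(mbps):.3f} bits/pixel")

# Blu-ray (~40 Mbps): 0.804 bits/pixel
# Streaming (~5 Mbps): 0.100 bits/pixel
```

Under those assumptions, the streamed version has roughly an eighth of the data per pixel to work with, which is where the blocking and banding come from.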

Most of these problems can be attributed to the broadband connection to the last mile and network congestion when attempting to access streamed video from the content-delivery networks (CDNs) that services like Netflix and Amazon use.

We will need gigabit or higher broadband to the home to make 4K content transport viable on the Internet. And since the electromagnetic spectrum obviously cannot be expanded, we are going to need massive improvements in digital multiplexing on existing DTV channels, freed-up spectrum, and considerably more advanced compression technology to even think about moving 4K over the air.
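
To see why over-the-air 4K is such a stretch, consider the fixed size of a DTV channel. A quick sketch; the ATSC payload figure is standard, but the 4K-over-HEVC bitrates are rough estimates of my own, not anything a broadcaster has committed to:

```python
# Back-of-the-envelope: why broadcast 4K doesn't fit today's DTV channels.
ATSC_CHANNEL_MBPS = 19.39  # fixed payload of one 6 MHz ATSC channel

for uhd_mbps in (20, 25, 30):  # assumed 4K-over-HEVC stream rates
    fits = uhd_mbps <= ATSC_CHANNEL_MBPS
    print(f"4K at {uhd_mbps} Mbps in one {ATSC_CHANNEL_MBPS} Mbps channel: {fits}")

# All three print False: even optimistic 4K streams overflow the channel.
```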

But it is not just the home broadband that needs to be beefed up in order to accommodate the much larger data streams. How will higher resolution impact content creators and infrastructure providers?

The content creators are going to need extremely powerful workstations and server farms to process the data. Take a look at what Weta Digital, the visual-effects studio behind The Hobbit, uses. That should give you some idea.

Every uncompressed frame is going to be around 45MB, and you are going to need serious compute power to do the compression and create the digital work prints--never mind having 100-gigabit networks in your datacenter and 10-gigabit links to the workstations to move the data around.
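
If you want to sanity-check that 45MB figure, here is a quick calculation. The frame geometry (4096x2160) and the bit depths are assumptions chosen to bracket it; note that at 24 frames per second the 12-bit case works out to nearly a gigabyte per second, which is why those 10-gigabit links to the workstation come up:

```python
# Sanity check on the "~45MB per uncompressed frame" figure. Frame
# geometry and bit depths are assumptions chosen to bracket it.

def frame_mb(width: int, height: int, bits_per_channel: int, channels: int = 3) -> float:
    """Uncompressed frame size in megabytes (1 MB = 10**6 bytes)."""
    bits = width * height * bits_per_channel * channels
    return bits / 8 / 1_000_000

for depth in (10, 12, 16):
    size = frame_mb(4096, 2160, depth)
    print(f"{depth}-bit RGB: {size:.1f} MB/frame, {size * 24:.0f} MB/s at 24fps")

# 10-bit RGB: 33.2 MB/frame, 796 MB/s at 24fps
# 12-bit RGB: 39.8 MB/frame, 956 MB/s at 24fps
# 16-bit RGB: 53.1 MB/frame, 1274 MB/s at 24fps
```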

That's the kind of infrastructure TV studios are going to have to buy if network television and cable TV premium content providers have to get into this game. The storage and network companies like Cisco, EMC, NetApp, IBM, and HP are also going to get rich beyond their wildest dreams if this technology enters wide adoption.

I don't see this happening quickly, as they have all just spent big money on 1080p production facilities and would have to at least quadruple their storage capacity, if not more.

You can argue that this technology is going to get cheaper, and the recording and production technology is going to become more portable, but it's still going to be a very large expense if you multiply it at scale.

Service providers, like the content creators, are also going to have to beef up their networks, and the Internet-facing switch infrastructure at the tier 1, 2, and 3 companies will have to grow by an order of magnitude or more to deal with this.

As it is today, the Internet is already overloaded with video streaming, and this would only compound the problem.

But is there a solution to the increased data volume required by 4K and even 8K? My opponent in our most recent Great Debate has argued that Sony's advanced XAVC recording systems and the HEVC codec will make it viable for content broadcasting and video streaming, and possibly even for a distributed 4K physical media format, as Blu-ray is for 1080p today.

The problem is that for the rest of us, even when the prices of these displays come down to commodity levels (and they will, sooner rather than later), our broadband infrastructure, our TV production studios, and the frequency spectrum allocated for digital TV broadcasting are nowhere near ready to accommodate 4K UHD. And that's to say nothing of the 8K UHD that will almost certainly replace it within a 10-year timeframe, bringing even more serious bandwidth and data-moving demands.

Indeed, HEVC and Sony's new recording formats will make the data streams and files smaller, but even at (a highly optimistic) 100 megabits per second, a 4K stream is a good 10 to 17 times larger than what most American homes can reliably pull from content-distribution networks today, which is about 6-10 megabits per second.
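
The arithmetic is straightforward. A rough sketch, using the same 100-megabit figure and the 6-10 megabit range from above:

```python
# Ratio between an optimistic 100 Mbps 4K stream and what homes can
# reliably pull today (6-10 Mbps, per the figures in the text above).

STREAM_4K_MBPS = 100.0  # highly optimistic HEVC 4K stream
for home_mbps in (6.0, 10.0):
    print(f"At {home_mbps:.0f} Mbps sustained: stream needs "
          f"{STREAM_4K_MBPS / home_mbps:.0f}x what the pipe can carry")

# And the raw volume is daunting: a two-hour movie at 100 Mbps is 90 GB.
movie_gb = STREAM_4K_MBPS * 2 * 3600 / 8 / 1000
print(f"Two-hour movie at 100 Mbps: {movie_gb:.0f} GB")
```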

Even if you have a 20-, 50-, or 100-megabit broadband connection at home today, network contention and flow control at the CDNs will prevent you from moving data from Netflix, Amazon, and Apple that fast. The head-end equipment at your ISP and its peering connection to the CDN are just not beefy enough to move that much video.

Ever tried to watch a 720p or 1080p movie on Netflix on a Friday or Saturday night despite having fast broadband in your home? Then you know what I am talking about. HEVC will almost certainly improve 1080p content distribution by making the data streams more compact for existing broadband customers, but it won't make a sizeable dent in the problem of having to move 4K or even 8K.

And yes, the XAVC recording methods Sony is introducing will make it easier to fit 4K content onto a Blu-ray disc or a flash drive to insert into a player device, but who is buying or renting physical media anymore?

If you thought the digital TV transition was like the government trying to move Mount Everest, broadcast 4K adoption will be like trying to move Olympus Mons.

The FCC has put out a request for all 50 US states to be ready to deploy gigabit broadband in at least one selected community by 2016. That's nice, but when was the last time the FCC achieved anything that ambitious in such a short timeframe? The FCC makes the United Nations look like an effective legislative body by comparison.

To the FCC, I say good luck with that.

The previous DTV transition stalled and took over 10 years to execute, and in some markets it's still not complete. We would have to completely re-map the existing DTV frequency and channel allocations, not to mention introduce new standards for transmitter and tuner equipment, to accommodate 4K over the air.

It's such a non-starter it's not even funny.

Our existing broadcast infrastructure will be in place for a long time. Enjoy your 720p and 1080i, folks, because they're staying.

Getting gigabit to residences for streaming and on-demand content is not like freeing up spectrum; it will require dealing with municipal governments and convincing communities to jackhammer streets and bring fiber and high-speed copper into the home--or, alternatively, to deploy gigabit wireless, which will have its own unique challenges.

So yes, we'll see affordable 4K TVs and monitors and tablets within five years. Being able to distribute content to them? That's a whole 'nuther ballgame.

Still, there is a benefit to 4K resolution that makes all of this infrastructure improvement well worth it, right?

Well, from an entertainment standpoint, at least as it relates to visualization-intensive apps like video games, the higher the resolution, the more complex the modeling you can do, and thus the more realistic the rendering of objects and textures.

From a vertical market standpoint this would be a huge boon to data visualization and scientific and medical imaging. Font rendering would also be super-duper sharp.

However, these applications are not as dependent on broadband infrastructure, because everything is rendered on the fly using vectors, mathematical algorithms, and bitmaps. GPUs will definitely need to be beefed up, though.

But it is safe to assume that 4K will be adopted for these uses long before we see it in any broadcast form or for Internet content distribution.

I also think it is entirely possible that 8K technology will arrive to replace 4K before 4K is widely adopted in broadcast, which may render any efforts to improve existing broadcast and content-creation infrastructure moot.

There is also the question of whether 4K usage at home will negatively or positively impact the movie industry. I think it depends on whether you are looking at it from the perspective of the movie-theater companies or from the content producers themselves.

I think the movie theater is already in serious danger because of the home-theater experience. And since many theaters are co-located with malls and affected by declining retail traffic, they need to find ever more ways to attract customers (4K, 3D, high frame rates) at a time when the home experience is more than "good enough" and ticket and concession prices are off the scale.

4K at home may compound the problem for the theater venues but I don't think it is as significant a variable as other technologies and factors that are hurting that industry.

Backwards compatibility with existing content on 4K sets is also a non-issue. We already know what 1080p and 720p material looks like when displayed at 4K. Sony uses pixel-quadrupling technology in some of its Blu-ray players to play current-generation 1080p movies on its 4K TV sets. It looks fine, and it is not susceptible to the analog/DTV translation issues we dealt with when playing SD content on HD sets.

So similar pixel-multiplying upscaler technology will simply be built into any set-top device that has 4K output capability and has to play back legacy 1080i and 720p HDTV content.
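
For the curious, the core idea behind pixel quadrupling is simple pixel replication: each 1080p pixel becomes a 2x2 block in the 2160p output. A minimal sketch of that geometry (real upscalers use far smarter filtering than this):

```python
# Minimal sketch of nearest-neighbor "pixel quadrupling": each source
# pixel becomes a 2x2 block in the output. One channel only, for brevity.

from typing import List

Frame = List[List[int]]  # rows of pixel values

def quadruple(frame: Frame) -> Frame:
    """Scale a frame 2x in each dimension by repeating every pixel."""
    out: Frame = []
    for row in frame:
        doubled = [px for px in row for _ in range(2)]  # repeat horizontally
        out.append(doubled)
        out.append(list(doubled))                        # repeat vertically
    return out

# A 2x2 toy "frame" becomes 4x4; a 1920x1080 frame becomes 3840x2160.
print(quadruple([[1, 2], [3, 4]]))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```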

What about 3D and high-frame-rate (HFR) content? I don't see those moving the needle either. Major feature films will continue to be produced in 3D and HFR, but I don't see network television or premium broadcast content going in that direction (with the exception of limited pay-per-view usage) for a very, very long time, so there will always be a content gap.

So where are we going to end up with TVs in the future if the ultimate limiter is the state of our broadband?

I certainly do not see the family living-room TV going away anytime soon, but we are going to see a lot more use of personal viewing devices. Tablets will be used to stream more and more video, and we may see them paired with home TV "servers" that act as centralized DVRs and tuners, serving cached content to these portable playback devices.

The back end of the equation will require beefier CDNs and faster edge-of-the-network connections in order to service it, whether we end up with 4K content or not.

In summary: A vast increase in couch or bed potatoism.

Are 4K and 8K mountains too high to climb for today's broadband infrastructure? Talk back and let me know.
