
Setting radio free

Adding new services to the radio spectrum is a complex but necessary task as wireless technology becomes increasingly important
Written by Rupert Goodwins, Contributor

The Open Future for Wireless Communications conference in Cambridge, organised by the Communications Innovation Institute in conjunction with the Cambridge-MIT Institute, saw technologists, regulators, researchers, network operators, equipment and chip manufacturers get together to discuss the future of radio regulation.

The event, held late last month, included a discussion of ways to add new services to the radio spectrum. One is freeing up existing bands, known as spectrum re-farming, by moving existing services onto more efficient technology and reallocating frequencies that are lightly used or no longer used at all. Another is the creation of underlay networks: wideband services that run alongside existing occupants without mutual interference. A third is overlay technology, which takes existing networks and combines them behind a common interface, improving reliability and quality of service to the point where new services can be introduced.

All of the above options require changes in the way radio is regulated. To date, spectrum re-farming has been the most popular option, as it can be done piecemeal and reduces the number of people affected at any one time. For example, mobile phones, public-service users such as the police, and broadcast radio and television have all been moving from analogue to more efficient digital services.

And there's no doubt that more can be done here. Professor William Webb, head of research and development at Ofcom, said that "Nobody thinks that they have unused spectrum, but we do". A spectrum survey at three sites, Heathrow, Central London and Cambridge, showed that half the spectrum between 100MHz and 1GHz was unoccupied, he reported: "a pessimistic figure, but not too far off". The unused portion was mostly military. "There's a deep suspicion that there's more than enough spectrum for everyone."
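In outline, such a survey reduces to simple arithmetic: scan the band in small steps, compare the measured power on each channel against a detection threshold, and report the fraction found in use. The sketch below shows that calculation only; the sample data, 1MHz channel width and -95dBm threshold are invented for illustration, not Ofcom's actual survey parameters.

```python
import random

def occupancy(power_dbm, threshold_dbm=-95.0):
    """Fraction of scanned channels whose measured power exceeds the
    detection threshold, i.e. channels judged to be occupied."""
    occupied = sum(1 for p in power_dbm if p > threshold_dbm)
    return occupied / len(power_dbm)

# Pretend scan of 100MHz-1GHz in 1MHz steps (900 channels); a real
# survey would use a calibrated receiver, not random numbers.
random.seed(0)
scan = [random.uniform(-120.0, -60.0) for _ in range(900)]
print(f"Estimated occupancy: {occupancy(scan):.0%}")
```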

Underlay networks are the most technically ambitious option, and involve massively spread spectrum technologies like Ultrawideband (UWB) or opportunistic adaptive systems like cognitive radios that sense the spectrum around them and seek out unused portions. Here, the question is how much underlay networks will interfere with existing services. Simon Pike, chief engineer, Regulatory and Spectrum at Vodafone, was cautiously pessimistic. "UWB interference depends greatly on the type and location of the systems affected," he said, "but it's unlikely to be avoided completely. UWB could prevent the main licensed user of a band from improving the spectral efficiency of his system, and could restrict future developments that could have even greater benefits." He was also unimpressed with cognitive radios, which can sense their environment and location and then alter power, frequency and modulation in order to dynamically reuse available spectrum. "Cognitive radios can only identify transmitters, not receivers, so you get the hidden terminal problem [where one side of a link is inaudible to a sensor because it's too far away or shielded]. They're also potentially counter to liberalisation, as cognitive radios can only respond to signals they know about. Once a cognitive system is deployed, it's hard to change the use of a band to something those radios don't recognise".
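Pike's hidden-terminal objection is easiest to see against the basic sense-then-transmit loop a cognitive radio runs. The sketch below is schematic, not any real radio's firmware; the channel list, threshold and simulated sensing function are all hypothetical. The key point is that sense() can only register energy from nearby transmitters, so a distant or shielded receiver plays no part in the decision.

```python
import random

CHANNELS_MHZ = [450, 455, 460, 465]   # candidate frequencies (invented)
THRESHOLD_DBM = -100.0                # energy-detection threshold (invented)

def sense(channel_mhz):
    """Stand-in for an energy measurement on one channel. It can only
    register nearby transmitters; a quiet receiver further away is
    invisible, which is Pike's hidden-terminal problem."""
    return random.uniform(-120.0, -80.0)  # simulated power in dBm

def pick_free_channel():
    """Return the first channel that looks idle, else None."""
    for ch in CHANNELS_MHZ:
        if sense(ch) < THRESHOLD_DBM:  # looks idle from where we sit...
            return ch                  # ...but a hidden receiver may disagree
    return None                        # nothing free: back off

print(pick_free_channel())
```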

Overlay networks have the advantage that they use existing systems. Regulatory changes will still be required to make best use of them, since licensees are tightly constrained in what they can do with their systems even when a change would barely alter the spectrum actually used; such changes, though, may have minimal impact on other users. Cambridge researchers have been working on PROTON, a policy-based system for roaming transparently over overlaid networks. This tackles the problem of how a connection can seamlessly switch between networks with different characteristics (cellular, LAN, wireless WAN), ideally aggregating as many as are available to maximise throughput and quality of service. The problem boils down to three distinct areas: deciding what to do, doing it, and changing behaviour as the environment changes.

It's not good enough to have hard-wired rules of the kind mobile phones use to switch between cells, such as moving to the strongest signal or picking the highest bandwidth; instead, the nodes on the network must analyse their surroundings over time and decide how to connect based on a policy of what works best under various conditions. So a connection on GPRS that detects it's moving into a Wi-Fi zone might decide to negotiate a connection ahead of time, because it knows that switching over at the last minute involves a break of several seconds. Cambridge has a live wireless IPv6-based testbed for PROTON, using a combination of GPRS, 802.11b and whatever else the researchers can plug in. Whatever technologies go together to make the fourth generation of mobile services (4G, if you will), the thinking runs, there'll be many different networks in many different combinations; the big challenge will be to make them work together without any input from the user.
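As a rough illustration of the policy-driven selection described above, the sketch below scores visible networks and switches only when a candidate clearly beats the current connection, which leaves room to negotiate the handover ahead of time rather than at the last moment. It is a guess at the general shape of such a system, not PROTON's actual design; the network names, weights and thresholds are all invented.

```python
from dataclasses import dataclass

@dataclass
class Network:
    name: str
    bandwidth_mbps: float
    signal: float          # 0.0 (none) .. 1.0 (excellent)
    switch_cost_s: float   # expected connection break on switching

def score(net: Network) -> float:
    """Toy policy: prefer bandwidth and signal, penalise slow switches."""
    return net.bandwidth_mbps * net.signal - 2.0 * net.switch_cost_s

def choose(current: Network, visible: list[Network]) -> Network:
    """Switch only if a visible network beats the current one by a
    margin; a fuller system would start negotiating with the winner
    before the old link degrades (the GPRS-to-Wi-Fi case above)."""
    best = max(visible, key=score, default=current)
    return best if score(best) > score(current) + 1.0 else current

gprs = Network("GPRS", 0.04, 0.9, 0.0)   # currently connected
wifi = Network("Wi-Fi", 5.0, 0.3, 3.0)   # edge of a hotspot, weak so far
print(choose(gprs, [gprs, wifi]).name)   # stays on GPRS until Wi-Fi improves
```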

Regulators face an impossible task. For any territory, there are a range of incumbents already conducting business in spectrum they consider theirs — in some cases, having paid very large amounts of money for the privilege. There are also newcomers itching to get going, who consider it unfair that they are denied access to the airwaves because they have technologies that don't fit into the old idea that services had to be restricted to one band and one way of doing things. The incumbents have legitimate technical concerns that the new services might interfere with their existing expensive infrastructure and mess up their business models. They also have commercial concerns that the newcomers might be more efficient and cheaper, in ways that the incumbents would be prevented from emulating due to the terms of their licence. Also, they're happy to see competitors strangled at birth — as all companies are. That translates to an understandable willingness to paint the bleakest picture of the consequences of allowing new and less-regulated services.

These are only some of the conflicts that Ofcom has to judge. The debate is taking place in national, European and global contexts, and at each level different interests have to be understood and factored in. Radio waves don't stop at borders, and in a world where millions of people move between territories clutching mobile phones, wireless-enabled laptops and whatever comes next, the transmitters don't stay at home either.

"There are three ways to manage spectrum: command and control — the old way, market forces, and unregulated access for all," says Ofcom's Webb. His high level message is that licensed systems managed through spectrum usage rights provide a good level of technology neutrality, letting people chose the most appropriate way to provide a particular service. The real issues are where unlicensed systems want access to licensed spectrum.

"Can the market work it out?" he says. "With cognitive radio, this is possible. The cognitive radio people can talk to existing licence holders and sort things out between them." He doesn't believe it is possible for UWB. "We are examining the cost-benefit equation and our statutory duties for UWB. There are very many licence holders between 3 and 10 GHz, and UWB will have lots of uncoordinated users. The market can't fix that." One of question here is whether the size of the market created by UWB and the benefits it gives will be larger than any putative problems it causes to the cellular radio operators — who may have to install more base stations to overcome the extra noise caused by a sea of UWB devices. "We feel our remit lies more towards flexibility than certainty." says Webb.

Philippe Lefebvre, of the radio spectrum policy unit in the European Commission's directorate general for information society and media, says the European approach has the objectives of innovation, competitiveness with other markets, a balance between commercial and public interests, and a reduction in overall spectrum scarcity. He outlines two management models, starting with spectrum trading, where people are free to buy and sell bands. This is very advanced in the UK, Norway and Sweden, he says. For that to work, and to pre-empt future fragmentation across Europe, he wants to see a common definition of tradable rights and a convergence of trading in suitable bands.

The second model is the collective use of spectrum as a commons, and to this end his group is launching a tender to better understand the approach. This will include underlay use and sharing of licensed spectrum but, as at Ofcom, the emphasis is on further research to better understand the real-life implications of decisions.

Even where a particular regulatory approach has proved a remarkable success, it's hard to draw lessons. As ex-FCC regulator Michael Marcus points out, "GSM was the triumph of the old European regulators, a standard that was rigorously defined from top to bottom and became a worldwide success. Wi-Fi was the triumph of the American laissez-faire approach, a standard that evolved in spectrum that was almost completely unrestricted about what you could do and how, and it too became a worldwide success. Both approaches can work, both can fail."

The lesson of the conference is that there are no quick solutions to expanding the use of wireless for new services that come without a price. New technologies can appear far quicker than the regulators can sensibly respond to them, especially when they conflict with existing or other new services. The move is towards liberalisation, and the regulators are aware of the importance of not letting outdated legal concepts of radio act as too much of a brake on innovation. But nothing is so outré that it will simply be ignored: expect a pragmatic approach to spectrum liberalisation, biased where possible towards allowing new ideas to have their chance in the marketplace, and on the air.
