
Why 802.11n is a hard act to swallow

Current wireless data transmission techniques are coming up against fundamental physical limits. We explain why, and show how the upcoming 802.11n Wi-Fi standard fits into the overall picture.
Written by Rupert Goodwins, Contributor

Sometimes, propaganda works rather too well. As citizens living under the rule of Gordon Moore's famous law, technology consumers have been brought up to expect regular boosts in speed, delivered hand-in-hand with lower power consumption and greater convenience. Physics will not always comply, however. In the case of 802.11n Wi-Fi, designers are coming up against fundamental limitations in the way wireless works -- limitations that the marketing departments are keen to gloss over.

Until now, the amount of data you can pack into a radio signal or carrier has been limited by the cleverness of the modulation scheme -- the way the amplitude, frequency and phase of that signal are changed to reflect the data it's carrying. At the beginning of the 20th century, the very first wireless transmissions were data -- Morse code -- imposed at around two bits per second on an extremely wideband carrier, effectively white noise from a spark gap. A hundred years later, the original 802.11 modulated a megabit per second onto a highly pure microwave signal by shifting its phase between two states. Although remarkable for its precision, bandwidth and low cost, this was a direct descendant of techniques developed for wartime teletype systems.

802.11b and 802.11a/g
There have been two generations since the original 802.11: 802.11b, with a maximum data transfer rate of 11Mbps, and 802.11a/g, both with a maximum data transfer rate of 54Mbps. The 802.11b standard layered multiple techniques on top of each other -- a basic transmission scheme in which the carrier is shifted between four different phases, with the data further encoded into patterns that are particularly easy to tell apart from each other and to distinguish from noise. The 5GHz 802.11a and 2.4GHz 802.11g standards use orthogonal frequency-division multiplexing (OFDM), in which the radio channel is divided into multiple parallel sub-channels, with particular care taken to avoid interference between adjacent sub-channels and to spread the data stream across them so that localised interference causes as little damage as possible. Within this framework, four different modulation schemes can be used -- the better the signal, the faster the scheme.
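The arithmetic behind those headline figures is easy to check. Here is a back-of-envelope sketch in Python, using the standard 802.11a/g parameters of 48 data-carrying sub-channels and a 4-microsecond symbol time; each entry pairs one of the four modulation schemes with an error-correction coding rate.

```python
# Back-of-envelope check of 802.11a/g OFDM data rates.
# A channel carries 48 data sub-channels, each sending one symbol every
# 4 microseconds; the modulation (bits per symbol) and the error-correction
# coding rate together set the usable bit rate.

SYMBOL_TIME = 4e-6          # seconds per OFDM symbol, including guard interval
DATA_SUBCHANNELS = 48

schemes = {
    "BPSK, rate 1/2":   (1, 1/2),   # bits per sub-channel symbol, coding rate
    "QPSK, rate 1/2":   (2, 1/2),
    "16-QAM, rate 1/2": (4, 1/2),
    "64-QAM, rate 3/4": (6, 3/4),
}

for name, (bits, code_rate) in schemes.items():
    rate = DATA_SUBCHANNELS * bits * code_rate / SYMBOL_TIME
    print(f"{name}: {rate / 1e6:.0f} Mbps")
# Prints 6, 12, 24 and 54 Mbps -- four of the eight 802.11a/g rates,
# from the slowest to the 54Mbps headline figure.
```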

These latter techniques approach the theoretical limits for data across wireless: if you have a radio channel so many megahertz wide, there is a hard ceiling -- the Shannon limit -- on how much data you can put through it at a given signal quality. You can produce a faster network, but there are only two ways forward: more channels or wider channels.
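To put a rough number on that ceiling, the Shannon-Hartley theorem says capacity grows linearly with channel width but only logarithmically with signal quality. The sketch below makes the point; the 25dB signal-to-noise ratio is an illustrative assumption, not a figure from any standard.

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: the maximum error-free bit rate of a channel."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A single 20MHz channel at an assumed 25dB signal-to-noise ratio...
print(f"20MHz, 25dB: {shannon_capacity(20e6, 25) / 1e6:.0f} Mbps ceiling")   # ~166 Mbps
# ...doubling the channel width doubles the ceiling,
print(f"40MHz, 25dB: {shannon_capacity(40e6, 25) / 1e6:.0f} Mbps ceiling")   # ~332 Mbps
# ...while doubling the transmit power buys only a modest improvement.
print(f"20MHz, 28dB: {shannon_capacity(20e6, 28) / 1e6:.0f} Mbps ceiling")   # ~186 Mbps
```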

802.11n: more channels and wider channels
The 540Mbps 802.11n standard will use both ideas. As it's not possible to fit more channels side by side into the allocated spectrum of the two internationally available bands at 2.4GHz and 5GHz, 802.11n overlays multiple channels on the same frequencies, using multiple transmitters and receivers on separate antennas. Called MIMO (Multiple Input, Multiple Output), this works by exploiting the slight differences between the physical paths linking each pair of transmitting and receiving antennas to tell the signals apart. Once the network has worked out what these differences are, it can mathematically untangle the combined signals even though they share a frequency. In theory, the number of independent data streams grows with the number of antennas: with two transmitting and two receiving antennas there are four distinct paths between antenna pairs, enough to carry two full-rate streams over the same channel.

MIMO equipment such as Belkin's N1 Draft-N router overlays multiple channels on the same frequencies with multiple transmitters and receivers on separate antennas, exploiting slight differences in the physical paths between antenna pairs to distinguish the signals.
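A toy illustration of the untangling step: the sketch below (Python with NumPy) uses a made-up 2x2 channel matrix and the simplest possible detector, a zero-forcing matrix inversion. Real receivers estimate the channel from training symbols in each frame and use more robust detection, so treat this as a sketch of the principle rather than of 802.11n itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two transmit antennas send two different symbols at the same time,
# on the same frequency.
tx = np.array([1.0 + 0.0j, -1.0 + 0.0j])

# H models the four physical paths between antenna pairs; the slight
# differences in path length give each entry a different gain and phase.
# (Illustrative random values -- a real receiver measures these.)
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)

# Each receive antenna hears a mixture of both transmissions, plus a little noise.
noise = 0.01 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
rx = H @ tx + noise

# Knowing H, the receiver untangles the mixture by inverting the channel
# matrix (zero-forcing detection).
recovered = np.linalg.solve(H, rx)
print(np.round(recovered, 2))   # close to the original [1, -1]
```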

The other major change is to widen each channel: instead of the 20MHz channels used at the moment, 802.11n can use 40MHz channels to double the throughput again. Here, the physics is uncompromising: if each channel is twice as wide, only half as many fit into a given band. That has significant implications for existing users of those bands: there are far fewer clear channels to flee to.

The original 802.11 standard had around a dozen channels in the 2.4GHz band, with the exact number varying by country. The width of 802.11b/g transmissions cut this down to effectively three -- although you can still set your access point to any channel between 1 and 13, the channels are so broad that significant interference occurs unless physically adjacent transmitters are kept on channels 1, 6 and 11. Sorting out what happens with an 802.11n-style transmitter, where even two 40MHz channels in the band can interfere with each other, is one of the major problems facing the standard -- and one of the immediate practical issues with the pre-N and draft-N equipment currently being shipped.
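The channel arithmetic is easy to demonstrate. The following sketch treats each transmission as a simple fixed-width block (22MHz for 802.11b/g, 40MHz for a bonded 802.11n channel) centred 5MHz apart per channel number -- a simplification, since real spectral masks taper rather than stop dead -- and greedily picks channels that do not overlap.

```python
# Channel centres in the 2.4GHz band sit 5MHz apart, but each transmission
# occupies far more than 5MHz, so most channels overlap. This sketch counts
# how many can coexist cleanly, assuming simplified fixed-width channels.

def centre_mhz(channel):
    return 2407 + 5 * channel            # channel 1 = 2412MHz ... channel 13 = 2472MHz

def overlaps(a, b, width_mhz):
    return abs(centre_mhz(a) - centre_mhz(b)) < width_mhz

def non_overlapping(channels, width_mhz):
    chosen = []
    for ch in channels:
        if all(not overlaps(ch, c, width_mhz) for c in chosen):
            chosen.append(ch)
    return chosen

channels = range(1, 14)                  # channels 1-13, as allocated in Europe
print(non_overlapping(channels, 22))     # [1, 6, 11] -- the classic three clear channels
print(non_overlapping(channels, 40))     # [1, 9] -- barely two 40MHz channels, with no room to spare
```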

It's worth noting that MIMO by itself is nowhere near as unfriendly towards existing users of the band as channel widening, although it does concentrate much more energy on a single frequency, which can adversely affect receivers on adjacent channels. Many 802.11b/g wireless adapters are poorly designed to reject adjacent-channel interference, which raises awkward questions of responsibility if a previously reliable network starts misbehaving when a new neighbour appears on the band while sticking to the rules. The problems aren't insuperable, but while the standards committee works towards a mutually acceptable set of compromises, the market is pushing ahead regardless.

The future: intelligent wireless
The long-term solution is cognitive radio, a form of wireless communism in which each network node negotiates with the others in its area to take what it needs and give what it can. That is the subject of intensive technical, practical and political research, with battle lines being drawn between entrenched interests, revolutionary-minded inventors and the regulators. The 802.11n family of products will be the last major revision built on existing ideas, and it is perhaps not surprising that it is producing the most intense conflicts of interest yet.

The victims, as always, will be the population on the ground. It remains to be seen whether the wireless network marketplace will turn into an arms race, where each individual installation tries to get the most powerful and effective system regardless of the effect on its neighbours, or whether sanity, restraint and co-operation will prevail. Future historians of technology will see this period as one of fascinating change; the rest of us are cursed to live in 'N-teresting' times.

