Despite interminable delays, there's now a consensus that the forthcoming 802.11n wireless networking standard will finally materialise in March 2009. In the meantime, it's being touted as a cure-all for Wi-Fi's various shortcomings.
The standard promises to improve range and boost traffic throughput, from the existing 54Mbps (megabits per second) offered by 802.11g-based technology, to between 100Mbps and 300Mbps, depending on a range of factors.
The performance improvement arises because, while existing 802.11b- and g-based products can use only a single 20MHz channel in the unlicensed 2.4GHz radio spectrum, 802.11n products can use either one or two channels. In addition, says Mark Main, a senior analyst at Ovum, there is "a bit of space between the channels to increase the available radio spectrum".
"In the 2.4GHz spectrum, there's a maximum of 14 overlapping individual channels in most countries. Currently, any particular radio system can only use one. This has been one of the fundamental limitations to speed in wireless Lan systems, but 802.11n allows the channels to be bonded together. The idea is that, like water, you can get more through two pipes than just one and so this makes throughput quicker," Main explains.
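Main's two-pipes analogy can be sketched as simple arithmetic. The snippet below is a first-order illustration only, not figures from the standard: the function name and the 54Mbps single-channel baseline are my assumptions, and real 802.11n rates also depend on factors such as guard intervals and coding rates.

```python
# Illustrative first-order model (my own sketch, not from the 802.11n spec):
# the headline data rate scales with the number of bonded 20MHz channels
# and with the number of MIMO spatial streams.

def headline_rate_mbps(base_rate=54.0, bonded_channels=1, spatial_streams=1):
    """Rough estimate of headline rate from a 54Mbps single-channel baseline."""
    return base_rate * bonded_channels * spatial_streams

print(headline_rate_mbps())                                      # 54.0 (802.11g)
print(headline_rate_mbps(bonded_channels=2))                     # 108.0 (bonded 40MHz)
print(headline_rate_mbps(bonded_channels=2, spatial_streams=2))  # 216.0
```

In practice the scaling is not perfectly linear, but the sketch captures why bonding two channels is "like water through two pipes".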
Another advantage of the new standard relates to improvements in traffic handling, as it incorporates new scheduling mechanisms harvested from another 802.11 standard, namely 802.11e, which was ratified by the IEEE in late 2005. These mechanisms include Power Save Multi-Poll (PSMP), which helps improve quality of service for isochronous (that is, regularly timed, delay-sensitive) transmissions such as voice.
Support for multiple-input, multiple-output (MIMO) technology provides a further benefit. "11n products use MIMO technology to improve reception, which helps both with the ability of systems to overcome interference and improves speed by using multiple antennae, which is like using two ears instead of one," says Main. MIMO effectively creates as many links as there are combinations of receive and transmit antennas, so with three transmission and three reception antennas there are nine separate transmission paths, each capable of reusing the same frequencies as its fellows.
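The antenna arithmetic can be illustrated in a few lines of Python. This is a hypothetical sketch of the combinatorics only; the function is mine, not part of any Wi-Fi tooling.

```python
# Minimal sketch (my illustration): a MIMO link has one spatial path per
# combination of transmit and receive antenna, all sharing the same frequencies.
from itertools import product

def mimo_paths(n_tx, n_rx):
    """Enumerate every (transmit antenna, receive antenna) pairing."""
    return list(product(range(n_tx), range(n_rx)))

paths = mimo_paths(3, 3)
print(len(paths))  # 9 separate transmission paths for a 3x3 configuration
```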
But, as ever with these things, the situation is not quite as straightforward as it seems. In theory, 802.11n-based products will be able to operate in two radio bands: 2.4GHz and 5GHz, the latter of which is currently also used by 802.11a-based offerings, although adoption there is low. In practice, however, most vendors, including Aruba Networks, Broadcom and Intel, have to date focused on designing products for 2.4GHz.
The 5GHz band is a lot cleaner: its licensing restrictions were relaxed only recently, so it is free of the cacophony of unlicensed equipment, such as Bluetooth devices and microwave ovens, that has infested 2.4GHz over the years. The problem is that 5GHz has a much shorter range, which means performance has traditionally not been as good, because its higher-frequency radio waves are more easily absorbed by walls, people and the like.
But in mixed deployments, there is likely to be an issue with the more widely used 2.4GHz band. "Because 802.11n uses multiple radio channels, if it detects other radio systems, it's supposed to back off from being greedy because of the quest to be backwards-compatible. So if you have n systems in the presence of b or g systems, the new one will detect the presence of the old and back off from being a bad neighbour," explains Main.
This means that unless organisations either deploy n-based equipment in a greenfield site or are prepared to replace all of their existing access points with new ones, "it's very probable that 802.11n won't perform up to headline speeds" in the 2.4GHz spectrum.
Real world throughput for existing 802.11 products is currently about half theoretical speeds, with 11g-based technology, for example, generally providing 20 to 22Mbps rather than the promised 54Mbps. But a mix of such kit with 11n-based products is likely to result in performance being hit by as much as a factor of four, as the latter defaults to the lowest common denominator offering in the vicinity.
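Those figures can be put together in a back-of-envelope model. The 50 percent efficiency and the factor-of-four mixed-mode penalty are the article's estimates; the function itself is my own sketch, not a vendor or standards calculation.

```python
# Back-of-envelope model (my assumption, not a vendor figure): real-world
# throughput runs at roughly half the headline rate, and a mixed b/g
# neighbourhood can cut an 11n cell's performance by up to a factor of four.

def realworld_mbps(headline_mbps, efficiency=0.5, mixed_mode_penalty=1.0):
    """Effective throughput after protocol overhead and any legacy penalty."""
    return headline_mbps * efficiency / mixed_mode_penalty

print(realworld_mbps(54))                         # 27.0 (11g alone)
print(realworld_mbps(300))                        # 150.0 (clean 11n cell)
print(realworld_mbps(300, mixed_mode_penalty=4))  # 37.5 (11n among b/g kit)
```

On this rough model, an 11n network sharing 2.4GHz with legacy kit delivers little more than a clean 11g network would, which is the nub of Main's warning.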
"All the headline stuff about speed has to be taken with a pinch of salt. If you use b and g now, there'll be little improvement. Just because technology is theoretically capable of going faster doesn't mean to say that the user experience will match that, although it's most likely to happen in an environment where the situation is controlled," Main warns. As a result, he recommends operating equipment in the 5GHz spectrum.
Another area of concern, however, relates to the fact that, although 802.11n provides wider and more robust coverage than previous generations of wireless standards, this coverage is more variable because of the MIMO techniques it uses. These exploit multipath, an RF phenomenon in which reflected copies of the same signal arrive at different times and combine with the line-of-sight signal.
Ken Dulaney, a vice president at Gartner, in his July 2007 report Key Challenges Arise for 802.11n Deployments, describes multipath as "an unpredictable by-product of RF transmission" that is highly dependent on environmental factors such as the presence of walls.
Therefore, he indicates that "while 802.11n provides a great deal of flexibility and robustness", it also introduces complexity into the process of "predicting how each unit will perform". This means that layout will have to be considered carefully to optimise performance and reduce interference.
Another factor to bear in mind on the technical side is that, although the Wi-Fi Alliance is now certifying 802.11n products based on Draft 2.0 of the standard, there are no guarantees that another year or so's work on the standard will not lead to more alterations, which means that firmware in current purchases may need to be upgraded in the future.
There are also challenges on the business side of the equation, not least in terms of justifying the cost of upgrading. For example, says Craig Mathias, principal analyst at Farpoint Group, prices for enterprise access points will initially be about twice those of older kit, although he expects them to fall quite rapidly, while mobile devices such as PDAs are also likely to be more costly.
"If people want better performance from their 11n-based systems, they're almost certainly going to have to use products operating at 5GHz. But because devices will probably be dual-mode so that they can work at 2.4GHz too, they're also likely to be more expensive because they've got more technology in there," explains Main.
This means organisations will need to consider the legacy and business case for swapping out existing equipment, which may lead some to choose to stay with b and g if it's adequate for their needs.
Despite all of this, however, Mathias is convinced that 802.11n is the way forward and that it will be the primary Wi-Fi technology within five years. "11n is the last barrier to broad enterprise adoption of wireless Lans," he says. "A lot of people said that wireless is great, but the technology is changing rapidly, which led to fears that whatever they bought would be obsolete too quickly. But with n, the last serious objection goes away."
Taking the plunge
Because he does not expect to see any major changes to the draft standard over the next 18 months, Mathias recommends that organisations jump right in and take advantage of improvements in reliability, capacity, throughput and rate-versus-range performance. This is not least because he believes that 11n-based equipment will become the dominant wireless technology for running all types of applications, which means that by the first quarter of 2009, when the standard is released, "people won't buy anything else".
Dulaney is more cautious. In his report, he recommends that organisations feel free to install Draft 2.0 clients in new notebook computers now in order to improve their performance with existing wireless Lan infrastructure — although he points out that other types of device may not initially support the draft standard.
At the same time, Dulaney warns that enterprises with earlier versions of wireless Lan infrastructure equipment, such as access points, should not migrate those to 802.11n until the final specification is ratified and Wi-Fi certification is provided — and even then, they "should consider it a complete refresh of equipment rather than a gradual upgrade". The same does not apply to greenfield sites, however, which should evaluate whether to deploy Draft 2.0 equipment in lieu of existing alternatives.
Although Dulaney's colleague Phillip Redman, a research vice president at Gartner, indicates that "there isn't much demand in most industries" for 802.11n technology at the moment, over the next three to seven years, he expects to see it replacing existing wireless Lan technology, which will help boost market growth.
Gartner's 2007 global forecast for this market anticipates that expenditure will hit $1,700m (£858m) by 2010, a figure predicted to rise to $3,000m (£1,514m) as 802.11n displaces older kit, with industries such as healthcare, warehousing and manufacturing experiencing the "deepest uses".
But Rob Bamforth, a service director at Quocirca, is not so sure. He points out that nothing can ever be certain in today's complex technological world, as was evidenced by the VHS/Betamax wars in the video space of the 1980s — and that Wi-Fi is no longer the only game in town.
"Technology always gets smaller, faster and cheaper, but the big question is whether people need it. So as they increasingly put 3G data cards into their laptops and have the option of putting pico and femtocells around their sites, the question is which technology will be best suited to their needs. It all comes down to the business case in the end," he concludes.