802.11n - The consequences of abandoning the 5 GHz frontier


Summary: When 802.11b first started getting popular in late 2000, no one imagined that it would still be the most dominant standard 6 years later and continue to dictate the design of the latest wireless LAN products because it is the lowest common denominator.


When 802.11b first started getting popular in late 2000, no one imagined that it would still be the dominant standard six years later, dictating the design of the latest wireless LAN products as the lowest common denominator.  As with any technology that is first to reach critical mass, it becomes so successful that it becomes impossible to abandon no matter how limiting its design.  The unfortunate design limit that haunts even the latest 802.11n draft-standard devices is the choice of the 2.4 GHz radio band, which offers only three non-interfering channels: 1, 6, and 11.  802.11n needs two channels to operate at full speed but can't claim them because of legacy device considerations, so in 2.4 GHz 802.11n is capped at roughly half of its actual potential.
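That "half its potential" cap can be sanity-checked with some back-of-the-envelope draft-802.11n rate arithmetic.  The subcarrier counts and 4 µs symbol time below are the commonly cited draft-N values; the helper function itself is just my sketch, not any standard API.

```python
# Rough 802.11n PHY rate arithmetic (draft-N, long guard interval).
# A 20 MHz channel carries 52 data subcarriers; a "double-wide" 40 MHz
# channel carries 108 -- which is why losing the second channel costs
# 802.11n roughly half its peak speed.

def phy_rate_mbps(data_subcarriers, bits_per_subcarrier, coding_rate,
                  streams, symbol_us=4.0):
    """Peak PHY rate = subcarriers * bits * coding rate * streams / symbol time."""
    return data_subcarriers * bits_per_subcarrier * coding_rate * streams / symbol_us

# Two spatial streams, 64-QAM (6 bits per subcarrier), rate-5/6 coding:
rate_20mhz = phy_rate_mbps(52, 6, 5 / 6, 2)    # 20 MHz channel
rate_40mhz = phy_rate_mbps(108, 6, 5 / 6, 2)   # 40 MHz double-wide channel

print(round(rate_20mhz), "Mbps vs", round(rate_40mhz), "Mbps")  # 130 vs 270
```

The 130 and 270 Mbps figures match the headline rates the draft-N vendors advertise, and the ratio shows the point: without the second channel, you get less than half the throughput.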

Note:  Japan allows channel 14, but it isn't truly non-colliding because it's too close to channel 11.  It does open the possibility of using channels 1, 8, 14, 4, 11, and then 1 again, in that adjacent order, for 5 unique channels that mostly don't interfere with each other.  Outside of Japan it is possible to use 1, 8, 4, 11, and then 1 again for 4 mostly non-interfering channels.
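The channel math behind this is easy to check.  2.4 GHz Wi-Fi channels are centered 5 MHz apart (channel 14 in Japan sits off-pattern at 2484 MHz), but each 802.11b/g signal is roughly 22 MHz wide, so in practice two channels need about 25 MHz between centers to stay clear of each other.  A minimal sketch, with that 25 MHz rule as my assumption:

```python
# Why 2.4 GHz yields so few clean channels: centers are 5 MHz apart,
# but each ~22 MHz-wide signal needs roughly 25 MHz of center separation
# (five channel numbers) to avoid interfering with its neighbor.

def center_mhz(channel):
    """Center frequency of a 2.4 GHz Wi-Fi channel in MHz."""
    return 2484 if channel == 14 else 2407 + 5 * channel

def non_overlapping(a, b, min_separation_mhz=25):
    """True if two channels are far enough apart to avoid mutual interference."""
    return abs(center_mhz(a) - center_mhz(b)) >= min_separation_mhz

print(non_overlapping(1, 6))    # True  -- the classic 1/6/11 plan
print(non_overlapping(11, 14))  # False -- channel 14 is too close to 11
print(non_overlapping(1, 8))    # True  -- start of the tighter 5-channel plan
print(non_overlapping(4, 8))    # False -- why the 4-channel plan is only
                                #          "mostly" non-interfering
```

Running the 1, 8, 14, 4, 11 sequence through this check shows every adjacent pair separated by 35 MHz or more, which is why the Japan-only plan works as well as it does.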

802.11b was supposed to have given way to its sibling standard 802.11a, which operated in the 5 GHz range with a wide-open 8 (potentially 24) channels, a band free of consumer devices like 2.4 GHz cordless phones and free of microwave-oven leakage, which dwarfs even the strongest signal from an 802.11 device.  It even operated at roughly 4 times the speed, but it suffered from not being widely available at affordable prices.  802.11b achieved critical-mass success by 2001, and by 2002 the industry was moving towards a backward-compatible successor to 802.11b called 802.11g, which operated 4 times faster like 802.11a.  The problem was that in order to maintain backward compatibility, 802.11g also had to operate in the limited 2.4 GHz space, and worse, it had to fall back to 802.11b behavior if even one legacy 802.11b device joined the party.  This meant that four 802.11g client devices connected to an 802.11g access point would all have to switch to quarter-speed 802.11b mode if a fifth device showed up using 802.11b.  During this entire time, 802.11a-only devices withered on the vine and rarely left the shelves.

Most large enterprises knew about this spectrum limitation during the proliferation of 802.11g and went with the best of both worlds by purchasing dual-band devices that could support 802.11b, 802.11g, and 802.11a.  The access points had dual radios, one supporting 802.11b/g operation and the other dedicated to 802.11a operation, which allowed a single access point to behave as two.  The client devices only needed a single radio that supported either 2.4 or 5 GHz operation, but not both at the same time.  This way enterprises like major schools and hospitals could use a dense deployment of 802.11a for computer networking while a less dense 802.11b deployment was reserved for wireless VoIP phones.

On the consumer front, chipset makers like Atheros and their major consumer electronics partners like D-Link and Netgear would try to market dual-band devices for the high-end home networking gear market.  Like their dual-band enterprise cousins, the access points were essentially two access points in one which allowed the 802.11b/g wireless network to be used for computer networking and the 802.11a wireless network to be dedicated for interference-free high-definition video transmission applications.

But just as the dual-band market started hitting full stride, with prices falling due to more efficient design and manufacturing, it all came to a crashing halt in 2005 when the MIMO "pre-N" (implying 802.11n) craze hit.  MIMO, which ultimately became the basis of the 802.11n draft standard, had one critical marketing advantage that dual-band lacked: double the throughput for a single session.  Never mind that it couldn't sustain those higher speeds under most normal circumstances due to congestion in the 2.4 GHz band, and never mind that it could only handle one data transmission at a time; speed is a simpler concept to market than simultaneous data transmission with less frequency interference.  The superior dual-band devices and the wide-open 5 GHz frontier were effectively hijacked by the good-on-paper but bad-in-reality MIMO devices that were prone to interference.  The one bright spot for MIMO devices was improved range, though that could have easily been matched by dual-band access points with more powerful antennas and stronger transmit power.

One might wonder why the MIMO devices couldn't have been dual-band, and the answer is cost.  MIMO devices use multiple radios (two or three) to transmit and receive over the same radio frequency, a technique called spatial multiplexing, to achieve their higher speeds.  Since single-band operation already required multiple radios, adding a second set of MIMO-capable radios to the access point would have pushed already-high prices even higher.  The end result was that wireless LAN chipset and electronics companies jumped on the MIMO/802.11n bandwagon to be able to sell high-speed, higher-margin devices to consumers.  Meanwhile the enterprise market knew better, completely ignored the 802.11n draft craze, and stuck with dual-band 802.11a/b/g devices.
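Spatial multiplexing sounds like magic, but the core idea fits in a few lines: each radio transmits a different symbol on the same frequency at the same time, and the receiver separates the overlapping streams by inverting the channel matrix it has measured.  A toy sketch of the simplest "zero-forcing" receiver, with a made-up 2x2 channel, purely for illustration:

```python
# Toy spatial multiplexing: 2 transmit antennas, 2 receive antennas,
# SAME frequency.  Each receive antenna hears a mix of both streams;
# inverting the known 2x2 channel matrix H un-mixes them (x = H^-1 * y).
# The channel coefficients here are invented for the example.

def zero_forcing_decode(H, y):
    """Recover the 2 transmitted symbols from the 2 received samples."""
    (a, b), (c, d) = H
    det = a * d - b * c                      # H must be invertible
    inv = [[d / det, -b / det],
           [-c / det, a / det]]
    return [inv[0][0] * y[0] + inv[0][1] * y[1],
            inv[1][0] * y[0] + inv[1][1] * y[1]]

H = [[0.9, 0.3],        # how strongly each transmit antenna reaches
     [0.2, 0.8]]        # each receive antenna (made-up gains)
x = [1.0, -1.0]         # two different symbols sent simultaneously
y = [H[0][0] * x[0] + H[0][1] * x[1],    # what receive antenna 1 hears
     H[1][0] * x[0] + H[1][1] * x[1]]    # what receive antenna 2 hears

print(zero_forcing_decode(H, y))         # recovers approximately [1.0, -1.0]
```

The catch, as the paragraph above notes, is that every one of those radio chains costs money even though they all share one frequency, so doubling them up again for a second band was a price the consumer vendors wouldn't pay.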

So here we are today, stuck in the same 2.4 GHz funk that 802.11b made popular, and we don't have enough channels.  At CES 2007 there were some dual-band 802.11n devices that promise both 2.4 and 5 GHz operation, but they'll mostly be so expensive that they won't hit the critical mass needed to make 5 GHz popular.  This means consumers will be suckered into buying expensive 802.11n devices that don't deliver the speeds they were hoping for in real-world operation, don't handle two data streams in two different radio frequencies, and don't support interference-free 5 GHz operation.  Heck, I'm so sick of this whole mess that I'm sticking with dual-band 802.11a/b/g.

How could this mess have been avoided?  802.11n, in my opinion, should NEVER have permitted 2.4 GHz operation in the first place and should have used only the 5 GHz band.  To maintain backward compatibility, an extremely inexpensive 2.4 GHz radio could have been added for 802.11b/g compatibility alongside the two or three radios used for 5 GHz 802.11n operation on the access point.  The wireless client adapters would only need a single set of radios that could operate in 5 GHz 802.11n mode, 5 GHz 802.11a mode, 2.4 GHz 802.11g mode, or 2.4 GHz 802.11b mode for worst-case compatibility.  Had this been the standard for 802.11n, it would have added only a tiny increase in cost to the access points and almost no cost to the client adapters.  Furthermore, 802.11n devices could avoid all the congestion in the 2.4 GHz band and operate full time with double-wide 40 MHz channels, delivering the true promise of MIMO not only in theory but in practice, while legacy devices operate independently on a separate radio and frequency.  But even though the IEEE 802.11n standard doesn't mandate 5 GHz operation for MIMO devices, companies can still stop the 2.4 GHz madness and build the product I'm suggesting.  I and the entire analyst industry will publicly thank you for it.  Make your voice count and vote below.

[poll id=12]

[Update 11:45 AM - My friend Tim Higgins at smallnetbuilder.com suggested that allowing 20 MHz-only MIMO operation in 2.4 GHz wouldn't be such a bad thing, since that doesn't cause excessive interference and can still offer significant speed improvements.  I had considered that when thinking about this standards mess and concluded that it would force the MIMO-class radio to come out of 5 GHz mode to support legacy pre-N devices at 2.4 GHz.  That would make 2.4 GHz MIMO operation the lowest common denominator again, and we're right back to abandoning the 5 GHz spectrum.  We could certainly have two sets of MIMO-capable radios for 2.4 and 5 GHz operation, but that would spike the costs, and the wireless access point makers would mostly avoid it or offer very few dual-band capable devices.  My whole point is that the only way to force the migration to the wide-open 5 GHz band, where there are 12 channels with even more being opened up, is to forget about 802.11n 2.4 GHz mode; any legacy pre-N devices can continue to work at 802.11g.]

