For the last couple of weeks, I have been testing the latest 802.11ac Wi-Fi equipment from several networking vendors, including routers and range extenders.
Many of these new routers are expensive, at least by home networking standards, as they retail in the $200 and above range.
However, the performance that can be extracted from them can be significant, especially if you have client devices that can communicate at native speeds, using both 2.4GHz and 5GHz networks simultaneously.
Although wireless gigabit speeds are possible with these new devices, you also have to take into consideration the speed of your broadband connection, the end-to-end capabilities of the devices on your LAN that are streaming to your clients, and of course, the distance from the Wi-Fi transceiver(s) to the endpoint, as well as any potential wall(s)/obstruction(s) and channel interference.
Most Wi-Fi devices shipping today support 802.11n or, at a bare minimum, 802.11g. 802.11n communicates over the 2.4GHz and 5GHz bands, whereas 802.11g, an older standard, uses the 2.4GHz band exclusively.
The wireless performance of your Wi-Fi enabled device (and your router) is also going to depend on how many antennas it has and how many spatial streams it supports (MIMO).
802.11n "draft" devices began shipping in 2007, while second-generation 802.11n Wi-Fi routers have been shipping since 2009. The specification provides for either 20MHz or 40MHz channels. The 40MHz channel width permits data transmission at approximately twice the rate of 20MHz.
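To see where that "approximately twice" comes from, here's a rough sketch of the 802.11n PHY rate calculation: a 40MHz channel carries 108 data subcarriers versus 52 at 20MHz, so for the same modulation and coding the rate a bit more than doubles. (The subcarrier counts and MCS 7 parameters below come from the 802.11n specification; the function name is mine.)

```python
# Rough 802.11n PHY data-rate calculation (long guard interval),
# illustrating why a 40 MHz channel roughly doubles the 20 MHz rate.

DATA_SUBCARRIERS = {20: 52, 40: 108}  # usable data subcarriers per channel width (MHz)
SYMBOL_TIME_US = 4.0                  # OFDM symbol duration with an 800 ns guard interval

def phy_rate_mbps(width_mhz, bits_per_subcarrier, coding_rate, spatial_streams=1):
    """Data rate in Mbps: (data bits per OFDM symbol) / (symbol time)."""
    bits_per_symbol = DATA_SUBCARRIERS[width_mhz] * bits_per_subcarrier * coding_rate
    return bits_per_symbol * spatial_streams / SYMBOL_TIME_US

# MCS 7: 64-QAM (6 bits per subcarrier), rate-5/6 coding, one spatial stream
print(phy_rate_mbps(20, 6, 5/6))  # ≈ 65 Mbps
print(phy_rate_mbps(40, 6, 5/6))  # ≈ 135 Mbps
```

Each extra spatial stream (MIMO) multiplies the rate again, which is why a three-stream 40MHz router can advertise 450Mbps on a band whose single-stream rate is 135Mbps.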
Most, if not all, Wi-Fi routers currently on the market are configured the same way out of the box, whether it's the residential gateway your broadband provider installs on your premises, a 2.4GHz budget model you pick up at Wal-Mart, or the most expensive multi-antenna dual-band 802.11ac router aimed at high-definition video streaming and gaming.
They are all set to "auto" for channel width and "mixed" or "hybrid" on both the 2.4GHz and 5GHz bands, so that they attempt to negotiate with each client at the fastest data rate possible while providing the highest level of backward compatibility.
However, you can further optimize your 802.11n and 802.11ac performance if you set the network mode of the 2.4GHz and 5GHz bands to "802.11n-only" with a 40MHz channel width rather than "mixed" and "auto", although that will render older 802.11g devices unable to connect, if you still have any of those lying around. I do not, so I set my networks up that way.
All the mobile platforms -- save for one -- will negotiate at the higher data rate on the 2.4GHz frequency when the router is set to "802.11n-only" and 40MHz channels.
Take a wild guess which platform that is. I'll give you a hint, it's not Windows, Windows Phone, any version of Linux embedded into an IoT device or even Android.
In almost every single piece of router equipment I have tested for my own personal use or configured for friends and family, if you set the 2.4GHz band to "N-only" with a 40MHz channel width, your iOS device will simply fail to connect to the wireless network.
I don't care who makes the router in question, whether it is Linksys, Netgear, Asus, D-Link or some other random Chinese or Taiwanese company. It is always the same result.
Apple's iOS, which runs on the iPhone, iPad and Apple TV, cannot connect reliably to a 2.4GHz network with a 40MHz channel, period. Apple's own documentation says so, and the configuration isn't recommended for Mac OS X either.
In a large number of cases, I have actually had to force the 2.4GHz network to 20MHz channels, rather than use "auto", in order to make wireless networks work with iOS.
That degrades the data rate of any device communicating over that band, be it the nice new laptop you bought with fast, extended-range 802.11ac Wi-Fi built in, or your latest-generation Android or Windows Phone smartphone.
So that $200+ 802.11ac router you just bought? Yeah, not performing optimally now because you own an iPad Air or an iPhone 5s, or the latest-gen Apple TV.
Now, as a small consolation, iOS devices can communicate using both 20MHz and 40MHz channels on the 5GHz band. So just use Apple devices on 5GHz, right?
Well, no. The reason dual bands exist on 802.11n and 802.11ac is that the 2.4GHz and 5GHz frequencies have different performance characteristics.
The 2.4GHz band, while subject to greater interference, has greater range than the 5GHz band, although the bandwidth (data rate) of the 5GHz band is approximately double that of the 2.4GHz band.
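The range difference isn't magic: at a given distance, free-space path loss grows with frequency, so a 5GHz signal arrives several dB weaker than a 2.4GHz one before walls are even considered. A quick back-of-the-envelope check (the channel frequencies below are standard channel 6 and channel 36 center frequencies; the formula is the textbook free-space path loss equation):

```python
import math

# Free-space path loss in dB: 20*log10(d_m) + 20*log10(f_Hz) - 147.55
def fspl_db(distance_m, freq_hz):
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

# Same 10 m distance, two bands
loss_24ghz = fspl_db(10, 2.437e9)  # 2.4 GHz channel 6
loss_5ghz = fspl_db(10, 5.18e9)    # 5 GHz channel 36

print(loss_5ghz - loss_24ghz)      # ≈ 6.6 dB extra loss at 5 GHz
```

That gap holds at any distance, and real-world walls attenuate 5GHz even more aggressively, which is exactly why the 2.4GHz band still matters for coverage.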
So if your priority is range rather than bandwidth, unless you live in a one-bedroom apartment with few interfering walls, you're kinda screwed if you decide to use the 5GHz band exclusively with your iOS devices.
There's a workaround, which I ended up resorting to. You could buy a cheap access point for the iOS devices, set it to 20MHz on the 2.4GHz band with its own unique SSID, and isolate it to a specific channel number so you can identify it easily, while setting your nice expensive router to 802.11n-only with 40MHz (and 80MHz) channels on the respective bands.
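If that spare access point happens to be Linux-based, the same idea can be sketched as a hostapd configuration; the interface name, SSID, and channel here are placeholders I chose for illustration, not settings from any particular product:

```
interface=wlan0
ssid=Legacy-2.4-iOS     # separate SSID so the iOS devices are easy to spot
hw_mode=g               # 2.4 GHz band
channel=6               # pinned to a fixed channel for easy identification
ieee80211n=1            # 802.11n enabled...
ht_capab=               # ...but no [HT40+]/[HT40-], so the AP stays at 20 MHz
```

The key line is `ht_capab`: leaving out the HT40 capability flags keeps the AP at a 20MHz channel width, which is the configuration iOS devices reliably join.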
Or, maybe Apple could just fix their Wi-Fi implementation.
Does Apple's stone-age Wi-Fi implementation throw you into fits of rage? Talk back and let me know.