Wireless is a unique technology. Despite being over a century old, it continues to improve at an ever-increasing rate. Yet all the past, present and future improvements stem from one underlying process: better engineering leading to more precise use of space and time.
A set of new technologies is just coming into early use: 802.11ac promises a gigabit per second from a single access point; LTE-A is cutting a path to full mobile broadband integrated with direct local device-to-device communications; and smart spectrum reuse is easing the bandwidth crunch. Further out, the promise of terabit systems combines with innovative reuse of existing ideas to provide more services further afield than ever before.
The first wireless signals to see practical use were Morse code broadband blasts of spark-generated noise, interfering with any other signal within range. The invention of tuning allowed multiple signals to share the spectrum; better antennas meant the same frequencies could be reused without mutual interference; amplitude and frequency modulation meant more information could be carried on each signal.
The biggest break for wireless, as for all electronics, was the triumph of the transistor over the valve. From the 50s onwards, Moore's Law gave, and continues to give, engineers the ability to do more and more work on signals at lower power and lower cost, and wireless has benefited particularly from that process. All modern techniques such as 802.11ac, LTE and 60GHz rely on combining multiple channels across frequencies and spatial paths; processing multiple channels at once is an ideal task for today's massively parallel giga-transistor chip architectures.
This also ties in well with SDR, or Software-Defined Radio, which relies on very fast processors to replicate mathematically the sort of signal processing that dedicated electronic circuits used to do. A single SDR can thus manage multiple standards just by changing its programming, leading to predictions that one chip could handle the three main classes of wireless — PAN, LAN and WAN (Personal, Local and Wide-area Networking). As yet, however, this remains less economic than keeping some separation of services.
The next generation of general-purpose wireless LAN is 802.11ac, due to be finally approved in 2014. It builds on ideas that saw first deployment in its predecessor, 802.11n, which introduced MIMO (Multiple Input, Multiple Output) to the mass market. By running multiple transmitters and receivers on the same channel, each with its own antenna, MIMO exploits the tiny timing differences in the paths between each transmitter/receiver pair to create parallel spatial channels.
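The principle behind spatial multiplexing can be sketched in a few lines. This is a toy model with made-up path gains, not anything from the 802.11 standards: two antennas transmit independent symbol streams, the receive antennas each hear a mix determined by the channel matrix, and a receiver that knows the channel can invert that mix to separate the streams again.

```python
import numpy as np

# Toy 2x2 MIMO spatial-multiplexing sketch (all values hypothetical):
# distinct physical paths between antenna pairs give the channel
# matrix H full rank, which is what makes the streams separable.
rng = np.random.default_rng(42)

H = np.array([[0.9, 0.3],      # path gain from each Tx antenna...
              [0.2, 1.1]])     # ...to each Rx antenna

tx = rng.choice([-1.0, 1.0], size=(2, 8))   # two BPSK streams, 8 symbols each
rx = H @ tx                                  # what the two receive antennas hear
recovered = np.linalg.inv(H) @ rx            # zero-forcing: undo the mixing

assert np.allclose(recovered, tx)            # both streams recovered intact
```

Real receivers face noise and use estimators rather than an exact inverse, but the doubling (or, in 802.11ac, up-to-eightfold multiplying) of capacity comes from exactly this separation of paths.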
The 802.11n standard specified up to four parallel spatial channels, with individual channels set to a maximum of 40MHz bandwidth; 802.11ac increases that to eight parallel channels of 80MHz, with 160MHz optional. It also uses slightly more efficient ways to code the data onto the transmission channel; however, these are so close to the theoretical maximum — the Shannon Limit — that future improvements will have to come from wider channels and more of them.
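A quick back-of-envelope check shows why the standard chases bandwidth rather than cleverer coding. The Shannon Limit gives channel capacity as C = B·log2(1 + SNR); the 25dB SNR used here is an assumed, illustrative figure, not one from the specification.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon channel capacity C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Capacity scales linearly with channel width, so quadrupling the
# channel from 40MHz to 160MHz quadruples the per-stream ceiling.
for bw_mhz in (40, 80, 160):
    c = shannon_capacity_bps(bw_mhz * 1e6, snr_db=25)   # assumed SNR
    print(f"{bw_mhz}MHz @ 25dB SNR: {c / 1e6:.0f} Mbps per spatial stream")
```

At this assumed SNR a single 160MHz stream lands above a gigabit per second, and eight such spatial streams explain where the multi-gigabit headline figures come from.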
One place where that will be happening is on 60GHz, which is the third mainstream band to be made available to Wi-Fi after the original 802.11b 2.4GHz and 802.11a/n/ac 5GHz. Although the exact frequency allocation varies from country to country, the recommended standard has four channels, each with 2.16GHz bandwidth. In place of MIMO, spatial channels are created with beamforming or AAS — Adaptive Antenna Steering. Individual antennas at 60GHz are very small, barely a couple of millimetres long, so densely populated arrays can be easily built and configured to create dynamic, tight beams that track moving devices. Known as 802.11ad and promoted by the Wireless Gigabit Alliance (which is due to merge with the Wi-Fi Alliance), 60GHz Wi-Fi is specified to provide 7Gbps, although only at ranges of up to 10 metres and not through walls or windows. Beyond that, the standard will fall back to 802.11ac or slower.
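The millimetre antenna sizes follow directly from the frequency, since a classic dipole element is half a wavelength long and wavelength is the speed of light divided by frequency:

```python
# Back-of-envelope antenna sizing at 60GHz.
c = 299_792_458           # speed of light, m/s
f = 60e9                  # 60GHz carrier
wavelength_mm = c / f * 1000
half_wave_mm = wavelength_mm / 2
print(f"wavelength = {wavelength_mm:.1f}mm, half-wave element = {half_wave_mm:.1f}mm")
# prints: wavelength = 5.0mm, half-wave element = 2.5mm
```

A 2.5mm element is why dozens of antennas fit on a single chip package, which is what makes the tightly steered beams practical.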
Currently, the highest frequencies in use experimentally are in the range of 240GHz, where the Fraunhofer Institute and other German researchers on the Millilink project have squeezed 40Gbps across a distance of a kilometre. They expect to use multiple channel technologies to get that up to the terabit range, making the technology suitable to replace fibre-optic links and provide last-mile connections to homes and offices. Unlike fibre, however, its performance degrades sharply in heavy rain.
The world is gradually getting to grips with the first round of LTE, which can provide up to around 100Mbps using close to twenty different frequency bands around the globe. Although widely sold as '4G', it's not quite there: LTE-Advanced, or LTE-A, is the umbrella of ideas that are intended to provide the proper fourth generation of mobile broadband.
Many of these ideas parallel those in 802.11ac: eight-way MIMO, ranging up and down different bands as appropriate, automatic configuration and bandwidth management, and advanced coding methods. A single full-spec base station will be able to provide upwards of 10Gbps, divided into different sectors; LTE-A has also absorbed the bones of the failed first-generation WiMAX point-to-point standard.
The first LTE-A networks are being deployed, but without the full spectrum of advances under development. Other ideas are key to making LTE-A work alongside existing mobile infrastructure in new ways.
The classic model of cellular wireless, with a relatively small number of high-capacity masts, doesn't scale up well to high densities of users consuming large amounts of bandwidth.
The expected solution is small cells — the generic term for transmitters serving from five to around 250 users. The smallest variant, the femtocell, is currently deployed to fill in coverage within a single household, with picocells intended for offices and mini- or metro-cells extending to campuses. Backhaul — the connection from the cell to the rest of the telephone system — typically piggybacks on the building's existing internet connectivity.
The major issues with deploying enough small cells to provide extra bandwidth are in many cases as much political and commercial as technical. Frequency allocation and non-interference with other cells in the area can be taken care of through sensing of band occupancy, central database co-ordination and direct negotiations between cells in a Self-Organising Network (SON) architecture. However, the question of who pays for the backhaul bandwidth becomes very involved once a small cell is opened up to general public use, as opposed to whitelists of registered handsets. Detecting, characterising and alleviating interference when things go wrong also has high potential to employ more lawyers than engineers — a trait wireless has exhibited throughout its existence.
LTE-D — 'Direct' if you're Qualcomm; more generally, Proximity Services or Device-to-Device communication — is a proposal that sits at the intersection of PAN and WAN. It uses LTE-compatible protocols and frequencies, but for direct device-to-device communication. For 64 milliseconds in every twenty seconds, each LTE-D compatible device broadcasts who it is and what it's prepared to do with others. Each device thereby builds up a constantly updated map of contactable devices in its area, and can establish contact via the local LTE base station without the need for phone calls or explicit addressing by the user.
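The figures above imply that discovery costs very little airtime, which can be checked with simple arithmetic: 64 milliseconds of beaconing in every 20-second period.

```python
# Discovery overhead implied by the numbers above: 64ms every 20s.
beacon_ms = 64
period_s = 20
duty_cycle = (beacon_ms / 1000) / period_s
print(f"discovery duty cycle: {duty_cycle:.2%}")
# prints: discovery duty cycle: 0.32%
```

At roughly a third of a percent of airtime, devices can announce themselves continuously without meaningfully denting either battery life or channel capacity.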
White space is the most recent innovation in wireless WAN, although it too shows convergent evolution with other, more conventional, systems.
Overlaying (literally) all existing frequencies, white space uses intelligent networks to sense, analyse, configure and use channels that may be allocated to other services but which aren't actually in use. This has the potential for high efficiencies, long ranges and high bandwidth (although not all three at once), which has led to a lot of hype and hopeful posturing. The most likely applications may be rural broadband (although more traditional options are usually available there) and low-speed, high-density Internet-of-Things applications where meters and sensors report back a few tens or hundreds of bits per second on average to a central controller, and accept similarly slow commands back.
White space does lend itself quite well to developing nations, where there are far fewer existing occupants of allocated bands and, often, significant rural populations in regions of little existing infrastructure. One of the more arresting concepts is Google's 'SkyNet', which proposes to use tethered blimps and white space systems to illuminate large areas with affordable wireless broadband. Flying broadcast transmitters are not a new idea, dating back to at least the 1920s; what's new is the combination of low power, high bandwidth and physical lightness in long-duration unmanned lighter-than-air platforms.
Further out still
Automated transport, from self-driving cars to warrior drones on the front line, will make even more demands on wireless infrastructure, and will require much greater reliability and guaranteed connectivity. Research is looking at reusing existing infrastructure in new ways: a mobile phone network can be used as one component in a radar system, for example; or very dense peer-to-peer networks using short, extremely fast high-bandwidth packets can provide a secondary underlying connectivity matrix that's very robust and resistant to congestion or interference. It will take heroic levels of reliable global bandwidth for people to accept drone passenger aircraft, for example, but the components are being put in place.
Wireless hasn't finished changing the world.