From 3D chips and carbon-based transistors, to photonics and memristors...
The computer industry is not known for standing still. Moore's Law, coined by Intel co-founder Gordon Moore, famously states that the number of transistors that can be inexpensively packed onto a chip doubles roughly every two years - a principle that has held true since at least 1975.
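As a rough illustration of what that doubling rule implies - using hypothetical starting figures, not Intel's actual numbers - the projection can be sketched in a few lines of Python:

```python
def transistor_count(start_count, start_year, year, doubling_period=2):
    """Project a transistor count forward, assuming the count doubles
    once every `doubling_period` years (Moore's Law's rough rule)."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Hypothetical example: a chip with 1 million transistors in 1990
# undergoes ten doublings over the following 20 years.
print(round(transistor_count(1_000_000, 1990, 2010)))  # → 1024000000
```

Ten doublings multiply the count by 2^10 = 1,024, which is why two decades of Moore's Law turn a million-transistor chip into a billion-transistor one.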
Silicon transistors underpin the whole of modern electronics. With billions on a modern integrated circuit, they are the tiny binary switches that, working in unison, enable our machines to perform complex calculations at such high speeds.
Invented at Bell Laboratories in 1947, early transistors were discrete electrical components measuring more than a centimetre across. In the decades since, transistors have been getting smaller and smaller, moving from separate components placed on circuit boards to being fabricated directly on silicon wafers as part of the integrated circuits upon which our digital infrastructure is built.
Today, Intel sells a microprocessor with two billion transistors crammed onto it - each one measuring just 65 nanometres (nm) wide. A nanometre is one billionth of a metre. Even smaller transistors - 45nm and 32nm wide - have also been commercialised. Intel is not stopping there either: it has 22nm transistors in its pipeline, and is sizing up even smaller gates in the years ahead.
However, increased transistor density is dogged by questions about how to manage heat extraction. As more and more energy is required to power the chips, the amount of heat generated also rises, requiring chipmakers to come up with novel ways to extract it or risk chip and device malfunction.
Shrinking the size of transistors to pack more onto a chip has been the mechanism driving the computer industry for decades, enabling smaller, faster and more powerful hardware to appear every few years. But Moore's Law as it stands can't go on indefinitely. Miniaturisation can only go so far before it reaches the ultimate, atomic limits.
Molecular-scale production could become possible in the next decade, allowing individual atoms to be precision-placed to build transistors measuring just a few nanometres across. Even so, there comes a point where scaling silicon effectively bottoms out, owing to problems such as electron tunnelling - a quantum mechanical phenomenon that makes transistors with gates less than 5nm wide unreliable.
With the current silicon-based transistors heading for the atomic end of the line, new techniques and technologies will be needed to underpin tomorrow's computing hardware.
Creating 3D chips
One idea being examined to extend Moore's Law is to layer microprocessor cores on top of each other, increasing the density of transistors by building 3D chips rather than scaling down individual transistors.
Such 3D chips would be made by stacking layers of silicon circuits on top of each other, joined using vertical copper interconnects, known as through-silicon vias (TSVs).
However, stacking silicon is no layer-cake walk. TSV-based designs face many challenges, including heat extraction, chip bulkiness and mechanical stress, and the need for the industry to agree on standards so that 3D chips can be designed and built in existing fabs.
Stan Williams, founding director of the Information and Quantum Systems Lab at HP Labs, is rather downbeat about the prospects for 3D silicon chips, describing them as ...