# Quantum Computing - a step nearer?

Quantum computing, one of the 20th century's oddest ideas, has taken another step towards reality. If reality is the right word: quantum computing (QC), which promises massive performance gains over current computer designs, lives in the same shadowy world of interlinked probabilities and counterintuitive behaviour as do the electrons that make it work.

QC depends on a key property of electrons: they can hold energy only in discrete chunks called quanta. An electron buzzing around an atom with a certain amount of energy won't change its behaviour when you add a little more energy; only once you've added just enough does it jump to a higher energy state. In computing terms, these two energy states can represent 0 and 1, the fundamental states of binary arithmetic and the basic vocabulary of processing.
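
That "all or nothing" jump can be sketched as a toy model - illustrative numbers and function names only, not real atomic physics:

```python
# Toy model of a two-level system: the electron stays in state 0
# unless it is offered at least the full energy gap, then jumps to 1.
ENERGY_GAP = 1.9  # hypothetical gap in eV, not any real atom

def absorb(state: int, energy: float) -> int:
    """Return the electron's new state after offering it `energy`."""
    if state == 0 and energy >= ENERGY_GAP:
        return 1  # enough energy: jump to the higher state, i.e. bit = 1
    return state  # too little energy: nothing happens at all

print(absorb(0, 1.0))  # not enough energy -> stays 0
print(absorb(0, 1.9))  # exactly one quantum -> jumps to 1
```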

If that were all electrons could do, they'd be useful only for memory. However, electrons also interact with each other and with other subatomic particles in subtle yet predictable ways. For example, particles have a property called spin, which is almost but not quite totally unlike anything we'd ordinarily call spin. Under the right conditions, one particle's spin can be affected by its neighbours: the spin of a carbon atom's nucleus might flip if the spin of a nearby hydrogen atom's nucleus is in the right state. That conditional behaviour gives a quantum system the ability to make decisions, analogous to the logic gates in classical computers.
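
That conditional flip has the same truth table as a controlled-NOT gate. A minimal classical sketch (the names are illustrative, not taken from the research described; the real gate also acts on superpositions, which this ignores):

```python
def controlled_flip(hydrogen: int, carbon: int) -> tuple[int, int]:
    """Flip the carbon spin only if the hydrogen spin is 1 (CNOT logic)."""
    if hydrogen == 1:
        carbon ^= 1  # XOR with 1 flips the bit
    return hydrogen, carbon

# The full truth table: the carbon bit flips exactly when hydrogen is 1.
for h, c in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((h, c), "->", controlled_flip(h, c))
```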

The most curious aspect of QC appears when part of the system is in an undetermined state. If you give an electron at 0 some energy, but not enough to move it to 1, it acquires characteristics of both 0 and 1. This quantum bit, or qubit, can be thought of as simultaneously 0 and 1 - and if you have two qubits, they can represent all four possible combinations of two bits at the same time. With 16 qubits, they represent all 65,536 numbers from 0 to 65535 at once. Calculations with qubits produce all possible results simultaneously, effectively by creating enough parallel universes to contain them all at the same time.
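
The arithmetic behind that claim can be made concrete: an n-qubit register is described by 2^n complex amplitudes, one per possible number. A minimal NumPy sketch of a 16-qubit register in an equal superposition (an illustration, not any particular lab's system):

```python
import numpy as np

# An n-qubit register in equal superposition: 2**n amplitudes, each with
# the same weight, so every number from 0 to 2**n - 1 is present at once.
n = 16
state = np.full(2**n, 1 / np.sqrt(2**n), dtype=complex)

print(len(state))  # 65536 basis states, the numbers 0..65535
print(np.isclose(np.sum(np.abs(state) ** 2), 1.0))  # probabilities sum to 1
```

Doubling the register to 32 qubits would square the number of amplitudes to over four billion, which is the source of the promised performance gains.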

That wouldn't be particularly useful unless you could winnow out the right answer, and a great deal of thought has gone into how. One approach is to give the system an electromagnetic shake that encourages the right answer to become gradually more and more probable. At the right moment the system is observed, and this act, through a process called decoherence, causes all possible answers except one to silently vanish. (We told you this was one of the century's oddest ideas.)
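
One well-known form of such a "shake" is Grover-style amplitude amplification - an assumed illustration here, since the article doesn't name a specific algorithm. Each round flips the sign of the marked answer's amplitude and reflects all amplitudes about their mean, making the right answer steadily more probable:

```python
import numpy as np

# Amplitude amplification over 16 possible answers, one of them correct.
N, marked = 16, 11                 # which answer is "right" is arbitrary here
amps = np.full(N, 1 / np.sqrt(N))  # start in an equal superposition

for _ in range(3):                 # about (pi/4) * sqrt(N) rounds is optimal
    amps[marked] *= -1             # oracle step: mark the right answer
    amps = 2 * amps.mean() - amps  # diffusion step: invert about the mean

# After three rounds the right answer dominates the probabilities.
print(round(float(amps[marked] ** 2), 3))
```

Observing the register at this point yields the marked answer with high probability; look too early or too late and the odds are worse.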

You will be unsurprised to hear that there is a multitude of problems in turning this idea into practical use. One is that any interaction between the system and the outside world can cause decoherence, making quantum calculations tricky to set up and control. Another is that the quantum states of the constituent particles are incredibly fragile and tend to change over time, introducing errors that were once thought to be uncorrectable. However, researchers at Los Alamos and the Massachusetts Institute of Technology (MIT) have just announced that they've found a way to spread a qubit over three separate nuclear spins in one molecule. Any change in the qubit can then be detected without learning its actual state - and thus without forcing decoherence - because the researchers need only check whether any one spin has become different from the rest. Any such change can then be fixed.
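
The three-spin trick can be sketched classically as a repetition code: parity checks between the copies reveal *which* copy changed without revealing the value the copies encode. A toy sketch (the real scheme protects quantum superpositions, which this classical model ignores):

```python
def syndrome(bits):
    """Parity checks between neighbouring copies.

    The result depends only on which copies differ, never on the
    encoded value itself - so reading it forces no decoherence.
    """
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Use the syndrome to find and repair a single flipped copy."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    if flip is not None:
        bits[flip] ^= 1
    return bits

print(correct([0, 1, 0]))  # middle copy flipped -> restored to [0, 0, 0]
print(syndrome([1, 1, 1]) == syndrome([0, 0, 0]))  # value-blind: True
```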

This follows many other announcements over the past two years that have seen quantum computing move from a tiny effect observed in single-electron systems held close to absolute zero to experimental success with room-temperature collections of ordinary molecules. There remain many obstacles before Intel need feel threatened, but to date the speed of development has comfortably outpaced the ability of detractors to think up new ways of saying it will never work. Expect the unexpected.