Quantum mechanics (QM) is one of modern physics' most successful theories. Since it was established in a blaze of intellectual discovery during the first half of the last century, it has proved able to predict and describe a huge range of phenomena. Although its influence on IT is less than might be imagined — most semiconductor theory could be expressed in older, classical terms — we are rapidly advancing into technologies where QM is essential.
Of late there have been plenty of advances to report, some of them already out of the labs and into commercial products. Quantum encryption — where any attempt to copy data is impossible to hide — has already reached the second generation of products from companies such as id Quantique and MagiQ Technologies, with established names like NEC and Toshiba also demonstrating successful systems working on a wide range of data types.
Quantum encryption works thanks to Heisenberg's Uncertainty Principle, which declares rigidly defined areas of doubt and uncertainty — the better you measure one aspect of a quantum object, the less you can know about another. Photons of light have a polarisation orientation, the light waves either going straight up and down or slanted at plus or minus 45 degrees — and data can be encoded in them simply by deciding that up is one and down is zero, or similarly with left and right slants.
But, says Heisenberg, if you measure a photon looking for whether it's up or down, you'll get a result — but you won't be able to tell from that measurement whether the photon really was up or down, or whether it was in fact slanted. Some of your results will be accurate, and some wrong. Quantum encryption therefore establishes a second channel between sender and receiver: as well as encoding the data on the photons in the first stream, the sender tells the receiver in an independent message which photons were slanted and which were up/down — but not what data they carried.
Thus, the receiver knows which of its incoming photons have been correctly received and can discard those which were wrong. Any attempt to monitor the traffic — which means the photons must be destroyed and regenerated — will create a recombined data stream in which the polarisation orientation is randomly wrong, and with it some of the values. Statistical analysis of the data stream by the intended recipient will show a characteristic divergence, and the data can be discarded as untrustworthy.
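The sift-and-compare logic described above can be sketched as a toy simulation. This is an illustration only, not any vendor's implementation: the function name is invented, the physics is reduced to coin flips, and a hypothetical intercept-and-resend eavesdropper stands in for "monitoring the traffic". With no eavesdropper the sifted bits agree perfectly; with one, roughly a quarter come out wrong, which is exactly the statistical divergence the recipient looks for.

```python
import random

def bb84_error_rate(n_photons=20000, eavesdrop=False, seed=1):
    """Toy simulation of basis statistics in a BB84-style exchange.

    Bases: 0 = rectilinear (up/down), 1 = diagonal (+/-45 degrees).
    A hypothetical eavesdropper must destroy and regenerate each
    photon; half the time she picks the wrong basis, randomising
    results and leaving errors in the sifted key.
    """
    rng = random.Random(seed)
    errors = kept = 0
    for _ in range(n_photons):
        bit = rng.randint(0, 1)               # sender's data bit
        alice_basis = rng.randint(0, 1)       # sender's random basis
        photon_basis, photon_bit = alice_basis, bit
        if eavesdrop:
            eve_basis = rng.randint(0, 1)
            # Wrong basis gives a random result; photon is re-sent in her basis.
            eve_bit = photon_bit if eve_basis == photon_basis else rng.randint(0, 1)
            photon_basis, photon_bit = eve_basis, eve_bit
        bob_basis = rng.randint(0, 1)         # receiver's random basis
        bob_bit = photon_bit if bob_basis == photon_basis else rng.randint(0, 1)
        # Sifting: the sender's second-channel message lets the receiver
        # keep only photons measured in the basis the sender used.
        if bob_basis != alice_basis:
            continue
        kept += 1
        if bob_bit != bit:
            errors += 1
    return errors / kept

print(f"no eavesdropper:   {bb84_error_rate():.3f}")
print(f"with eavesdropper: {bb84_error_rate(eavesdrop=True):.3f}")  # ~0.25
```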
In practice, this method is used to distribute keys, with the receiver telling the sender which photons were correctly received, thus synchronising the keys at both ends. A truly secure key distribution system removes the reliance on the security of an encryption algorithm: once you can rely on your keys, you can use the simplest of mathematics to combine key and data in an unbreakable way. This all works, is already deployed for financial and government users, and is being actively developed to work over long distances and over wireless. Radio signals are photons too.
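The "simplest of mathematics" alluded to here is the one-time pad: XOR the data with a key of the same length, and XOR again with the same key to decrypt. A minimal sketch, in which the key comes from the operating system purely for illustration (in a quantum system both ends would already share it):

```python
import secrets

def one_time_pad(data: bytes, key: bytes) -> bytes:
    """XOR each data byte with the matching key byte.

    Information-theoretically unbreakable, provided the key is truly
    random, at least as long as the data, and never reused.
    """
    if len(key) < len(data):
        raise ValueError("one-time pad key must be at least as long as the data")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))       # stand-in for a quantum-distributed key
ciphertext = one_time_pad(message, key)
recovered = one_time_pad(ciphertext, key)     # XOR is its own inverse
print(recovered)  # b'attack at dawn'
```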
Quantum computing is further away from real life, however. It uses superposition, a related aspect of QM, which states that before you measure an aspect of a quantum system that system exists in all possible states simultaneously. Each state is superimposed on the others. When you measure the state, the other possibilities vanish — the state is said to collapse — but until that point, there are multiple parallel realities. By maintaining a situation where those multiple parallel realities can interact with each other, you can simultaneously generate all possible outcomes. Encode an algorithm into the quantum system, and instead of having to laboriously work through all the possibilities in turn you create all the answers at once. When you look at the system, it collapses and you're left with a result which is probably right. Do it enough times, and you can be sure.
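The "probably right, repeat until sure" step can be illustrated with a toy model. Assume, hypothetically, a quantum routine whose collapsed answer is correct two times out of three; a simple majority vote over repeated runs makes the final answer all but certain:

```python
import random

def noisy_quantum_run(correct_answer, rng, p_correct=2/3):
    """One measurement collapse: right answer with probability p_correct.
    Answers are single bits here, purely to keep the sketch simple."""
    return correct_answer if rng.random() < p_correct else 1 - correct_answer

def majority_vote(correct_answer, runs=501, seed=7):
    """Repeat the probabilistic computation and take the majority answer.
    The chance of the majority being wrong shrinks exponentially in runs."""
    rng = random.Random(seed)
    ones = sum(noisy_quantum_run(correct_answer, rng) for _ in range(runs))
    return 1 if ones * 2 > runs else 0

print(majority_vote(1))  # 1
print(majority_vote(0))  # 0
```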
There are plenty of problems here, perhaps the biggest being the small detail that if the system interacts with anything else during the process it will prematurely collapse, a process known as decoherence. Nevertheless, quantum computing has been demonstrated — most famously in 2001, when IBM ran a small seven-qubit experiment that factored the number 15 using a process called Shor's Algorithm. This has profound implications for classic encryption, which relies heavily on such factorisation being impracticable for large numbers, but it also opens up many fascinating areas for mathematical analysis of systems where the numbers are beyond sequential searching. This stuff works.
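Shor's Algorithm hands the hard part, finding the period r of a^x mod N, to the quantum computer; classical arithmetic then turns that period into factors via gcd(a^(r/2) ± 1, N). A sketch with a brute-force period finder standing in for the quantum step (fine for 15, hopeless for the numbers real encryption uses):

```python
from math import gcd

def find_period(a, n):
    """Smallest r with a**r % n == 1, found by brute force.
    This is the step a quantum computer does exponentially faster."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical pre- and post-processing of Shor's Algorithm for base a."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g          # lucky guess already shares a factor
    r = find_period(a, n)
    if r % 2 == 1:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: retry with another a
    p = gcd(y - 1, n)
    return p, n // p

print(shor_classical(15, 7))  # period of 7 mod 15 is 4, yielding (3, 5)
```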
So why would this upset the Roman Catholic church?
Since Galileo, Rome has been increasingly careful to accept the evidence from science and technology as an accurate description of the universe, contenting itself to deal with the spiritual implications of what is thus revealed. In general, this has worked well enough, allowing scientists to be religious without cognitive dissonance and removing from the Church the spectre of once again having to deny something that is empirically demonstrable.
But there are heavy political pressures, most notably in the US, to change this. The greatest and most bloody battlefield is biology, where a considerable movement to deny evolution is conducting a well-funded campaign to convince people that a religious interpretation of data is preferable to that held by mainstream science. That interpretation, that a mysterious intelligent designer is at work coordinating events through miraculous means, remains primarily religious despite protestations otherwise; the science produced is profoundly unconvincing to those not already disposed to a creationist viewpoint.
So far, the Catholic Church has instinctively left this well alone, with the official viewpoint being that evolution occurred much as mainstream science indicates. That may be changing. In an opinion piece published in the New York Times late last week, Cardinal Christoph Schönborn, archbishop of Vienna and close associate of Pope Benedict XVI, said that both evolution and the multiverse hypothesis were deliberately invented to avoid 'overwhelming evidence for purpose and design' found in modern science.
Evolution may be outside the gamut of IT — although evolutionary or genetic algorithms are increasingly used in computer-aided design, it is unclear whether these attract Schönborn's displeasure, or whether he'd refuse to fly on an aircraft whose flight surfaces were created by evolution — but the multiverse hypothesis is closely coupled to the sort of QM at the heart of quantum computing. Since all those multiple parallel realities actually exist, there is good reason to believe that the set of collapsed realities we observe is just one among very many. At a cosmological level, this sort of thinking leads to the idea that multiple universes simultaneously exist — which may sit uneasily with certain interpretations of Christian dogma, but is no more informed by a desire to deny its theology than a palaeontologist would be moved to raise their pick with a grim determination to advance the cause of atheism.
Yet it is difficult — and by the rules of science impossible — to take just one aspect of modern physical thinking and declare it incorrect without touching upon its related fields, unless a specific and analysable flaw can be found in the logic that connects them. If that flaw can be found, then science will be grateful for the information. If there is no such flaw, then it is hard to see where the ex cathedra condemnation of sinful thought will find its natural end.
Science and technology live and die by the consideration of objective, testable facts. It will be a bad day for everyone if we return to the times when new ideas were declared anathema because they made theologians uncomfortable. While matters of conscience must always inform how we choose to work, the realities of the physical world must stand as the basic touchstone of scientific endeavour. We've come too far to risk that, or to accept that the only valid computer is one with the Vatican's seal on the processor.