
Photon juggling: One big quantum processor from 100 little ones

One big pile, as Arlo Guthrie once disseminated from practical experience, isn’t really better than two little ones. Yet for a type of computer even less mindful of the law, all the little piles are already one big one.
Written by Scott Fulton III, Contributor

The processing assembly for Google's Sycamore quantum computer.

In the not-terribly-distant past, the goal of quantum computing research was to achieve a milestone called quantum supremacy: the point in time when a quantum computer could, in practical terms, be considered superior to a classical, semiconductor-based computer for any task you give it. Certainly Google already made a big enough fuss about it. That is no longer the goal. Engineers and scholars have since conceded that it is not attainable — that a quantum device cannot simply supersede a classical device. (Of course, it may seem a little too convenient that they should make this declaration now.)

The principal reason for this is not that a quantum computer (QC), once the plans for its development are fully realized, would somehow be inferior. A quantum computer is, and because of the nature of physics always will be, a quantum processor maintained and marshaled by a classical control system. Despite that name, "control" may be an imprecise word in this context. Although such devices may yet become the foundation for a new industry, they don't really control quantum processing any more than a barbed wire fence controls a prison riot. More accurately, they control the gateway leading to and from the prison, with the guards making sure to watch only the gateway and nothing else (because watching something requires photons, and photons will make the qubit stack — the core processing element — decohere).

No, the reason is because a quantum system includes, and depends upon, a classical computer. It's tempting to say the two rely upon each other, but that would misinterpret their working relationship. Tell a QC it's dependent upon anything else, and it's liable to throw a qubit and fall apart.

What engineers and programmers are seeking now is a kinder, gentler position of achievement and authority. Some have opted for the phrase quantum advantage, which would imply that the QC has a clearly measurable virtue, in terms of performance, speed, or quality, over a classical supercomputer. Others prefer quantum practicality, which is softer still, implying that the QC would be the device one would rationally choose to perform a task, given a rational analysis of the alternatives.

"You might think, 'Well, we've achieved quantum advantage last year at Google. So it's probably a few years' worth of work to get to quantum practicality, isn't it?'" said Prof. Lieven Vandersypen, the scientific director of Dutch public/academic partnership QuTech, speaking at the recent IQT Europe 2020 conference. Google's supremacy claim was made after having provably maintained the execution of a task with a 53-qubit register. So perhaps the road to 100 qubits is paved, smooth, and unobstructed, if one takes this point of view. Prof. Vandersypen continued:

Prof. Lieven Vandersypen

A few hundred qubits comes not out of nowhere… This is the point where useful problems could be addressed. On the other hand, perhaps millions of qubits are needed… or maybe a miracle in designing new quantum algorithms. So which of these applies, and how do I look at it? Certainly I don't believe for a minute that it is just a few years' worth of work to achieve real quantum practicality. If we look at the projections, indeed, we are going to achieve as a community a few hundred qubits. But these will be not perfect qubits. Then what you need are a hundred perfect qubits that will run indefinitely without error, and can carry through as many operations as are needed to really enter this quantum practicality regime. Okay, they don't need to be really perfect, but they have to be, let's say, between 1,000 and 10,000 times higher quality than any of the qubits that we can operate today. That is not completely out of the question, but for sure, not going to happen in a few years' time.
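
For a rough sense of what those multipliers mean, here is back-of-the-envelope arithmetic in Python. The baseline error rate is an assumption chosen for illustration (the better two-qubit gates reported today are commonly cited in the tenths-of-a-percent range); it is not a figure from Vandersypen's talk.

    # How many operations, on average, before a single error occurs,
    # at a given per-operation error rate? (Illustrative assumption: ~0.5% today.)
    def mean_ops_before_error(error_rate):
        return 1.0 / error_rate

    for label, rate in [("today, assumed", 5e-3),
                        ("1,000x better", 5e-6),
                        ("10,000x better", 5e-7)]:
        print(f"{label:15s} error rate {rate:.0e}: ~{mean_ops_before_error(rate):,.0f} ops per error")

At the assumed rate for today, an algorithm gets a couple of hundred operations in before something goes wrong; at 10,000 times better, it gets a couple of million, which begins to look like the regime the quote describes.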

Vandersypen makes multiple references to "a few years," and not by coincidence. In the midst of a global pandemic, and an ongoing shift in the global order, a few years' worth of government and institutional funding may be all that institutions like QuTech can hope for.

What would render the entire question of supremacy, advantage, or "edginess" somewhat moot is if there were some force somewhere, perhaps a force of physics, that could make multiple QCs, and perhaps all QCs on Earth, simultaneously interoperable. There is such a phenomenon, and it is called quantum entanglement. A complete understanding of the underlying principles of a quantum information network (QIN) requires explanations that don't just border on the philosophical, but plunge head-first into the ocean of the metaphysical.


Entanglement-as-a-service

Generally speaking, the laws of physics have thus far referred mainly to the explicate order. Indeed, it may be said that the principal function of Cartesian coordinates is just to give a clear and precise description of explicate order. Now, we are proposing that in the formulation of the laws of physics, primary relevance is to be given to the implicate order, while the explicate order is to have a secondary kind of significance (e.g., as happened with Aristotle's notion of movement, after the development of classical physics). Thus, it may be expected that a description in terms of Cartesian coordinates can no longer be given a primary emphasis, and that a new kind of description will indeed have to be developed for discussing the laws of physics.

-David Bohm, Wholeness and the Implicate Order, 1980

A quantum information network (QIN), if it can be built, would accomplish something that, by every classical intuition, can't be done in physical reality. Not even science fiction has manifested a contraption such as this. Had Isaac Asimov any clue that such a thing might be feasible, the Robot series would ultimately not have been about robots.

Mathias van den Bossche

"You don't send information on a quantum information network," explained Mathias van den Bossche, who directs telecommunications and navigation systems research for Italy-based satellite consortium Thales Alenia Space.  "You weave entanglement correlations from one end user to the other end user. When this is available, everything in the middle disappears, and the end users discuss directly. This means you have actually nothing that is being repeated along the network, apart from the entanglement that swaps from link to link — there is no information that is repeated."

The only way to adequately convey the function of a QIN is with a ridiculous metaphor: Imagine if the state of being connected, of working together as a cohesive unit, were something you could take with you, as though a dealer in a poker game handed it to you. Own this card, and someone else's poker hand at the other end of the table is part of yours. If he has two kings and so do you, you now have four-of-a-kind.  (And so does he, but at least you know that.)

Now imagine you were playing a variant of the game where players could trade cards. Connectedness with one player's hand could be something you could trade, perhaps for a card granting you connectedness with another player's hand. To complete this metaphor, imagine you were playing this game using a kind of networking where trading the value of the card would be exactly the same as trading the card itself.

At this point you might, if I've phrased this correctly, have an inkling of an idea of what a quantum network would do. Let's give that capability a purpose: You already know how, with an electronic computer, a component interfaces with the motherboard by being plugged into its bus. That interface gives the component connectedness (some call this "connectivity," but in present-day networking that actually means something else). The theory of a quantum network is that, at the quantum level, the connectedness of two components can be communicated. The result is that you could obtain an exponentially stronger, single quantum computer using two QCs that swapped their connectedness properties with one another.
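
If the poker metaphor feels too loose, the underlying protocol, entanglement swapping, can at least be sketched numerically. What follows is a minimal, idealized state-vector simulation in plain numpy: qubits A and B start out entangled, C and D start out entangled, and a Bell-basis measurement on B and C leaves A and D entangled even though they never interacted. It is a noiseless textbook illustration, not a model of any real hardware or vendor API.

    import numpy as np

    # Single-qubit basis states and the Bell state (|00> + |11>)/sqrt(2)
    zero = np.array([1.0, 0.0])
    one  = np.array([0.0, 1.0])
    bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

    # Four qubits ordered A, B, C, D: A-B share one Bell pair, C-D share another
    psi = np.kron(bell, bell).reshape(2, 2, 2, 2)   # indices (A, B, C, D)

    # Bell-basis measurement on B and C: project onto one of the four outcomes,
    # here (|00> + |11>)/sqrt(2); the other outcomes differ only by a local
    # Pauli correction that D would apply after hearing the result classically.
    bell_bc = bell.reshape(2, 2)
    post = np.einsum('abcd,bc->ad', psi, bell_bc.conj())

    prob = float(np.sum(np.abs(post) ** 2))
    post = post / np.sqrt(prob)

    print("probability of this outcome on B,C:", round(prob, 3))   # 0.25
    print("joint state of A and D afterwards (rows=A, cols=D):")
    print(np.round(post, 3))   # (|00> + |11>)/sqrt(2): A and D are now entangled

In a network, B and C would sit at an intermediate node, and the measurement result would travel to the end users over an ordinary classical channel, which is one reason the classical Internet never drops out of the picture.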

Stephanie Wehner

"Ultimately, in the future, we would like to make entanglement available for everyone," declared Stephanie Wehner, who leads the Quantum Internet Initiative at QuTech, speaking at IQT Europe 2020.  "This means enabling quantum communications, ultimately, between local quantum processors anywhere on Earth."

Although quantum networks join pairs of QCs, and the connection between them may be volatile, a QIN must have some continual state for it to even exist at the quantum level. So the minimum number of nodes in a QIN is 3, not 2, so that one link is always maintained. A November 2019 report by Toshiba Research and the University of Cambridge, introducing their minimal, 3-node QIN between Cambridge, Bristol, and London, remains current due to delays imposed by the pandemic. The goal of a QIN is not just to communicate quantum states between pairs of locations, but also, because photons are mobile by nature (you can't capture light in a jar), to remember those states by juggling them from place to place like hot potatoes. Thus a quantum network is a kind of quantum memory.

Computing the optimization of paths in a QIN could possibly, van den Bossche conceded, require a QC, if the behavior of the network as a whole cannot be modeled. The very idea of a quantum computer was sparked by Dr. Richard Feynman, considered the father of much of this science including quantum electrodynamics, suggesting in the course of one of his impromptu lectures that only a QC could model quantum mechanical behavior.

But quantum entanglement — the phenomenon where two atoms, once having been joined, share the state of one property regardless of their distance in space — can only be shared between two atoms anyway, insofar as we know. So there is no question of simply joining several entangled QCs together all at once; theory tells us this is impossible.

The trick to making a QIN functional may then become a switching problem: opening links in an optical fiber chain connecting sources to destinations, like locks in a canal. Yet it might not be an impossible problem, even for a classically managed network. Quantum connections may indeed be achieved between points over long distances, so long as we accept them as multiple-step routes with stops along the way.
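
As a toy illustration of what that classically managed switching problem might look like, here is a route-selection sketch in Python. The topology, the link fidelities, and the assumption that end-to-end quality falls off roughly as the product of link fidelities are all invented for illustration; they are not drawn from the Toshiba/Cambridge network or any other deployment.

    import heapq
    import math

    links = {   # node -> {neighbor: assumed entanglement fidelity of that fiber link}
        "Cambridge": {"London": 0.96},
        "London":    {"Cambridge": 0.96, "Bristol": 0.93, "Paris": 0.90},
        "Bristol":   {"London": 0.93, "Paris": 0.88},
        "Paris":     {"London": 0.90, "Bristol": 0.88},
    }

    def best_route(src, dst):
        # Dijkstra on -log(fidelity): minimizing the sum maximizes the product of fidelities.
        heap = [(0.0, src, [src])]
        seen = set()
        while heap:
            cost, node, path = heapq.heappop(heap)
            if node == dst:
                return path, math.exp(-cost)
            if node in seen:
                continue
            seen.add(node)
            for nxt, fid in links[node].items():
                if nxt not in seen:
                    heapq.heappush(heap, (cost - math.log(fid), nxt, path + [nxt]))
        return None, 0.0

    path, fidelity = best_route("Cambridge", "Paris")
    print(" -> ".join(path), f"(estimated end-to-end fidelity ~{fidelity:.2f})")

The search itself is ordinary classical graph plumbing; the open question raised above is whether such a simple model of link quality survives contact with a real quantum network.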

The reason this is important has to do with QC's future role in protecting all communications, including the conventional variety that otherwise has nothing to do with quantum. Once it becomes a trivial matter for a stable QC to decrypt messages protected by today's classical public-key encryption in mere moments, the only thing stopping the collapse of digital communications as we know it will be a restriction of access to quantum communications. And if the history of the Web has proven anything, it's that the surest way to extend the propagation of information is a paltry attempt to seal off access to it.


Entanglement distillation

Protected communications requires some form of encoded key. Because it involves photons rather than algorithms, a quantum key is purely random. What's more, it cannot be copied within the network without destroying it, and the message it protects, in the process. So the most a malicious actor can do is maybe disrupt the process, but not swipe the decrypted message.

The intent of the emerging art of quantum key distribution (QKD) is to leverage quantum mechanics' inexplicable quirks to protect all digital communication, now and into the foreseeable future. The astute reader may have already gathered that the actual protection delivered by a quantum key only works in the context of a QIN. So for a message outside the QIN to remain protected, it would require some aspect of classical encryption within the classical context — at least until everyone owns her own QC, which is extremely unlikely. Typically, a chain is only as strong as its weakest link.
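
For the QKD piece itself, the best-known recipe is the BB84 protocol, and its logic can be mimicked with an entirely classical toy simulation. The sketch below is illustrative only: it assumes ideal photon transmission, skips authentication, error correction, and privacy amplification, and its bit counts and random seed are arbitrary.

    import random

    def bb84(n_bits=2000, eavesdrop=False, seed=7):
        rng = random.Random(seed)
        alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
        alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]   # 0 = rectilinear, 1 = diagonal
        bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]

        bob_bits = []
        for bit, a_base, b_base in zip(alice_bits, alice_bases, bob_bases):
            if eavesdrop:                          # intercept-resend: Eve measures in a random basis
                e_base = rng.randint(0, 1)
                bit = bit if e_base == a_base else rng.randint(0, 1)
                a_base = e_base                    # the resent photon now carries Eve's basis
            bob_bits.append(bit if b_base == a_base else rng.randint(0, 1))

        # Sifting: keep only the positions where Alice's and Bob's bases matched
        sifted = [(a, b) for a, b, x, y in zip(alice_bits, bob_bits, alice_bases, bob_bases) if x == y]
        errors = sum(1 for a, b in sifted if a != b)
        return len(sifted), errors / len(sifted)

    for label, eve in [("no eavesdropper", False), ("intercept-resend attack", True)]:
        kept, qber = bb84(eavesdrop=eve)
        print(f"{label:24s} sifted bits: {kept:5d}   observed error rate: {qber:.1%}")

The point the sketch makes is the one in the paragraph above: an interceptor cannot copy the key quietly; her presence shows up as a measurable error rate (roughly 25 percent in this idealized case) that the legitimate parties can check for before trusting the key.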

Yet since 2012, there has been a theoretical framework [PDF] for pairing quantum and classical cryptography: one that leverages the QIN to authenticate the classical key. In the absence of a quantum key, its classical counterpart would be useless.
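
The details of that 2012 framework are beyond this article, but the general shape of such a pairing, using a short quantum-distributed secret to authenticate a classical key-exchange message, can be sketched with nothing more than Python's standard library. Everything here (the byte lengths, the choice of HMAC-SHA256, the variable names) is an illustrative assumption, not a description of the cited paper's construction.

    import hmac, hashlib, secrets

    qkd_secret = secrets.token_bytes(32)             # stand-in for a key delivered over a QIN
    classical_public_key = secrets.token_bytes(32)   # stand-in for a classical key-exchange message

    # Sender tags the classical message with the quantum-derived secret.
    tag = hmac.new(qkd_secret, classical_public_key, hashlib.sha256).digest()

    # Receiver, holding the same QKD-derived secret, verifies the tag before trusting the key.
    ok = hmac.compare_digest(tag, hmac.new(qkd_secret, classical_public_key, hashlib.sha256).digest())
    print("classical key authenticated with the quantum-derived secret:", ok)

    # A tampered classical message fails verification.
    tampered = bytearray(classical_public_key); tampered[0] ^= 1
    print("tampered message accepted?",
          hmac.compare_digest(tag, hmac.new(qkd_secret, bytes(tampered), hashlib.sha256).digest()))

The design point is the one the paragraph makes: strip out the quantum-derived secret and the classical half of the exchange can no longer be trusted on its own.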

In the near future, the measure of the success of quantum computing as an ecosystem — as something more than an experiment at headline generation — will be whether an independent security organization can earn sustainable revenue as a producer and distributor of quantum keys. That may only be possible when commercial customers perceive quantum networking as something that directly improves, and probably accelerates, classical networking: the Internet (the one with the capital "I").

Prof. Saikat Guha

"Quantum internet, the way I think of it, will not be a brand new internet," remarked Prof. Saikat Guha, who directs the National Science Foundation's Center for Quantum Networks.  "It will be upgrading our current Internet to be able to connect quantum devices, quantum gadgets." Prof. Guha continued:

People often say the quantum internet is going to make the classical Internet faster and more powerful. We've got to be cautious about that, because adding this new quantum communication service on the Internet is actually going to put an additional classical communications burden on the classical Internet. We're going to have to support higher-bandwidth communications on the classical Internet, to be able to support this additional service we are putting on top of that infrastructure, not just in terms of the extra control plane communications traffic that has to be sustained, but also there is additional, inherent classical communication that is required for purification, entanglement distillation, quantum error correction, and so forth.

As if the metaphysical implications weren't dramatic enough, we now have some new, practical, common sense implications to deal with: Even if we achieve the theoretical objective of compounding smaller QCs together into one larger one by way of a QIN, actually getting the interconnected particles to do what we want them to do requires the type of algorithmic optimization that presently requires a QC itself to achieve — in other words, it is impractical in the classical realm. Perhaps the biggest optimization we'll need appears in Guha's list: entanglement distillation. This is where an operation involving multiple, weakly entangled qubits becomes refined into one with a smaller number of more strongly entangled ones.

As researchers from Düsseldorf's Institute for Theoretical Physics III discovered as far back as 2013, the generation of quantum keys that work over sustainable distances requires entanglement distillation — a way to recover a few high-fidelity entangled pairs from the many noisier ones that survive the trip over fiber. As Guha suggests, this may have to be managed in a classical setting; otherwise, it's a chicken-and-egg problem, where QKD needs distillation, and distillation would need the very quantum resources that QKD is meant to deliver.
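
One concrete handle on distillation is the recurrence for the original BBPSSW protocol, which consumes two noisy entangled pairs and, when a check measurement succeeds, returns one better pair. The sketch below iterates that textbook recurrence for Werner states; the starting fidelity is an assumption chosen only to show the trend, and the protocol is the generic textbook one, not whatever the Düsseldorf group or Guha's center actually uses.

    # BBPSSW recurrence for Werner states: two pairs of fidelity F are consumed to
    # produce, with probability p_success, one pair of higher fidelity F_next.
    def bbpssw_round(F):
        p_success = F**2 + (2/3)*F*(1-F) + (5/9)*(1-F)**2
        F_next = (F**2 + (1/9)*(1-F)**2) / p_success
        return F_next, p_success

    F = 0.75   # assumed raw fidelity of entangled pairs after fiber transmission
    for rnd in range(1, 5):
        F, p = bbpssw_round(F)
        print(f"round {rnd}: fidelity -> {F:.4f}  (success probability this round: {p:.2f})")

Each round at least halves the number of available pairs and requires the two ends to exchange measurement results over an ordinary channel, which is exactly the extra classical traffic Guha warns about.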

In building a system that not only relies upon, but is leveraged upon, an as-yet-unexplained physical phenomenon where changes of state happen with perfect simultaneity, it is extremely difficult to determine the identity and location of step one in the sequence. It's the type of problem we would like to have a quantum computer to solve for us. For now, we're stuck with the best thinking machines available to us. And when reasoning on this level, the output of these particular machines tends to look more like philosophy than logic.
