Quantum as a service: How to product-ize a hole in space and time
There are many things people can do to advance the cause of humankind and push the boundaries of knowledge. But at some point, they’d better earn you a buck or two. Can a quantum computer really be a cash cow?
The whole point of attempting to build a quantum computer is to exploit a strange, yet experimentally verifiable, characteristic of atoms to produce a new and significantly faster class of calculating device. If a quantum computer is truly feasible, as engineers believe it to be, it will be able to estimate the solutions to extremely sophisticated mathematical models in a fraction of the time consumed by conventional, digital supercomputers -- in some cases, at least theoretically, minutes rather than months.
If it accomplishes that, there's little doubt that a quantum computer could also be used to break the strongest encryption codes, since "brute force" trial-and-error methods would be unnecessary with quantum algorithms.
However, all this assumes that such a computing device could ever be made practical. Quantum architecture would exploit phenomena at the subatomic level that, even when explained by the most prominent physicists, sound entirely metaphysical. A consumer-grade quantum computing device would have to automate a bizarre process: constructing a computer out of atoms, setting up the algorithm in a way that the computer can't detect what set it up, letting the atom computer self-destruct ("decohere"), and then picking the solution to the algorithm out of the subatomic ashes. The moon shot had a greater probability of success. Still, engineers have plans for how they'll capitalize on their good fortune.
How many times has this happened to you? You're up late at the laboratory working on a critical problem: Will the Earth cross the global warming point of no return in 2036 or 2040? You've spent all day kicking the climate change deniers out of the building. Now, you only have a few days left to assemble the report, and your last simulation used a prediction model that some of your major funding sources claim was debunked.
You could run your simulation through a Top 500 supercomputer, but you know that once it's done, you'll only have hours, maybe minutes, to prepare the final statistics. Your alternative is to trust the quantum computer with the key algorithm upon which the entire simulation relies. But you have to consider all the tradeoffs:
Yes, the quantum computer (QC) would be much faster. But however and wherever you run it, you'll be paying a premium for its time.
The final result rendered by the QC has a small chance of being erroneous. You wouldn't know for certain unless you were to schedule three jobs, and accept the results of the two that match. (There's no way to plan for a cosmic ray bombardment, you see.)
Even when the result is not erroneous, the probability of its accuracy is less than 100 percent. Maybe that would be the case for the supercomputer simulation too, but you wouldn't know which estimate would be more accurate unless you ran them both.
The algorithm which the QC would run has to look like something else entirely -- a geometry problem or a waveform transformation whose variables coincidentally contain parameters that pertain to the simulation. And you have to write it in a mathematical language unlike anything you've ever used in programming before.
You can't debug the algorithm when it's running in the QC. You could test it first in a (slower) QC simulator running on a conventional supercomputer... but, if you had the time to do that, you wouldn't be in this quandary in the first place.
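The three-job, best-two-out-of-three safeguard mentioned above can be sketched as a simple majority vote. This is a minimal illustration in Python, with hypothetical run results; it is not tied to any vendor's API:

```python
from collections import Counter

def accept_majority(results):
    """Accept a value only if a strict majority of the scheduled
    runs agree on it; otherwise reject the batch and reschedule."""
    value, count = Counter(results).most_common(1)[0]
    return value if count * 2 > len(results) else None

# Two of three runs agree, so the matching result is accepted.
print(accept_majority([42, 42, 41]))

# No two runs agree: a cosmic ray (or worse) hit more than one job.
print(accept_majority([7, 8, 9]))
```

The cost of this insurance is real: every safeguarded answer is billed at three times the price of one.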
Then, there's the entire matter of how this device would be operated. The principal theory of physics which quantum computing leverages to do what it does (some juries are still out) is derived from the same quantum entanglement concept that physicists use to explain the dynamics of black holes. When Dr. Richard Feynman brilliantly proposed the idea of quantum computing, it was more as a thought experiment to help colleagues and students rationalize the dynamics of behavior at the subatomic level -- manufacturing holes in space and time, if you will, for dummies.
So, let's just say that your QC would not fall under the purview of your local, CompTIA-certified IT department.
"A quantum computer today is likely going to be housed in a dilution refrigerator," explained James Clarke, director of quantum hardware for Intel Labs. He's talking about a mechanism that the Cryogenic Society of America describes as producing a bath of rare helium atoms cooled to below 1 kelvin. They're stable helium atoms, but the only way to produce enough of them to make such a bath is by harvesting the decay of tritium, a radioactive isotope of hydrogen that used to contaminate drinking water back when America's nuclear power plants dumped their waste into rivers.
"They're about the size of a 55-gallon drum," Clarke continued. "They either hang from a ceiling or some sort of fancy enclosure. And what you don't see is that they're usually attached to a small bank of high-performance servers. Likely, a large-scale quantum computer that's going to change your life or mine, at least in the near term, would be something that might have a small supercomputer next to it. That might be the form factor."
Escape to Alcatraz
The greatest irony about so-called "disruptive" technologies is that they have to be marketable -- they must conform to some template of consumers' expectations. Nothing can disrupt a market in which it can't participate in the first place. If anything has shown a capacity for disruption over the past century, it's quantum mechanics. Unlike any other science in human history, quantum has not only disrupted our existing notions of the structure and order of the universe, but it has presented us with new behaviors that preclude us from replacing those old notions with new ones that make the slightest shred of sense.
A quantum computer would allow us to automate new approaches to mathematics, leveraging those very behaviors we don't yet comprehend. The idea of the QC was meant to be exactly that: An idea, or a thought experiment like the kinds Albert Einstein and Niels Bohr would present their colleagues and students. As it progresses to becoming a fully operational device (it's not there yet, though it's still on its way), the presumption has been that it will take its place alongside the supercomputers, cloud servers, and hyperscale data centers of the modern era -- as components of a healthy and growing IT marketplace.
Yet that presumption overlooks the reality, or lack of a reality, of the device's own behavior. At some point, the people who would collectively comprise a market of appreciable size and quantifiable demographics, will want to consume the product that a QC makes available. People don't "consume" computers (at least they shouldn't), though in an economic sense, they are the consumers of its applications. For consumers to make a product or a service part of their lives and their work habits, it needs to be practical, reliable, cost-effective, and a steady generator of business value.
Imagine if you lived in a world where such a desirable thing was produced in an extraordinary way by an unusually talented and clever genius. But that genius was being held in solitary confinement in an exclusive penitentiary. For this person to remain psychologically functional and reasonably sane, his environment must be artificially maintained to such an extent that he believes he is an astronaut, suspended in a pressure suit in the depths of outer space.
To communicate with this prisoner in any way -- to provide him with the smallest digit of information that conveyed a signal that he was, in fact, a prisoner and not an astronaut, would be to risk driving him so insane that he could die. So, the only way to hire him would be by employing his guards as proxy agents. To you, they're prison guards; to the genius, they're Mission Control.
This genius can, when all is right with the world, operate immensely fast, as though he could conjure his handiwork from the depths of the Big Bang itself. But if he hears footsteps in the hallway, or the sounds of a delivery truck outside, or if his food isn't processed to resemble an astronaut's rations, he collapses into a puddle on the floor.
What's more, whatever he is asked to do -- find the error in a corporate ledger, study a million handwriting samples for evidence of fraud, detect the moment in time when an asteroid may have wiped out the dinosaurs -- must be presented as an itinerary from Mission Control for some rudimentary spacecraft maneuver, such as a course correction or a cryogenic tank stir. Some strange by-product of the agenda given this genius will be the actual job he's been hired to perform, and he can't know it while he's doing it.
If this genius were to become agitated, he could accidentally -- almost involuntarily -- break out of prison. For that reason, his guards must be rotated continually, and the number of guards on duty is rarely the same for any given minute of time. Yet each one must be capable of mimicking someone at Mission Control, for if he's overheard sounding like a prison guard or an ordinary person, the prisoner could throw a fit, becoming a small wad of human jelly.
This is the conundrum any prospective consumer of a quantum computing product or service will face -- certainly not with the same actors, but with very similar consequences. If a QC has any market value at all, it will be in spite of the very behaviors which, on a good day, are the heart of its value proposition.
Floating in a most peculiar way
The core calculating component of a quantum computer is something that, to borrow a phrase from Monty Python, cannot be seen. Indeed, its complete compartmentalization from reality (or, to be fair, the Kantian phenomenal reality of cause and effect) is why it works. The logic gates of a quantum computer, unlike the Boolean logic gates of a conventional semiconductor, enable qubits (the analog of bits, or binary digits) to represent 0 and 1 simultaneously, instead of one or the other. It's this simultaneity which enables the speed advantage that makes quantum computing intriguing to begin with.
"You should interpret a quantum circuit just as a circuit that you would implement in a traditional computer, just that the operations -- the quantum gates -- are operations that create 'probabilities with a minus sign,'" explained Dr. Simone Severini, Amazon AWS' director of quantum computing, in an interview with ZDNet.
"In particular, the quantum circuit will operate on quantum mechanical states which may be in a superposition -- which does not occur on traditional computers in the classical world."
Inside their own universe, qubits can attain the superposition states necessary to represent more than one binary unit of information at a time -- where they can dream, as it were, the impossible dream. Here, 2 raised to the power of the number of qubits reflects the number of classical states the system as a whole can superpose. Reaching a final result marks the end of that dream state, and a return to the classical world.
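Severini's "probabilities with a minus sign" can be made concrete with a toy statevector calculation. The sketch below, in NumPy, applies the Hadamard gate -- a standard quantum gate -- to a single qubit: one application creates a 50/50 superposition, and a second application makes the negative amplitude cancel the positive one, returning the qubit to its starting state with certainty:

```python
import numpy as np

# The Hadamard gate as a 2x2 unitary matrix.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])   # the classical state |0>

superposed = H @ ket0         # equal superposition of |0> and |1>
print(superposed ** 2)        # measurement probabilities: 50/50

# Apply H again: the minus-sign amplitude interferes destructively,
# and the qubit collapses back to |0> with probability 1.
print((H @ superposed) ** 2)
```

Classical probabilities can only add; amplitudes can subtract, which is the raw material quantum algorithms exploit.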
Suppose you scattered tens of thousands of index cards on the floor, but then you needed to find one in particular. As an inhabitant of the real world, you would expect to have to pick up each card one-by-one, with the possibility that the one you're looking for is the very last card. So, for any set of n cards, the number of tries grows in proportion to n -- in big-O notation, O(n). But if you were in the quantum world, with enough qubits, the act of trying one index card would be superposed with several other such acts. In a perfect system, then, the maximum number of tries would be far smaller: O(√n).
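That square-root speedup can be demonstrated with a small statevector simulation of Grover's search, the quantum algorithm behind the index-card example. The NumPy sketch below is an idealized, noise-free illustration, not a model of real hardware:

```python
import numpy as np

def grover_search(n_items, target):
    """Simulate Grover's search over n_items entries. Returns
    (queries_used, probability_of_measuring_the_target)."""
    # Optimal iteration count is about (pi/4)*sqrt(n), not n/2.
    iterations = int(np.floor(np.pi / 4 * np.sqrt(n_items)))
    # Start in a uniform superposition: every card tried "at once".
    state = np.full(n_items, 1 / np.sqrt(n_items))
    for _ in range(iterations):
        state[target] *= -1               # oracle: mark the target card
        state = 2 * state.mean() - state  # diffusion: reflect about the mean
    return iterations, state[target] ** 2

queries, prob = grover_search(1024, target=123)
# 25 quantum queries, versus an average of 512 classical tries,
# with a near-certain chance of landing on the right card.
print(queries, prob)
```

Note that even the ideal machine only yields the right card with high probability, not certainty -- which is exactly the tradeoff described earlier.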
Superposition makes the "machine language" of a QC -- the basic symbology with which it receives its instructions -- fundamentally different from a binary digital system, including on a philosophical level. With a conventional ("classical") semiconductor, various arrangements of the Boolean gates (AND, OR, XOR, NOT) produce the results of the lowest-level tasks that a computer is asked to perform. Higher-level tasks are mainly composites of low-level tasks. A QC utilizes an entirely new and different set of gates, the arrangement of which can be made to represent higher orders of task that are simpler to explain, but much harder to execute -- for example, a mathematical transformation of a waveform, or locating an element of data in an unordered list.
So, the cheese at the end of the proverbial maze is arguably quite appealing. Obtaining that reward, however, will require QC operators to pull off something of a masquerade ball, the likes of which have never been tried before, either in computing or anywhere else.
You see, until now, no one has ever tried to build a commercial product around the behavior of quantum systems. Such behavior could be compelling to observe. The trouble is, observation is physically impossible: No one can watch a superposed qubit.
By "watch" in the above sentence, to risk sounding too much like Lemony Snicket, we mean any kind of activity that renders information about what's going on. A camera would be one example, of course, as would a pair of eyes. But an interface is another example -- anything plugged into a QC can, in effect, watch it. And that would make the calculating components fail to calculate.
"You can monitor the end result and get an output," explains Bill Vass, AWS' vice president for storage, automation, and messaging -- who oversees Amazon's quantum projects. "I think the challenge we have with today's quantum computers is the stability and noise that occurs during all that. There has to be a lot of error correcting done on the machine. So, a lot of the qubits we'd like to use to solve a problem will actually be used to do error correction."
If a QC does work right, we'll know it when we see the results of its calculation. But we can only see those results after we have reason to believe the calculation is finished, because the very act of observing the operation in progress -- an act that any kind of interface, directly or indirectly, enacts -- causes the whole operation to collapse. It's an event physicists call decoherence, and at the subatomic level, it means exploding.
"A quantum computer is a very fragile device, because it's naturally affected by noise," stated AWS' Severini. "If you want to protect the information processed by a quantum computer, you need to have an enormous amount of redundancy -- quantum error correction. It's a fundamental problem in quantum computation."
Although qubits in a QC assume the role of RAM in a classical computer, Severini pointed out, those physical qubits effectively generate "logical qubits" -- transformed states of themselves -- throughout the duration of the computation. "When you design error correcting codes, you may end up with thousands of physical qubits for each logical qubit, in order to protect information during the computation."
And you thought managing Kubernetes was hard.
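The physical-versus-logical overhead Severini describes is rooted in redundancy. Quantum error correction itself can't be shown in a few lines, but its classical ancestor -- the repetition code -- gives the flavor. The sketch below is a loose classical analogy, not a quantum code:

```python
import random

def encode(logical_bit, copies=3):
    """One logical bit becomes several physical bits."""
    return [logical_bit] * copies

def noisy_channel(bits, flip_prob, rng):
    """Each physical bit independently flips with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

def decode(bits):
    """A majority vote recovers the logical bit despite some flips."""
    return int(sum(bits) * 2 > len(bits))

rng = random.Random(0)
trials = 10_000
errors = sum(decode(noisy_channel(encode(1), 0.1, rng)) != 1
             for _ in range(trials))
# With a 10% physical error rate, the logical error rate falls to
# roughly 3p^2 - 2p^3 = 2.8% -- redundancy buys reliability, at a
# cost in raw bits. Quantum codes pay far more: thousands of
# physical qubits per logical qubit, per Severini.
print(errors / trials)
```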
Conducting a mismatch
Amazon -- the company to which the whole notion of "hyperscale" is most often attributed -- is unafraid of these technical, or even metaphysical, hurdles. Its engineers perceive the relationship to be a kind of Apollo + Soyuz pairing, wherein the linking component to which both the classical server network and QC are connected, maintains the physical separation between both entities, while still enabling the rendering of results.
"It's very natural to use a classical computer and a quantum computer together," asserted Dr. Severini. "For example, if you consider a number of algorithms for quantum computers, only a small part really needs to run on the quantum computer, whereas a majority of the algorithm can run -- in fact, should run -- on a classical computer. Moreover, there are certain types of algorithms that require a classical and a quantum computer working together, because you parameterize the quantum circuit using classical computation, and the computation is going back and forth between classical and quantum. In a way, there's no specific issue that comes from the physics for using classical and quantum computers together in tandem."
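The back-and-forth Severini describes is the shape of today's variational algorithms: a classical optimizer proposes circuit parameters, the quantum side measures a cost, and the loop repeats. Below is a minimal single-qubit mock-up in NumPy -- the "quantum" step is simulated classically, purely to show the control flow:

```python
import numpy as np

def quantum_cost(theta):
    """Stand-in for the quantum step: simulate Ry(theta)|0> on one
    qubit and return the expectation of Z, which equals cos(theta)."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return state[0] ** 2 - state[1] ** 2

# Classical step: gradient descent via the parameter-shift rule,
# which estimates the gradient by evaluating the circuit at
# shifted parameter values.
theta = 0.1
for _ in range(100):
    grad = (quantum_cost(theta + np.pi / 2)
            - quantum_cost(theta - np.pi / 2)) / 2
    theta -= 0.4 * grad

# The loop drives the cost toward its minimum of -1 (theta near pi).
print(theta, quantum_cost(theta))
```

Only the `quantum_cost` calls would run on the QC; everything else -- the optimizer, the bookkeeping, the stopping rule -- stays on classical servers, just as Severini says it should.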
In a metaphorical sense, though, getting information out of a quantum prison would begin with feeding your instructions to a cloud data center (4 in our diagram), such as Amazon EC2. (You could imagine Amazon wanting to implant a QC onto your enterprise's premises the way it uses DeepLens to infiltrate folks' homes, though just the near-zero-kelvin refrigerator might be a stretch.) The data center would reformulate the instructions it receives into quantum algorithms, which it then passes on to the guards operating the facilities (5).
Remember, though, to the quantum processor, the instructions have to look like something quantum-y, like a Fourier transform, and not a traffic simulation. So, metaphorically, the guards pass these instructions on to the astronaut, who will at some point collapse from exhaustion. Before he does, he'll throw a fit, the shock waves from which will be detected at the output center (7). And those waves will correspond to the solution to the algorithm.
That's one way to perceive quantum computing as a service: a system that serves the customer in such a way that she never sees what goes on behind the scenes.
Another interpretation, with which Intel is very familiar, places the QC in the role of a co-processor, like the GPU that processes graphics for desktop PCs. A co-processor knows to expect certain operations, and the CPU passes control of the thread to the co-processor when one of these operations appears in the code. Theoretically, a classical system could pass control to a QC when it encounters a set of operations that quantum gates can calculate, presumably faster or better than Boolean logic gates.
Yet this cheerful, hope-filled interpretation of the relationship masks the fundamental dysfunction at the heart of any quantum computing system: It cannot "know," or be informed, of the existence of this outside world. What a QC device actually does, as incredible as it still sounds, is build a computer at the atomic level, according to instructions. The builder can "see" the instructions, but the atomic computer cannot. Once the little computer is brought online, it must be left entirely alone, for however long it intends to take. The builder knows the computer's work is done when the atomic computer literally explodes -- an event which, thankfully, does not trigger a chain reaction, but which does leave behind the only evidence of the little machine's existence.
"When you measure something, you do lose its coherent state, but you actually gain a measurement," explained Intel's Clarke. "You would manipulate the qubit with operations in a certain way, with electrical, microwave, or optical pulses. And still you're maintaining your quantum state, but you're manipulating it. But once you measure it, by any number of measurement techniques, you collapse that state to a classical state."
It's this zero-touch manipulation that assembles the atoms into a state where they behave as qubits in a small machine. It's the only interaction this machine can have with the classical world.
And it's this separation of powers that messes up any hope one might have of a QC eventually attaining a practical, ordinary form factor. Here, instead of befuddlement, is where Clarke ends up drawing inspiration: an opportunity to follow the lead of a supercomputer pioneer who tackled a problem not unlike this four decades ago: Seymour Cray. If it's the sort of machine that would draw a crowd anyway, why not charge admission?
"The first quantum computers will resemble the first Cray computers from the mid-70s -- the Cray 1, for example," he told ZDNet. "That was a very fancy-looking tool. Oftentimes, when a company or a national lab would have a Cray, they'd proudly display it behind some sort of glass structure that visitors could see."
Metaphorically speaking, it's an attempt to give the QC the appearance of a "front door" entry. You produce an algorithm directly (1), then feed it into what's treated like a co-processor (2). But it's not a co-processor of the QC; rather, it's connected to whatever other processor acts as the guard tower around back (5). It then feeds the instructions to the imprisoned astronaut (6) who has a hissy fit that's felt by the seismic detectors at the output center (7).
It's not surprising that Intel would envision a QC looking and functioning very similarly to what Intel has manufactured throughout its history, and that AWS would foresee a service that fits nicely into the Amazon cloud. But customers of data centers and the products of data centers expect a quantifiable degree of reliability (e.g., "five 9's" or "six 9's" of uptime percentage) which quantum cannot provide without someone deftly rewriting the laws of physics.
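Those "nines" translate into concrete downtime budgets. The arithmetic below is standard service-level math, nothing quantum-specific:

```python
def annual_downtime_minutes(nines):
    """Permitted downtime per year for an availability with the
    given number of nines (e.g., 5 nines = 99.999% uptime)."""
    unavailability = 10.0 ** -nines
    return unavailability * 365.25 * 24 * 60

# Five nines allow ~5.3 minutes of downtime a year; six nines,
# about 32 seconds -- targets a decohering machine cannot hit alone.
for n in (3, 5, 6):
    print(n, round(annual_downtime_minutes(n), 2))
```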
AWS' Vass foresees a solution to this dilemma as well -- specifically, by providing uptime and reliability guarantees for the system as a whole, which would include classical servers (it has to include them; there's no other way).
"[For] quantum computers, we'll have to set an expectation for what can and can't be delivered," said Vass. "But with any computer, for any algorithm you run on it, you may not get the output you want. . . You don't expect 100 percent search results on Google. I don't think quantum computers will be any different, in that aspect."
Vass makes an excellent point, and maybe he made it by accident -- but for a quantum topic, that's rather poignant. When you pay a premium for service, you expect guarantees of reliability. Perhaps quantum computing would be better suited for a more menial, general-purpose task, such as searching for information or choosing a head of state -- something for which people, perhaps by their own design, would never expect perfection anyway.
If quantum interactions truly are responsible for the state of the universe, then maybe it's through settlements such as this that they get away with as much as they do.