It was supposed to be the goalpost for an entirely new branch of the computing industry, and for the nations pouring taxpayer funds into its creation: quantum supremacy, the point at which a computer built around a quantum processor -- one exploiting quantum mechanics rather than conventional electronics -- reliably performs a task better than a classical, semiconductor-based system ever could. At this time last year, before COVID-19 hit the fan, scientists working with Google claimed they had achieved it.
But in the clearest indication to date that the academic and commercial developers of quantum computers are on separate wavelengths, the organizer of what has quickly become one of the quantum industry's leading conferences, IQT Europe (held virtually this year), declared, with no opposition from his peers, that the whole idea of quantum supremacy doesn't really exist -- at least not in the way its advocates first conceived it.
Many believe it's time to shift away from quantum supremacy and start thinking about new ways to describe advances in quantum computing, such as 'quantum advantage' and 'quantum practicality'.
"I don't think it is a good term because it implies that quantum computers could be 'supreme' -- that is, better than classical computers under all circumstances. No-one really thinks this is likely ever," stated Lawrence Gasman, the founder and president of Virginia-based consultancy Inside Quantum Technology, and creator of the IQT Europe conference. In a note to us, he wrote, "Quantum advantage, to me, means that quantum computers are being deployed to solve practical problems in a variety of industries that could not be solved in a reasonable period of time using classical computers."
During a keynote session last Tuesday, Gasman acknowledged to attendees that some semblance of a goalpost for quantum computing will very likely remain. "It seemed to me that this quantum advantage / quantum supremacy issue is really at the core of everything," he said. "If we can't really achieve a quantum computer that is better in some sense -- leaving open the question of what that sense is -- than our classical computers, then there's nothing here. No shame in that. There are technologies that just never really make it."
An alternative phrase being bandied about is quantum advantage, which is being explained in more practical terms.
"Quantum advantage is the goal of applying a quantum computer to successfully solve a real-world problem that is too hard to solve with a conventional computer," stated Dr Joseph Emerson, founder and CEO of Ontario-based QC support software developer Quantum Benchmark. "This will require, at very minimum, a quantum computer which exhibits less than one error per 1,000 operations across a fully connected system of more than 60 or so qubits."
Ironically, perhaps the most critical component of a QC system is a classical computer -- in effect, a conventional gateway that interfaces with the cloud, the internet, and the outside world. The new goalpost is the point at which a QC, acting as a kind of 'blind accelerator' governed by a conventional computer, reliably produces results faster than the same computer acting on its own. That would be the 'advantage', or perhaps the 'edge' if that word weren't already taken.
The watchword here is 'reliably'. Quantum processors tackle impossibly long linear algebra computations by building qubits out of bombarded atoms, coaxing these interconnected stacks into performing analog-style logical tasks before they literally fall apart (decoherence). Yet rather than definitive solutions, they can only yield probable ones. They're susceptible to errors, which may theoretically be reduced through algorithm optimization, but never eliminated. A QC can be fast, and it can get bigger -- its capacity to process information simultaneously rises exponentially with each qubit added. But just like a stack of blocks, its probability of error, and of decoherence, rises as well.
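That trade-off can be sketched numerically. This is a back-of-envelope illustration only, not any vendor's benchmarking method: the 0.5 percent per-qubit error rate below is a hypothetical figure chosen purely for demonstration.

```python
# Illustrative sketch: the state space doubles with each qubit added,
# while the chance of a fully error-free step shrinks. The per-qubit
# error rate is a hypothetical figure, not a measured one.
per_qubit_error = 0.005  # assumed chance any one qubit misbehaves per step

for n_qubits in (10, 30, 53):
    state_space = 2 ** n_qubits                       # amplitudes tracked at once
    p_step_clean = (1 - per_qubit_error) ** n_qubits  # no qubit errs this step
    print(f"{n_qubits:>2} qubits: 2^{n_qubits} = {state_space:.2e} states, "
          f"P(error-free step) = {p_step_clean:.2f}")
```

At 53 qubits -- Sycamore's sustained count -- the tracked state space exceeds nine quadrillion, while under this toy model the odds of a single clean step have already slipped below 80 percent.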
When Google claimed last year that its Sycamore quantum processor had achieved "an experimental realization of quantum supremacy," as its article in the journal Nature put it, the company sustained a run of 53 qubits in a 54-qubit array, with a fidelity (measured using Google's own method) of 0.2 percent. Dr Emerson's minimum threshold for quantum advantage seems so close: about half the error rate, with about seven more qubits. Yet for engineers, that may be like climbing the last thousand feet of Everest.
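To put Dr Emerson's threshold in perspective, consider a crude model -- an assumption of this sketch, not Quantum Benchmark's methodology -- in which every operation fails independently. Even at exactly his limit of one error per 1,000 operations, a circuit of 1,000 operations finishes cleanly only about a third of the time:

```python
# Back-of-envelope only: assumes independent, uniform per-operation
# errors, which real quantum benchmarking methods do not.
p_err = 1 / 1000   # Emerson's threshold: under one error per 1,000 operations
ops = 1000         # a circuit of comparable length
p_clean = (1 - p_err) ** ops
print(f"P(no error across {ops} ops) = {p_clean:.3f}")  # -> 0.368
```

Which is why his threshold is a floor for having "a slight chance" at advantage, not a guarantee of it.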
Except there's another mountain left to climb afterward.
"To have a slight chance at achieving quantum advantage at those minimal specifications," wrote Emerson, "the quantum computer would also have to pass a large number of other performance conditions, such as those we have designed to assess this very question, including the absence of certain adverse error properties that would surely prevent accurate solutions, even though the error rate is low enough. A more realistic scenario for achieving quantum advantage is a system with a few hundred connected qubits which has less than one error per 10,000 operations and can also pass our checks against adverse error properties."
Prof. Lieven Vandersypen, who serves as Scientific Director for Dutch quantum research association QuTech while teaching at Delft University of Technology, prefers yet another term altogether: quantum practicality, which he attributes to James Clarke, Intel's director of quantum hardware. In Intel's August announcement of its collaboration with Argonne National Laboratory on QC, Clarke wrote, "At Intel, we are taking a broad view of quantum research that spans hardware and software with a singular focus on getting quantum out of labs and into the real world, where it can solve real problems."
"I would go for quantum practicality instead of quantum advantage," Prof. Vandersypen told us, "since the term is unambiguous. Quantum practicality is reached when a quantum computer helps solve a relevant problem significantly faster than is possible otherwise." Folks who use other terms, he remarked, have a tendency to back up their claims with demonstrations of "solving irrelevant problems faster" -- which has long been a criticism of benchmarks, even in the classical computing realm.
Nevertheless, Vandersypen's alternative does sound, at one level, like quantum supremacy with an added coat of spackle. Yet as he told attendees of IQT Europe, his metric paves a path forward between the kind of advantage Google's researchers claimed, and the kind that may one day impact commercial users' bottom lines.
"You might think we've achieved quantum advantage last year at Google," said Vandersypen, "so it's probably a few years' worth of work to get to quantum practicality, isn't it? In other words, we need to go from 53 qubits to a few hundred qubits. And a few hundred qubits doesn't come out of nowhere. This is the point where useful problems could be addressed.
"On the other hand," he continued, "perhaps millions of qubits are needed, or maybe a miracle in designing new quantum algorithms. Which of these applies? Certainly, I don't believe for a minute that it is just a few years' worth of work to achieve real quantum practicality...Indeed, we will achieve a few hundred qubits, but these will not be perfect qubits. What you need is 100 perfect qubits that will run indefinitely, without error, and can carry through as many operations as are needed, to really enter this quantum practicality regime. Okay, they don't need to be really perfect, but they have to be, say, between 1,000 and 10,000 times higher quality than any of the qubits that we can operate today."
Quantum computing experts speaking at IQT Europe have compared the state of their industry to the rise of microcomputers in the 1970s. Back then, too, engineers achieved milestones -- some practical, some not, but all improbable -- only for someone to move the goalposts.