What the Google vs. IBM debate over quantum supremacy means

Google claimed in the journal Nature that it has achieved “quantum supremacy” over classical computers with its Sycamore chip. IBM contested the scope of Google’s achievement. Both are right, but you have to give the edge to Google.
Written by Tiernan Ray, Senior Contributing Writer

Everyone loves a battle between two worthy champions where both combatants come out on top. 

That's what happened in the past two weeks between Google and International Business Machines in the fight over what the term "quantum supremacy" means. Both sides showed intriguing, valuable research. 

In Google's case, it's about the physics of making a superior device. In the case of IBM, the company shows that "architecture," the design of a traditional computer system, still has amazing potential to advance computing.

To recap, Google last week revealed in the journal Nature the results of its "Sycamore" superconducting computer chip, which was able to sample the output of a quantum random number generator one million times in roughly three minutes, a task Google estimated would take 10,000 years using a conventional electronic, or "classical," computer. (Google also posted a blog item describing the work.)

Also: Google: We've made 'quantum supremacy' breakthrough with 54-qubit Sycamore chip 

A week before Google's report was published, IBM issued its own report, based on a leaked version of Google's research, stating that Google hadn't achieved quantum supremacy. IBM claimed that, in theory, a supercomputer using conventional electronics could do the task not in 10,000 years but in two and a half days.


A Google artist's rendering, on the left, of the Sycamore chip in action, and a photo of the Sycamore chip.


What's going on here? The debate is over what it means when you run an actual quantum computer, such as Sycamore, and compare it to a simulation of that quantum computer inside of a classical, electronic computer.

Quantum simulation software, such as Microsoft's LIQUi|⟩ program, allows a traditional computer to represent a quantum computer in ordinary circuitry, by translating quantum mechanics into mathematical structures, known as matrices of complex numbers (numbers that incorporate both real and imaginary numbers). 
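To make that translation concrete, here is a minimal sketch, in plain Python with NumPy rather than a dedicated tool like LIQUi|⟩, of how a classical program can represent one qubit and one gate as complex matrices:

```python
# A minimal sketch (not Google's or Microsoft's actual code) of simulating
# a quantum circuit classically with matrices of complex numbers.
import numpy as np

# A single qubit's state is a 2-element vector of complex amplitudes.
ket0 = np.array([1, 0], dtype=complex)          # the |0> state

# Quantum gates are unitary matrices; the Hadamard gate puts a qubit
# into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                                # applying a gate is matrix math

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)                                    # -> [0.5 0.5]
```

Every extra qubit doubles the length of that state vector, which is exactly why simulating a large chip such as Sycamore becomes so demanding for classical hardware.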

Also: Has Google really unlocked quantum supremacy? Not so fast, says IBM

With simulations, it's possible to compare how long it takes real quantum circuits to produce a given computation, and how long the same computation takes a classical computer to reproduce, by running the matrix math that resembles the functions of the quantum circuit. 

Google and IBM are both looking at such simulations, and they're taking different views as to what the comparison means. 

Google's point is that Sycamore is a device that does the work it takes millions of conventional processors to simulate. As the authors state, when they simulated even a simplified version of the random number generator on the classical computer, it "takes one million [conventional computing] cores 130 seconds, corresponding to a million-fold speedup of the quantum processor relative to a single core." Google ran its simulations on the Jülich supercomputer, in the German city of that name, and also on Google's own cloud computing clusters. 

So, Google is showing a better mousetrap, a device whose physics is simply superior to that of silicon circuits. 


Google's paper in Nature shows how much time it took a classical, electronic computer to simulate the quantum operations. On the left are all the simulations Google was able to accomplish, while on the right is where things got simply too complicated to simulate at all, Google claims. 


IBM, on the other hand, didn't run any actual simulations. Instead, the company came up with a model on paper, a theoretical estimate for how long it would take to simulate Sycamore on the Summit supercomputer at Oak Ridge National Laboratory. In other words, IBM has put together a thought experiment. 

IBM's insight is that rather than performing all that matrix math in DRAM, the math can be broken up into sub-tasks, and some sub-tasks can be stored on disk and fetched only when needed during the computation. It's rather like how computer systems have traditionally had to "page out" to disk when DRAM got filled up. 
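The paging idea can be shown with a toy example. The sketch below is a simplified illustration, not IBM's actual technique: it splits one large matrix product into row-block sub-tasks, writes each partial result to disk, and reassembles them, so only one block occupies memory at a time:

```python
# A toy illustration of trading DRAM for disk: compute a matrix product
# in row blocks, paging each partial result out to temporary files.
import numpy as np
import tempfile, os

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 8))
B = rng.standard_normal((8, 8))

tmpdir = tempfile.mkdtemp()
block = 2                                  # rows held in memory at once
for i in range(0, A.shape[0], block):
    partial = A[i:i + block] @ B           # one sub-task of the big product
    np.save(os.path.join(tmpdir, f"rows_{i}.npy"), partial)  # page it out

# Reassemble from disk and check against the all-in-memory product.
C = np.vstack([np.load(os.path.join(tmpdir, f"rows_{i}.npy"))
               for i in range(0, A.shape[0], block)])
print(np.allclose(C, A @ B))               # -> True
```

The answer is identical either way; what changes is how much of the intermediate work must sit in fast memory at any moment, which is the bottleneck IBM's estimate attacks.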

IBM is arguing that the architecture, the way a computer's resources such as chips and memory and storage are combined, can be done more intelligently to get around bottlenecks -- in this case, a lack of sufficient DRAM in each compute node with which to work on the intermediate products of matrix math. 

IBM has a whole laundry list of techniques, both architectural and algorithmic, including things such as the "exploitation of separable gates via a hyperedge representation of the tensor network." Quantum is filled with such heady stuff. 

So there you have it: Two perspectives that are both right. Google emphasizes the arrival of a new kind of device exploiting physics to go beyond the device capabilities of silicon transistors. IBM emphasizes that at any moment in time, it's always smart to architect a system to make the most of what resources you do have, be they compute, memory, or communications. 


A graphic from IBM's paper contesting Google's quantum supremacy. The graphic shows the tensors, the mathematical objects on which the classical simulation operates, being broken down into clusters of sub-tasks, an approach that IBM claims can dramatically speed up the classical computer. 


It's a triumph of science and engineering that two formidable organizations both offer terrific suggestions that are probably ultimately complementary. 

But what about that quantum supremacy question?

Well, you have to give the edge to Google. The point of quantum supremacy, or quantum advantage, if you prefer that term, is the distinction in practice between what can be done with a quantum system and what can be done with a classical system. 

That's the realm of complexity theory. University of Texas at Austin computer scientist Scott Aaronson, who was one of the outside reviewers of Google's paper, has made this point nicely in his writing. 

As Aaronson has written, fancy engineering doesn't resolve the matter of quantum over classical. "The exact complexity of a problem might depend on 'low-level encoding details,'" he has written, but "where a problem falls on the polynomial/exponential dichotomy can be shown to be independent of almost all such choices."

What he's saying is that you can go to great lengths to make a classical computer better, as IBM does, but that won't necessarily erase the essential difference between that classical machine and a superior system. (Aaronson is working on a paper about the potential near-term applications of Google's quantum random number generator, according to Google. Aaronson also wrote an opinion piece in The New York Times today celebrating the Google achievement.)

The Google authors hint at Aaronson's point toward the end of their paper, stating "We expect that lower simulation costs than reported here will eventually be achieved, but we also expect that they will be consistently outpaced by hardware improvements on larger quantum processors."

Seen in that light, IBM's argument for better architecture, as smart as it is, seems a little like how astronomers in the Middle Ages tried to preserve the idea of the sun circling the Earth. They kept adding epicycles to their models of the solar system to make the math match what they observed in the night sky. 

It was all very ingenious, but eventually, a new view, backed up by an irrefutable system of science, swept all that aside. 
