
IBM’s quantum announcement is a big step in a 1,000-mile journey

Does IBM’s announcement of its new quantum computing system mean that quantum computing must be on your technology roadmap?
Written by Forrester Research, Contributor

IBM unveiled a complete "quantum computing system," IBM Q System One, last week. What's more, it chose to do it at CES in Las Vegas. Should you take IBM's claim of a "commercial" quantum computing system to mean that it is time to put quantum on your tech road map? The short answer is "no" -- but, as always, there are exceptions. Since there is a lot of confusion about quantum computers (QCs), we think the market needs help to stay grounded in reality.

Here is my take on the announcement: It is a big step forward, but QCs are taking baby steps and the journey is a thousand miles long. To understand this, you need to let go of a bunch of assumptions. After that, you need to appreciate why IBM's announcement is significant without losing sight of the long road ahead of us. Let me break these down for you briefly.

Start By Letting Go Of Your (Digital) Assumptions

As I've been diving deeper into quantum computing, I've discovered that I have been making a lot of assumptions because the things that work for classical computing turn out to be different for QCs. Let me help you avoid the same mistakes.

  • Universal QCs don't have RAM or CPU clocks. Digital computing power is measured by how many 1s and 0s you can process and how quickly sets of bits move in and out of CPUs, with RAM as a storage buffer. QCs work with quantum states that mix 1 and 0; however, there is no way to write these rich, complex intermediate states to "quantum RAM." Thus, you can't think of QCs as working the way digital computers do, because QCs cannot do sequential processing.
  • Universal QCs today are batch problem solvers, not general-purpose programmable systems. We naturally assume that a universal QC is programmable and can do many different things using instructions, loops, or object calls. In fact, QCs require a completely different approach to programming logic. Without RAM, QCs are more like batch problem solvers: Digital data goes in, quantum magic happens, and digital data comes out representing a probable answer to the input problem (see the sketch after this list). This is good for computationally intensive tasks in optimization, materials and chemistry, some machine-learning tasks, engineering, etc. It's not good for software as we typically understand it.
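
To make the "batch problem solver" picture concrete, here is a minimal sketch of the workflow, assuming IBM's open-source Qiskit SDK and its Aer simulator (my choice of tooling for illustration; the circuit has nothing to do with Q System One itself). Classical data describes a tiny circuit, the whole job runs as a batch, and what comes back is a distribution over classical bitstrings -- probable answers, not a deterministic result.

```python
# A minimal sketch of the batch model, assuming Qiskit and qiskit-aer are
# installed (pip install qiskit qiskit-aer). Illustrative only.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# Digital data in: a classical description of a small two-qubit circuit.
circuit = QuantumCircuit(2)
circuit.h(0)           # put qubit 0 into a superposition of 0 and 1
circuit.cx(0, 1)       # entangle qubit 1 with qubit 0
circuit.measure_all()  # collapse the quantum state back to classical bits

# "Quantum magic happens": the job is submitted and executed as one batch.
backend = AerSimulator()
job = backend.run(transpile(circuit, backend), shots=1024)

# Digital data out: a distribution over bitstrings, i.e., probable answers.
print(job.result().get_counts())  # e.g., {'00': 518, '11': 506}
```

Notice that there is no loading of intermediate results into memory and no branching mid-computation; if the answer isn't good enough, you adjust the circuit and submit another batch.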

Why Q System One Is A Big Deal (And Why It's Not)

With this information as background, you can start to see why IBM's announcement is significant and why it's not the bombshell that some media outlets imply. First, because it's a completely new computing paradigm, it's best to think of our current state as somewhere between Turing's WWII machine and the first mainframe. There is a lot of system and software engineering foundation that needs to be developed and given time to mature.

Here is what I think is significant about IBM's announcement and what is perhaps a bit of hype:

  • Significant: It's a system. This is the most important thing about IBM's announcement. Up until now, quantum computing systems have been stitched together the same way we built computers in our garages back in the 1980s. IBM is claiming that it now has a fully integrated computing system. This is very important, because instead of optimizing components, IBM can now start to optimize the system as a whole. Until we can engineer integrated quantum computing systems, we really can't drive consistent increases in power.
  • Significant: It's modular. Not only is it a system; IBM claims that it's modular. That means components can be upgraded separately, which is big because it starts to make the quantum system commercially viable. If you want a more powerful version, you won't have to buy a whole new multimillion-dollar box. Given the exponential effects that could dramatically increase quantum computing's power over the next few years, being able to swap in new components to boost performance is a good thing.
  • Hype: It's the "first commercial QC." This is where I urge caution. IBM likes to tout being the first at things. And many times, it is -- but its claims also rev up the market, sometimes too far in advance. I can see the sense in IBM's claim because (A) you can buy it if you have deep pockets and (B) its system approach and modular design make it more commercially viable. However, IBM's claim can also be read to mean that QCs are ready for prime-time problem solving. That's far from true, as I cover next.

How Far We Have To Go

We have a long way to go before quantum computers can solve real problems that businesses have. No QC today (universal or annealing) can do anything better than a digital computer simulating a quantum computer. That is very, very important.
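
To see why, remember that an n-qubit state is just a list of 2^n complex amplitudes and that quantum gates are matrix multiplications, so an ordinary laptop can brute-force a handful of qubits. Here is a minimal sketch of that idea in plain Python with NumPy (my own illustration, not anything from IBM's announcement):

```python
# A digital computer "simulating a quantum computer": a two-qubit state is a
# vector of 2**2 = 4 complex amplitudes, and gates are matrix multiplications.
# My own illustration, assuming only NumPy.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # flips qubit 1 when qubit 0 is 1

state = np.zeros(4, dtype=complex)
state[0] = 1.0                                 # start in |00>
state = np.kron(H, I) @ state                  # Hadamard on qubit 0
state = CNOT @ state                           # entangle the two qubits

probs = np.abs(state) ** 2                     # amplitudes -> probabilities
samples = np.random.choice(['00', '01', '10', '11'], size=1024, p=probs)
values, counts = np.unique(samples, return_counts=True)
print(dict(zip(values, counts)))               # roughly half '00', half '11'
```

The catch is that the state vector doubles with every added qubit, so brute-force simulation runs out of memory somewhere around 50 qubits, even on supercomputers. Today's real QCs sit well inside the range their own simulations can cover.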


What's more:

  • Researchers don't know when QCs will be commercially useful. The technical term is "achieving quantum value." This means that on a cost-for-performance basis, QCs can solve a problem more economically than a classical computer. All the work being done today is searching for quantum algorithms that might someday (hopefully sooner rather than later) be commercially viable. In our report, "A First Look At Quantum Computing," we estimate that it will be three to five years before we see mainstream applications exploiting QCs to solve a few select business problems.
  • It's likely to take 10 to 20 years before QCs can break PKI encryption. The holy grail of quantum computing is something called error-corrected universal quantum computers. Today, universal QC qubits only stay stable for a short time. For the reasons mentioned above, you must solve your problem in that time or go back and start over. If vendors can design in sufficient redundancy, they can theoretically create a stable, logical, error-corrected qubit (see the sketch after this list). For reasons I lay out in my report, "Quantum Is Not An Immediate Security Threat," this is likely to take 10 years. Until then, we won't have QCs that can factor numbers large enough to threaten PKI encryption.
  • Quantum machine learning is still largely theory. One of the most exciting use cases for QC is its application to improve machine learning. However, machine learning today involves a lot of optimization and vector math operations that QCs, because of their limitations, don't do very well. Experts are searching for quantum machine-learning algorithms that replace or supplement traditional iterative approaches, but this work is very early-stage. My point is: Researchers don't know exactly how transformational QC will be with AI. On the other hand, Google's quantum moonshot is called the Google Quantum AI Lab, so we know what the tech giant thinks.
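
On the error-correction point above, a back-of-the-envelope sketch shows why redundancy buys stability. I'm using a classical repetition-code analogy purely for intuition -- real quantum error correction relies on schemes such as the surface code, and the 1% error rate below is an assumed, illustrative number. If each physical qubit errs with probability p, a majority vote across n redundant copies only fails when more than half of them err at once:

```python
# Back-of-the-envelope intuition for error correction via redundancy.
# Classical repetition-code analogy only; real QCs use quantum codes.
from math import comb

def logical_error_rate(p: float, n: int) -> float:
    """Probability that a majority of n copies err, given per-copy error rate p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.01  # assumed 1% physical error rate per step (illustrative)
for n in (1, 3, 5, 7):
    print(f"{n} copies -> logical error rate ~{logical_error_rate(p, n):.1e}")
# 1 copy: 1.0e-02, 3 copies: 3.0e-04, 5 copies: 9.9e-06, 7 copies: 3.4e-07
```

Each extra layer of redundancy drives the logical error rate down sharply. That is the intuition behind building one stable, error-corrected logical qubit out of many noisy physical ones -- and why doing so takes far more physical qubits than any machine has today.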

What should the smart CIO do? First, read our previous research to get your team smarter, then give us a call to help you identify opportunities to investigate further.


-- By Brian Hopkins

For more from Forrester on emerging tech, click here.

This post originally appeared here.
