AWS's new quantum computing center aims to build a large-scale superconducting quantum computer

The cloud computing giant has joined Google and IBM in the race to build a universal, fault-tolerant quantum computer.
Written by Daphne Leprince-Ringuet, Contributor

An AWS quantum hardware engineer works on a dilution refrigerator. The performance of superconducting quantum devices relies heavily on precise wiring configurations to minimize thermal fluctuations that contribute to noise.

Image: AWS

Cloud computing giant AWS is expanding its presence in the quantum industry with the opening of a shiny new Center for Quantum Computing in California. Here, top academics and engineers will be working to build the company's very own superconducting quantum computer.

Located northeast of Los Angeles in Pasadena, the two-storey, 21,000-square-foot facility was first announced by AWS in 2019 and has been built over the past two years in partnership with the neighbouring California Institute of Technology (Caltech). Caltech researchers will be part of the center's technical team, and together with experts from Amazon and from academic institutions, will lead AWS's efforts to build a large-scale, fault-tolerant quantum computer.

The new building includes office space for quantum research teams, as well as laboratories equipped with the specialized tools -- ranging from cryogenic cooling systems to wiring -- needed to build quantum hardware. 

SEE: What is quantum computing? Everything you need to know about the strange world of quantum computers

The launch of the AWS Center for Quantum Computing sees Amazon reiterating its ambition to take a leading role in the field of quantum computing, which is expected to one day unleash unprecedented amounts of compute power. Experts predict that quantum computers, when they are built to a large enough scale, will have the potential to solve problems that are impossible to run on classical computers, unlocking huge scientific and business opportunities in fields like materials science, transportation or manufacturing.

There are several approaches to building quantum hardware, all relying on different methods to control and manipulate the building blocks of quantum computers, called qubits. AWS has announced that the company has chosen to focus its efforts on superconducting qubits -- the same method used by rival quantum teams at IBM and Google, among others. 

AWS reckons that superconducting processors have an edge on alternative approaches: "Superconducting qubits have several advantages, one of them being that they can leverage microfabrication techniques derived from the semiconductor industry," Nadia Carlsten, head of product at the AWS Center for Quantum Computing, tells ZDNet. "We can fab large numbers of qubits on a silicon wafer and do it in a repeatable way, and that scalability will be important."

Scaling the hardware is one of the primary areas of focus across the quantum industry. The technology is still in its infancy, with the majority of quantum processors supporting only a few dozen qubits. IBM's most advanced superconducting quantum system, for example, is limited to 65 qubits.

But quantum computers that can solve problems with societal and commercial value will require processors that can support millions of qubits. This is the goal that AWS is setting for itself: the company is promising a "bold approach" that will deliver a system that can execute algorithms requiring billions of quantum gate operations.

"It's a big challenge we're taking on," says Carlsten. "We have to scale quantum systems in size and learn new clever ways to control these bigger systems, but we also have to do this in a way to keep the noise in check so that the error rates are low enough to allow computations requiring very large number of gate operations."


The AWS Center for Quantum Computing on the Caltech campus.

Image: AWS

Error rates are one of the main obstacles to scaling today's quantum computers. This is because qubits are very fragile: even a slight perturbation from the surrounding environment can knock them out of the delicate quantum state that quantum computation relies on. This phenomenon is known as decoherence, and it is responsible for the high error rates that riddle the calculations carried out by existing quantum processors.

Classical computers, for instance, experience error rates on the order of one error per billion operations, while today's quantum computers suffer roughly one error in every thousand operations.

This has given rise to a research field known as quantum error correction (QEC), which is dedicated to protecting quantum information from decoherence. One way to carry out QEC consists of using many imperfect qubits (called 'physical qubits') to form one controllable qubit (called the 'logical qubit'), which encodes the quantum information and can be used to detect and correct errors.

But QEC creates a large hardware overhead in that many physical qubits are required to encode every logical qubit: according to Carlsten, each protected qubit typically requires 1,000 physical qubits. This makes it even harder to build a universal quantum computer comprising large-scale qubit circuits. 
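
The simplest way to see the idea is the textbook three-qubit bit-flip 'repetition' code, in which three physical qubits protect a single logical bit value against one flip error. It is a far cry from the hardware-efficient schemes AWS is pursuing, but it shows the principle. The sketch below -- an illustrative example, not AWS's own error-correction scheme -- builds it with the open-source Amazon Braket SDK and runs it on the SDK's local simulator, which needs no AWS account.

```python
# Three-qubit bit-flip repetition code: three physical qubits protect one logical bit.
# Illustrative textbook example only -- not the hardware-efficient scheme AWS is researching.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

circuit = Circuit()
circuit.x(0)            # prepare the logical value |1> on qubit 0
circuit.cnot(0, 1)      # encode: spread the value across two extra physical qubits
circuit.cnot(0, 2)
circuit.x(1)            # inject a bit-flip error on one physical qubit
circuit.cnot(0, 1)      # decode: write the parity checks onto qubits 1 and 2
circuit.cnot(0, 2)
circuit.ccnot(1, 2, 0)  # majority vote: flip qubit 0 back only if both checks fire

result = LocalSimulator().run(circuit, shots=1000).result()
print(result.measurement_counts)  # e.g. Counter({'110': 1000}): qubit 0 (first bit) reads 1 every time
```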

AWS sees research in QEC as the key to solving many of the scaling problems that are crippling quantum computing. "One of the things our team of experts at the AWS Center for Quantum Computing is focused on is how to implement quantum error correction in a way that is hardware-efficient, which drastically reduces the number of physical qubits needed," says Carlsten. 

Earlier this year, the company released its first research paper detailing a blueprint for a new approach to QEC, which the scientists said could improve error correction with fewer physical qubits. The paper proposed an architecture in which just over 2,000 superconducting components used for stabilization could produce a hundred logical qubits capable of executing 1,000 gates.
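
To put those figures side by side -- a rough, order-of-magnitude comparison only, since the blueprint counts superconducting 'components' rather than physical qubits -- the conventional overhead Carlsten cites would imply roughly 100,000 physical qubits for the same 100 logical qubits:

```python
# Side-by-side of the figures quoted in this article. Illustrative arithmetic only:
# the blueprint counts "components" rather than physical qubits, so this is an
# order-of-magnitude comparison, not a like-for-like one.
logical_qubits = 100              # logical qubits targeted by the AWS blueprint
conventional_overhead = 1_000     # ~1,000 physical qubits per logical qubit (Carlsten)
blueprint_components = 2_000      # "just over 2,000" components in the AWS paper

conventional_total = logical_qubits * conventional_overhead
print(f"Conventional estimate: {conventional_total:,} physical qubits")  # 100,000 physical qubits
print(f"AWS blueprint:         {blueprint_components:,} components")     # 2,000 components
```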

The blueprint was purely theoretical, and many challenges remain to prove that the architecture could take shape as a physical device. Still, Carlsten argues that there is reason to be optimistic. "By cutting down the number of physical qubits required, we're also cutting down on the scale of the supporting systems required to control the processor." 

SEE: Quantum computers could read all your encrypted data. This 'quantum-safe' VPN aims to stop that

From an engineering perspective, therefore, this approach makes a large-scale quantum computer a more realistic proposition.  

It's still very early days for AWS's quantum computing initiative, with most of the company's work in the field still theoretical, and huge challenges lie ahead. Carlsten acknowledges that her teams are only just getting started and that progress will take several years. 

"It is a massive challenge, but one that we think we're well-positioned to take on," says Carlsten. 

Eventually, AWS wants its quantum hardware to be available in the cloud for the company's customers to use through Amazon Braket -- the fully managed, cloud-based quantum computing service launched in 2019 that lets customers access quantum computers from third-party hardware providers. Carlsten, however, did not provide a roadmap or timeline for the company's next quantum achievements.
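
For a flavour of what that customer experience already looks like, the sketch below submits a small circuit to Braket's managed SV1 simulator through the service's Python SDK. The S3 bucket name is a placeholder, an AWS account with Braket access is assumed, and third-party QPUs are addressed in the same way via their device ARNs.

```python
# Minimal Amazon Braket example: run a two-qubit Bell-state circuit on the managed
# SV1 simulator. Assumes an AWS account with Braket enabled; the S3 bucket below
# is a placeholder, and running managed devices may incur charges.
from braket.aws import AwsDevice
from braket.circuits import Circuit

bell = Circuit().h(0).cnot(0, 1)  # build a two-qubit Bell-state circuit

# Pick a Braket device by its ARN (here the SV1 state-vector simulator).
device = AwsDevice("arn:aws:braket:::device/quantum-simulator/amazon/sv1")

# Submit the task; results are written to the caller's S3 bucket (placeholder name).
task = device.run(bell, ("amazon-braket-example-bucket", "results"), shots=1000)
print(task.result().measurement_counts)  # roughly 50/50 between '00' and '11'
```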

Making its own quantum hardware available this way would allow the cloud giant to catch up with IBM and Google, which both already offer their quantum hardware through their own clouds. IBM has even started deploying quantum computers outside the lab and directly into select customers' data centers.
