
IBM and CERN use quantum computing to hunt elusive Higgs boson

IBM and CERN are working to find out how quantum machine learning could help understand the fundamental laws of nature.
Written by Daphne Leprince-Ringuet, Contributor

Future quantum computers are likely to significantly boost scientists' understanding of the data produced by CERN's gigantic particle collider.

Image: CERN / Maximilien Brice

The potential of quantum computers is currently being discussed in settings ranging from banks to merchant ships, and now the technology has been taken even further afield – or rather, lower down.  

One hundred meters below the Franco-Swiss border sits the world's largest machine, the Large Hadron Collider (LHC), operated by the European Laboratory for Particle Physics, CERN. And to better understand the mountains of data produced by such a colossal system, CERN's scientists have been asking IBM's quantum team for some assistance.

The partnership has been successful: in a new paper, which is yet to be peer-reviewed, IBM's researchers have established that quantum algorithms can help make sense of the LHC's data, meaning that it is likely that future quantum computers will significantly boost scientific discoveries at CERN.  


Given that CERN's mission is to understand why anything in the universe happens at all, this could have big implications for anyone interested in all things matter, antimatter, dark matter and so on.

The LHC is one of CERN's most important tools for understanding the fundamental laws that govern the particles and forces that make up the universe. Shaped as a 27-kilometer-long ring, the system accelerates beams of particles such as protons and heavy ions to just below the speed of light, before smashing those beams together in collisions that scientists observe thanks to eight high-precision detectors positioned around the accelerator.

Every second, particles collide approximately one billion times inside the LHC, producing one petabyte of data that is currently processed by one million CPUs across 170 locations worldwide – a geographical spread necessary because such huge amounts of information cannot be stored in any one place.

It's not only about storing data, of course. All of the information generated by the LHC is then available to be processed and analyzed, for scientists to hypothesize, prove and discover.  

This is how, by smashing particles together and observing the results, CERN researchers discovered in 2012 the existence of an elementary particle called the Higgs boson, which gives all other fundamental particles mass – a discovery hailed as a major achievement in the field of physics.

Until now, the scientists have been using the best available classical computing tools to assist in their work. In practice, this means sophisticated machine-learning algorithms capable of sifting through the data produced by the LHC to distinguish useful collisions, such as those that produce Higgs bosons, from junk.

"Until now, scientists have been using classical machine-learning techniques to analyze raw data captured by the particle detectors, automatically selecting the best candidate events," wrote IBM researchers Ivano Tavernelli and Panagiotis Barkoutsos in a blog post. "But we think we can greatly improve this screening process – by boosting machine learning with quantum." 

As the volume of data grows, classical machine-learning models are fast approaching the limits of their capabilities, and this is where quantum computers are likely to play a useful part. The qubits that make up quantum computers can encode much richer states than classical bits, which means they can represent and handle many more dimensions than classical devices.

A quantum computer equipped with enough qubits, therefore, could in principle run extremely complex computations that would take centuries for classical computers to resolve. 

With this in mind, CERN partnered with IBM's quantum team as early as 2018, with the objective of finding out how exactly quantum technologies could be applied to advance scientific discoveries.

Quantum machine learning quickly emerged as a potential application. The approach consists of tapping qubits' capabilities to expand what is known as the feature space – the collection of features on which the algorithm bases its classification decisions. Using a larger feature space, a quantum computer can see patterns and carry out classification tasks even in a huge dataset where a classical computer might only see random noise.
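For readers who want to see the idea in code, here is a minimal sketch of a quantum feature map and kernel, written with the open-source Qiskit and qiskit-machine-learning packages. The ZZFeatureMap, the toy data and every parameter below are illustrative assumptions, not the configuration used in the IBM-CERN paper.

```python
# Minimal sketch of a quantum feature space, assuming the qiskit and
# qiskit-machine-learning packages are installed. The feature map and
# toy data are illustrative, not those used in the IBM-CERN study.
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel

# Eight toy "events", each described by two classical features.
X = np.random.default_rng(seed=0).uniform(0, np.pi, size=(8, 2))

# Each two-feature event is encoded into the state of two qubits; the
# entangling ZZ interactions push it into a feature space whose dimension
# grows exponentially with the number of qubits.
feature_map = ZZFeatureMap(feature_dimension=2, reps=2, entanglement="linear")

# Kernel entry K[i, j] is the overlap (fidelity) between the encoded states
# of events i and j -- the quantum analogue of a classical kernel function.
kernel = FidelityQuantumKernel(feature_map=feature_map)
K = kernel.evaluate(x_vec=X)
print(K.shape)  # (8, 8) symmetric kernel matrix
```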

Applied to CERN's research, a quantum machine-learning algorithm could sift through the LHC's raw data and recognize the signatures of Higgs boson events, for example, where classical computers might struggle to see anything at all.

IBM's team proceeded to create a quantum algorithm called a quantum support vector machine (QSVM), designed to identify collisions that produce Higgs bosons. The algorithm was trained with a test dataset based on information generated by one of the LHC's detectors, and was run both on quantum simulators and on physical quantum hardware. 
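As a rough, hedged illustration of what such a pipeline looks like in code, the sketch below trains Qiskit's QSVC (a support vector machine driven by a quantum kernel) on a synthetic two-class dataset that merely stands in for the LHC detector events; the actual study's feature maps, dataset and hardware runs were considerably more involved.

```python
# Hedged QSVM sketch, assuming qiskit-machine-learning and scikit-learn.
# The synthetic dataset below is a placeholder for the simulated LHC
# detector events used in the actual study.
import numpy as np
from sklearn.model_selection import train_test_split
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.algorithms import QSVC
from qiskit_machine_learning.kernels import FidelityQuantumKernel

rng = np.random.default_rng(seed=1)
X = rng.uniform(0, np.pi, size=(40, 3))        # 40 toy events, 3 features each
y = (np.sin(X).sum(axis=1) > 1.5).astype(int)  # arbitrary signal/background labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1
)

# A QSVM is a classical support vector machine whose kernel is computed
# from the overlap of quantum-encoded data points (simulated here).
kernel = FidelityQuantumKernel(feature_map=ZZFeatureMap(feature_dimension=3, reps=2))
qsvc = QSVC(quantum_kernel=kernel)
qsvc.fit(X_train, y_train)
print("test accuracy:", qsvc.score(X_test, y_test))
```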

In both cases, the results were promising. The simulation study, which ran on Google TensorFlow Quantum, IBM Quantum and Amazon Braket, used up to 20 qubits and a 50,000-event dataset, and performed as well as, if not better than, classical counterparts running the same problem.

The hardware experiment was run on IBM's own quantum devices using 15 qubits and a 100-event dataset, and results showed that, despite the noise affecting quantum calculations, the quality of the classification remained comparable to the best classical simulation results. 

"This once again confirms the potential of the quantum algorithm for this class of problems," wrote Tavernelli and Barkoutsos. "The quality of our results points towards a possible demonstration of a quantum advantage for data classification with quantum support vector machines in the near future." 

That is not to say that the advantage has been proven yet. The quantum algorithm developed by IBM performed comparably to classical methods on the limited quantum processors that exist today – but those systems are still in their very early stages. 

And with only a small number of qubits, today's quantum computers cannot yet carry out computations at a genuinely useful scale. They also remain hampered by the fragility of qubits, which are highly sensitive to environmental changes and still prone to errors.

Rather, IBM and CERN are banking on future improvements in quantum hardware to demonstrate tangibly, and not only theoretically, that quantum algorithms have an advantage.   

"Our results show that quantum machine-learning algorithms for data classification can be as accurate as classical ones on noisy quantum computers – paving the way to the demonstration of a quantum advantage in the near future," concluded Tavernelli and Barkoutsos. 

CERN scientists certainly have high hopes that this will be the case. The LHC is currently being upgraded, and the next iteration of the system, due to come online in 2027, is expected to produce ten times as many collisions as the current machine. The volume of data that is generated is only going one way – and it won't be long before classical processors are unable to manage it all. 
