Physicists Christian Bauer, Marat Freytsis and Benjamin Nachman of Lawrence Berkeley National Laboratory have used an IBM Q quantum computer, accessed through the Oak Ridge Leadership Computing Facility’s Quantum Computing User Program, to capture part of the computation of two protons colliding. The calculation shows the probability that an outgoing particle will emit additional particles.
In their recent paper, published in Physical Review Letters, the researchers describe how they used a method called effective field theory to break their full theory into components. They then developed a quantum algorithm that allows some of these components to be computed on a quantum computer while leaving the other computations to classical computers.
“For a theory that is closer to nature, we showed how it would work in principle. Then we took a very simplified version of that theory and did an explicit calculation on a quantum computer,” Nachman said.
The Berkeley Lab team aims to uncover insights about nature’s smallest building blocks by observing high-energy particle collisions in laboratory environments, such as the Large Hadron Collider in Geneva, Switzerland. The team is tracing what happens in these collisions, using calculations to compare predictions with actual collision debris.
“One of the difficulties of these types of calculations is that we want to describe a large range of energies,” Nachman said. “We aim to describe everything from the highest-energy processes down to the lowest-energy processes by analyzing the associated particles that fly into our detector.”
Solving these types of calculations on a quantum computer alone would require far more qubits than today’s quantum computing resources provide. The team can calculate these problems on classical systems using approximations, but those approximations neglect important quantum effects. The team therefore aimed to separate the computation into parts suited to classical systems and parts suited to quantum computers.
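The article does not spell out how the computation is divided, but the divide-and-conquer idea can be illustrated with a toy sketch. Everything below is an assumption for illustration only (the function names, formulas, and numbers are placeholders, not the team’s actual effective-field-theory factorization): an observable is split into a high-energy piece that classical methods handle well and a low-energy piece, with genuine quantum interference, that would be delegated to a quantum computer.

```python
import math

def classical_hard_factor(energy_scale: float) -> float:
    """High-energy piece: easy for classical perturbative methods.
    Placeholder formula, not a real physics result."""
    return 1.0 / (1.0 + energy_scale)

def quantum_soft_factor(coupling: float) -> float:
    """Low-energy piece: stands in for the part run on a quantum
    computer, where interference effects matter. Placeholder formula."""
    return math.cos(coupling) ** 2

def observable(energy_scale: float, coupling: float) -> float:
    # The full prediction combines the two independently computed pieces.
    return classical_hard_factor(energy_scale) * quantum_soft_factor(coupling)

print(observable(10.0, 0.3))
```

The point of the split is that each factor can be computed with whichever hardware suits it, and the results recombined classically at the end.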
The team conducted experiments on the IBM Q through the OLCF’s QCUP program at the US Department of Energy’s Oak Ridge National Laboratory to verify that their quantum algorithms reproduced the expected results at small scales, where the answers can still be calculated and confirmed with classical computers.
“This is a hugely significant step for us,” Nachman said. “It’s important that we not only describe the properties of these particles theoretically but actually implement a version of the calculation on a quantum computer. When you move to a quantum computer, there are a lot of challenges that are not present theoretically. Our algorithm scales, so when we get more quantum resources, we’ll be able to do calculations that we couldn’t do classically.”
The team also aims to make today’s quantum computers usable for the kind of science they hope to do. Quantum computers are noisy, and that noise introduces errors into calculations, so the team also deployed error-mitigation techniques they had developed in previous work.
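The article does not name the team’s specific error-mitigation techniques. One widely used approach in this field is zero-noise extrapolation: run the circuit at deliberately amplified noise levels, then extrapolate the measurements back to the zero-noise limit. The sketch below illustrates the idea with an assumed toy noise model (the exponential decay and the numbers are hypothetical, not measurements from the team’s hardware):

```python
import math

# Assumed toy noise model for illustration: the measured expectation
# value of some observable decays exponentially with noise strength.
# TRUE_VALUE is what an ideal, noiseless device would return.
TRUE_VALUE = 0.75
DECAY = 0.12

def noisy_expectation(scale: float) -> float:
    """Simulated measurement with the noise amplified by `scale`.
    On real hardware, amplification is done e.g. by gate folding."""
    return TRUE_VALUE * math.exp(-DECAY * scale)

# Measure at noise scales 1x, 2x, 3x ...
v1, v2, v3 = (noisy_expectation(s) for s in (1.0, 2.0, 3.0))

# ... then Richardson-extrapolate the equally spaced points back to
# zero noise (equivalent to fitting a quadratic through the three
# measurements and evaluating it at scale = 0).
zero_noise_estimate = 3.0 * v1 - 3.0 * v2 + v3

print(f"raw measurement at 1x noise: {v1:.4f}")
print(f"zero-noise estimate:         {zero_noise_estimate:.4f}")
```

In this toy model the raw 1x measurement is off by roughly 11%, while the extrapolated estimate lands within a fraction of a percent of the true value.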
Next, the team hopes to add more dimensions to the problem, dividing their space into a finer grid of points and increasing the size of the problem. Eventually, they hope to perform calculations on quantum computers that are not possible on classical computers.
“The quantum computers available through ORNL’s IBM Q agreement have about 100 qubits, so we should be able to reach larger system sizes,” Nachman said.
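A rough sense of why grid resolution and dimensionality matter for qubit counts can be sketched with back-of-the-envelope arithmetic. The encoding below is an assumption for illustration (the team’s actual encoding and qubits-per-site figure are not given in the article): registering a field on a d-dimensional lattice with a fixed number of qubits per lattice site means the qubit count grows like the number of sites, i.e. exponentially in the dimension.

```python
# Hedged back-of-the-envelope sketch, not the team's actual encoding:
# assume a field discretized on a `dims`-dimensional lattice with
# `sites` points per dimension and a few qubits per lattice point.

def qubits_needed(sites: int, dims: int, qubits_per_site: int = 3) -> int:
    """Total qubits to register a field on a sites**dims lattice."""
    return qubits_per_site * sites ** dims

# Growing either the resolution or the dimensionality quickly
# exhausts a ~100-qubit machine like those mentioned above.
for dims in (1, 2, 3):
    for sites in (2, 4, 8):
        total = qubits_needed(sites, dims)
        verdict = "fits" if total <= 100 else "exceeds"
        print(f"{sites}^{dims} lattice -> {total:5d} qubits ({verdict} 100)")
```

Even under these generous toy assumptions, a modest 8×8×8 grid already demands orders of magnitude more qubits than current hardware offers, which is why larger machines directly translate into larger accessible system sizes.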
The researchers also hope to relax their approximations and move toward physics problems closer to nature, so that their calculations become more than a proof of concept.
The team performed the IBM Q calculations with funding from the DOE Office of Science’s Office of High Energy Physics as part of the Quantum Information Science Enabled Discovery (QuantISED) program.