Quantum computing for HEP

The Promise of Quantum Computing for Understanding QCD

The Standard Model of Particle Physics encapsulates the vast majority of our understanding of the fundamental forces in our Universe. While it has been incredibly successful, the Standard Model remains incomplete. For example, the existence of Dark Matter, the matter-antimatter asymmetry, and neutrino masses all require physics beyond the Standard Model. Gaining insight into these phenomena is the preeminent challenge facing high-energy physics (HEP) today. Most experiments depend in one way or another on non-perturbative effects in quantum chromodynamics (QCD), complicating the direct interpretation of potential results.

With an inherently different computational strategy, quantum computers hold the promise of simulating quantum field theories, potentially unlocking physics inaccessible to classical computers. This would reshape our understanding of strong interactions within matter and our ability to interpret experimental measurements, allowing for new insights into our Universe.

The LBNL group, led by Christian Bauer, aims to develop the theoretical and algorithmic tools necessary to simulate non-perturbative QCD dynamics on currently existing NISQ devices as well as on future fault-tolerant quantum computers. Combining insights from effective field theories, lattice field theory, general techniques in quantum algorithms, and different quantum hardware platforms, the group has developed several foundational techniques required for such quantum simulations.

Hamiltonian Lattice Gauge Theory

The only first-principles approach to calculating non-perturbative effects originating from the strong interaction is lattice QCD (LQCD). Traditional LQCD discretizes space and time onto a hypercubic lattice, and then computes correlation functions in QCD by performing path integrals in this discretized space-time. Since the required integrals are of extremely high dimensionality, LQCD employs Monte Carlo integration, which samples from the space of allowed field configurations. Despite its successes, LQCD has several fundamental limitations. The main reason for these limitations is that Monte Carlo integration requires a positive-definite integrand, which is obtained only when the theory is formulated in Euclidean, rather than Minkowski, space. Working in Euclidean space severely limits the observables that can be computed within LQCD.
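
To make the role of the positive-definite integrand concrete, here is a deliberately oversimplified sketch (a single toy field variable with a made-up action; all names and parameters are hypothetical, not from the group's work). The real, positive Euclidean weight exp(-S) can be sampled as a probability, while the Minkowski weight exp(iS) is a pure phase whose average suffers cancellations that worsen rapidly with system size.

```python
import numpy as np

rng = np.random.default_rng(0)

def S(phi):
    """Toy single-site action (hypothetical, for illustration only)."""
    return 0.5 * phi**2 + 0.25 * phi**4

# Euclidean weight exp(-S) is real and positive: it can be used as a Monte
# Carlo probability, here via simple rejection sampling.
samples = []
while len(samples) < 5000:
    phi = rng.uniform(-3, 3)
    if rng.uniform() < np.exp(-S(phi)):
        samples.append(phi)
print("Euclidean <phi^2> =", np.mean(np.array(samples) ** 2))

# Minkowski weight exp(i*S) is a pure phase: it cannot serve as a probability.
# Averaging the phase over uniformly drawn configurations of N sites shows how
# the cancellations grow with system size (the essence of the sign problem).
for n_sites in (1, 2, 4):
    phi = rng.uniform(-3, 3, size=(20000, n_sites))
    avg_phase = np.mean(np.exp(1j * S(phi).sum(axis=1)))
    print(f"N = {n_sites}  |<exp(i S)>| = {abs(avg_phase):.3f}")
```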

These limitations can be overcome by working with Hamiltonian Lattice Gauge Theory (HLGT). While HLGT was developed more than 50 years ago, its central challenge is the exponential growth of the Hilbert space with the number of lattice sites, rendering classical solutions of high-energy processes impractical. Quantum computers have been shown to be exponentially more efficient at simulating HLGTs, which has prompted a large research effort in this area.
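
The scaling can be made concrete with a short sketch. Assuming each gauge link is truncated to some d-dimensional local Hilbert space (d is a placeholder set by the chosen truncation, not a value from the group's papers), the classical state-vector size grows as d^N with the number of links N, while the qubit count at the same truncation grows only linearly.

```python
import math

# Hypothetical local truncation: each link carries a d-dimensional Hilbert space.
d = 9  # e.g. keeping a handful of electric-field / irrep states per link

for n_links in (4, 16, 64, 256):
    dim = float(d) ** n_links                    # classical state-vector size
    qubits = n_links * math.ceil(math.log2(d))   # qubits at the same truncation
    print(f"links = {n_links:3d}  dim = {dim:.3e}  qubits ~ {qubits}")
```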

One pillar of the LBNL group's recent work has been a careful study of different formulations of HLGTs, resulting in a variety of novel bases for the Hilbert space and truncation schemes that make it possible to simulate gauge theories in previously inaccessible regimes. The developments on this front include the loop-string-hadron basis for both SU(2) and SU(3) gauge theories, magnetic and mixed bases for U(1) and SU(2) gauge theories, as well as irrep bases for SU(3).
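
As one deliberately simple example of what a basis-plus-truncation choice looks like in practice, the sketch below builds a single U(1) link in the electric eigenbasis, truncated to integer electric fields |E| <= Lambda, and diagonalizes a toy electric-plus-magnetic Hamiltonian. The cutoff Lambda, the coupling g, and the single-link form of the magnetic term are all illustrative assumptions, not the bases listed above.

```python
import numpy as np

Lambda = 3   # electric-field truncation, |E| <= Lambda (assumed)
g = 1.0      # gauge coupling (assumed)

# Electric basis states |E> with E = -Lambda, ..., +Lambda
E_vals = np.arange(-Lambda, Lambda + 1)
dim = len(E_vals)

# Electric energy: diagonal in this basis, (g^2 / 2) * E^2
H_E = 0.5 * g**2 * np.diag(E_vals.astype(float) ** 2)

# Link operator U: lowers E by one unit, truncated at the cutoff
U = np.diag(np.ones(dim - 1), k=-1)

# A plaquette-like magnetic term built from U + U^dagger (single-link toy)
H_B = -(1.0 / (2.0 * g**2)) * (U + U.conj().T)

H = H_E + H_B
eigvals = np.linalg.eigvalsh(H)
print("dim =", dim, " lowest eigenvalues:", np.round(eigvals[:3], 4))
```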

Recently, the group developed the large-Nc expansion for SU(3) Yang-Mills theory. This work lays the foundation for a systematic expansion of the QCD Hamiltonian in this limit and showed that at leading order in the expansion the theory simplifies dramatically. Besides constructing the required Hilbert space and Hamiltonian in this limit, the LBNL group showed how to perform such a simulation on IBM's quantum hardware, carrying out the first simulation of SU(3) gauge theory on a non-trivial two-dimensional lattice.
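
Simulations of this kind typically proceed by Trotterizing time evolution under the lattice Hamiltonian. The sketch below shows a schematic first-order Trotter step for a hypothetical two-qubit Hamiltonian with a diagonal "electric" part and an off-diagonal "plaquette-like" part; the Pauli decomposition, couplings, and step size are placeholders and this is not the group's actual SU(3) circuit.

```python
from qiskit import QuantumCircuit

# Hypothetical Hamiltonian H = a*(Z0 + Z1) + b*X0*X1; a, b, dt are placeholders.
a, b, dt = 1.0, 0.5, 0.1

def trotter_step(qc: QuantumCircuit) -> None:
    """Append one first-order Trotter step approximating exp(-i H dt)."""
    # exp(-i a Z dt) on each qubit: the diagonal "electric" part
    qc.rz(2 * a * dt, 0)
    qc.rz(2 * a * dt, 1)
    # exp(-i b X0 X1 dt): the "plaquette" part via basis change and CNOT conjugation
    qc.h([0, 1])
    qc.cx(0, 1)
    qc.rz(2 * b * dt, 1)
    qc.cx(0, 1)
    qc.h([0, 1])

qc = QuantumCircuit(2)
for _ in range(5):       # five Trotter steps, total evolution time 5*dt
    trotter_step(qc)
qc.measure_all()
print(qc.draw(output="text"))
```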

Computing Effective Field Theory Matrix Elements

The LBNL group has deep expertise in effective field theory and soft-collinear effective theory (SCET), and in particular in how effective field theories can be used to identify observables that provide quantum advantage for the most important HEP problems. They are one of the only groups in the world to have used EFTs to identify an HEP-relevant observable that maximizes sensitivity to non-perturbative physics on quantum computers, and then to devise and execute a quantum algorithm for an actual simulation on quantum hardware. While this calculation was performed in a toy model on a rather small lattice, the explicit quantum algorithms were shown to run on IBMQ hardware and produce results in agreement with theoretical expectations. This will serve as the foundation for further studies.

Quantum Parton Showers

For events with a high multiplicity of final-state particles, theoretical predictions typically rely on parton shower algorithms. The probabilistic nature of these algorithms severely limits the accuracy with which predictions can be made. In many cases these limitations are not fundamental, but arise because the computational complexity grows exponentially with the number of final-state particles. Given that quantum algorithms have been shown to provide exponential speedups over classical calculations in many cases, a focus of our research has been to study how quantum algorithms can be used to simulate high-multiplicity events without this exponential growth.
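
To illustrate what "probabilistic" means here, the sketch below samples a deliberately simplified classical shower: a single emitting line either radiates or not at each evolution step with a fixed probability. The splitting probability and step count are placeholders, not a realistic QCD shower; the key point is that each history is sampled as a probability, so interference between different emission histories is never summed coherently.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_shower(n_steps: int = 40, p_emit: float = 0.1) -> int:
    """Sample the number of emissions from one emitter (toy model).

    Each evolution step emits with probability p_emit; this stands in for
    Sudakov-based sampling, with all parameters purely hypothetical.
    """
    return int((rng.uniform(size=n_steps) < p_emit).sum())

# The classical algorithm tracks probabilities, not amplitudes, history by history.
multiplicities = [toy_shower() for _ in range(10000)]
print("mean number of emissions:", np.mean(multiplicities))
```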

We have shown in simplified models that effects intractable with known classical algorithms can be simulated on quantum computers with only polynomial, rather than exponential, scaling in particle multiplicity. A comparison of the complexity of the quantum and classical algorithms revealed that the quantum algorithm outperforms the classical one for more than 10 fermions in the final state. While currently available quantum hardware is not yet able to run the full parton shower, we were able to run a simplified version demonstrating that the quantum algorithm clearly picks up the interference effects.
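
The flavor of such a comparison can be seen with assumed scalings: an exponential classical cost versus a polynomial quantum cost in the number of final-state fermions n. The 2^n and n^3 scalings below are illustrative placeholders only, not the actual gate counts from the analysis quoted above.

```python
# Illustrative cost scalings (placeholders): classical ~ 2^n, quantum ~ n^3.
def classical_cost(n: int) -> float:
    return 2.0 ** n

def quantum_cost(n: int) -> float:
    return float(n) ** 3

# Smallest multiplicity (beyond the trivial n=1 case) where the polynomial
# cost drops below the exponential one under these toy scalings.
crossover = next(n for n in range(2, 64) if quantum_cost(n) < classical_cost(n))
print("quantum wins (under these toy scalings) for n >=", crossover)
for n in (5, 10, 20):
    print(f"n = {n:2d}  classical ~ {classical_cost(n):.1e}  quantum ~ {quantum_cost(n):.1e}")
```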

Noise Mitigation Schemes

While we ultimately hope to have access to quantum hardware that uses quantum error correction to deliver fault-tolerant quantum computers, currently existing hardware belongs to the class of Noisy Intermediate Scale Quantum (NISQ) devices. Since the execution of any quantum algorithm on NISQ devices gives rise to errors, it is mandatory to develop mitigation schemes to extract reliable results from these devices. The LBNL group has been significantly involved in the development of such noise mitigation schemes. In particular, they developed zero-noise extrapolation schemes, as well as the now widely used technique of noise estimation circuits, which provide information on the errors generated during execution so that they can be corrected for.
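
As a minimal sketch of the zero-noise-extrapolation idea: an observable is measured at several artificially amplified noise levels (for example by folding gates G -> G G† G), fit as a function of the noise-scale factor, and extrapolated to zero noise. The "measured" values below are generated from an assumed exponential-decay noise model purely for illustration; none of the numbers are real hardware data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Noise-scale factors lambda: lambda = 1 is the bare circuit, larger values
# correspond to deliberately amplified noise (e.g. via gate folding).
lams = np.array([1.0, 2.0, 3.0, 5.0])

# Synthetic "measurements" from an assumed model <O>(lambda) = O_true * exp(-gamma*lambda),
# plus shot noise.  O_true and gamma are hypothetical values for illustration.
O_true, gamma = 0.80, 0.12
measured = O_true * np.exp(-gamma * lams) + rng.normal(0, 0.005, lams.size)

# Richardson-style extrapolation: fit a low-order polynomial in lambda and
# evaluate it at lambda = 0 to estimate the zero-noise value.
coeffs = np.polyfit(lams, measured, deg=2)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(f"raw (lambda=1): {measured[0]:.4f}   extrapolated: {zero_noise_estimate:.4f}")
```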

Publications

A complete list of the group's publications can be found in the Inspire HEP database.