When we think of the particle accelerators that elucidate the building blocks of nature, we think of spectacular and massive facilities like the 27-kilometer-circumference Large Hadron Collider (LHC), the proton-crashing instrument at CERN famous for the Higgs boson discovery.

But what if there were much smaller alternatives to the giant machines that could probe physics beyond the large colliders’ reach? There just may be such a possibility: laser-plasma accelerators, “a promising candidate to significantly reduce the cost and improve the compactness of beam generators,” says Jean-Luc Vay, a senior physicist at Lawrence Berkeley National Laboratory and head of the Accelerator Modeling Program in the lab’s Accelerator Technology and Applied Physics Division.

Envisioned since the late 1970s, laser-based devices that boost charged particles to near-light speed could be tens to hundreds of times smaller than current machines. With these small, affordable particle accelerators, scientists could conduct experiments on subatomic particles by accelerating particle beams over much shorter distances than today's machines require. In addition, by generating extremely intense bursts of light – with remarkable physical systems called plasma mirrors – researchers could explore quantum electrodynamic phenomena beyond the reach of even the biggest particle accelerators like the LHC, phenomena thought to occur near neutron stars, black-hole horizons and other violent astrophysical events.

To help develop these futuristic devices, teams led by Vay and Henri Vincenti, a research scientist at France’s Commissariat à l’Energie Atomique (CEA), Université Paris-Saclay, have developed WarpX, an open-source code that simulates plasmas produced when high-powered lasers fire into solids or gases. “In this context,” Vincenti says, “our work consists in using massively parallel simulations to find realistic experimental schemes based on high-intensity laser-plasma interactions.” These designs could help generate relativistic particle beams – fermions, hadrons – with properties at least as good as those from conventional accelerators.

The team has developed WarpX through the Department of Energy (DOE) Exascale Computing Project (ECP). The ECP has paved the way for the next generation of high-performance computers, which will be able to conduct a million trillion operations per second. The name WarpX is a combination of “X” for exascale and “warp,” a previous-version code name derived from “warp speed,” the science-fiction concept of faster-than-light travel.

Vay, Vincenti and their colleagues have received a renewal allocation of computing time through DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. The award includes 110,000 node-hours on Summit, Oak Ridge National Laboratory’s IBM AC922 machine, and 600,000 node-hours on Theta, a Cray XC40 machine at Argonne National Laboratory. Summit is the United States’ most powerful high-performance computer, running primarily on graphics processing units (GPUs), and Theta is one of the biggest machines driven by central processing units (CPUs). “Our code was first targeted at CPUs and ported to GPUs recently,” Vay says. “As we are moving to using GPUs routinely for production, Theta provides an excellent platform during the transition.”


The INCITE award lets the team explore an effect that can occur when a high-powered laser hits a silica plate. Under the right conditions, silicon ions and electrons are torn from the plate’s surface to form a dense plasma that behaves like a parabolic mirror – a relativistic plasma mirror – that can launch charged particles at velocities approaching the speed of light. The team will simulate the use of such a mirror to inject electron bunches into high-powered laser beams capable of further accelerating the electrons over distances tens to thousands of times shorter than those conventional accelerators require. Vay says compact electron accelerators could lead to applications in high-energy physics, particle therapy, sterilization of equipment and liquids, materials science, chemistry, biology and pharmacology.

The team also is exploring using relativistic plasma mirrors to produce extreme light intensities. WarpX simulations already have shown that these mirrors could focus light to intensities more than a thousand times greater than today’s most powerful lasers. In optimal laser-plasma conditions, such devices could potentially achieve intensities close to the so-called Schwinger limit, where the quantum vacuum becomes unstable, leading to copious production of electron-positron pairs. Scientists are eager to study such phenomena further because they occur in mysterious, violent astrophysical events such as gamma-ray bursts.

The team faces several challenges in developing WarpX. As a particle-in-cell code, it describes plasma as a mesh of cells, each populated with data – such as temperature, velocity and charge – describing the plasma inside. The code also must capture a staggering range of time and space scales. Vital events may take a few quadrillionths of a second while others, in the same simulation, unfold over microseconds. With some events lasting one-billionth as long as others, the challenge is equivalent to tracking every second of a 31-year-old person’s life. Length scales can span a similar factor of a billion: resolving micron laser wavelengths in a kilometer-long accelerator is akin to picking out a single millimeter along the border between California and Nevada.
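The particle-in-cell idea described above can be illustrated with a toy sketch. The function below is a hypothetical 1-D charge deposition, not WarpX’s actual C++ interface: each simulated particle spreads its charge onto the two nearest grid nodes with linear weighting, which is how a particle-in-cell code connects discrete particles to the field mesh.

```python
# Minimal sketch of particle-in-cell (PIC) charge deposition.
# Names and the 1-D setup are illustrative assumptions, not WarpX's API.

def deposit_charge(positions, charge, n_cells, dx):
    """Deposit point charges onto a 1-D grid of nodes with linear
    (cloud-in-cell) weighting, returning charge density at each node."""
    rho = [0.0] * (n_cells + 1)      # charge density at grid nodes
    for x in positions:
        i = int(x / dx)              # index of the cell containing the particle
        frac = x / dx - i            # fractional position within that cell
        rho[i] += charge * (1.0 - frac) / dx
        rho[i + 1] += charge * frac / dx
    return rho

# Two particles on a four-cell grid of spacing 1.0: each particle's charge
# is split between its two neighboring nodes in proportion to its position.
rho = deposit_charge([0.5, 2.25], charge=1.0, n_cells=4, dx=1.0)
```

A real PIC step would follow this deposition by solving Maxwell’s equations on the mesh and interpolating the resulting fields back to push the particles.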

This extreme scope makes simulations especially challenging. The team is pioneering ways to apply Maxwell’s equations, which govern how electromagnetic waves (such as lasers) interact with matter, and Lorentz transformations, which shift simulated objects (like ions or electrons) traveling near the speed of light from one frame of reference – and one computational grid – to another.
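The payoff of such Lorentz transformations can be sketched with a little arithmetic. In a boosted-frame approach – an assumption here, since the article doesn’t detail the method – the plasma column contracts by the boost factor gamma while the counter-propagating laser wavelength stretches, so the span of scales the simulation must resolve shrinks by roughly 2 times gamma squared:

```python
import math

def scale_ratio_reduction(gamma):
    """Rough factor by which the range of scales shrinks in a frame boosted
    along the beam with Lorentz factor gamma: the plasma length contracts
    by gamma while the laser wavelength stretches by gamma * (1 + beta),
    giving a combined reduction of gamma**2 * (1 + beta) ~ 2 * gamma**2."""
    beta = math.sqrt(1.0 - 1.0 / gamma**2)  # speed as a fraction of c
    return gamma * gamma * (1.0 + beta)

# A modest boost of gamma = 10 already shrinks the scale disparity
# by a factor of roughly 200.
reduction = scale_ratio_reduction(10.0)
```

The exact optimal boost depends on the physics being modeled; the point of the sketch is only that a moderate gamma can collapse a billion-fold scale gap into something far more tractable.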

The WarpX team’s greatest challenge is to implement these advanced features while developing an ability to focus on simulation events at virtually any level of detail. The goal is to use WarpX to zoom in on the small events and short time scales through adaptive mesh refinement, which works out details in one part of a simulation without crunching the numbers for the full simulation. Here the team also draws on DOE’s AMReX code library, developed through the ECP at Berkeley Lab, Argonne and the National Renewable Energy Laboratory in Golden, Colorado.
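The mesh-refinement idea can be pictured with a toy example. The sketch below is a hypothetical 1-D illustration, not AMReX’s real interface (which is C++): a coarse grid covers the whole domain, and a finer patch overlays only the region that needs more detail, so the expensive resolution is paid for just where the action is.

```python
# Illustrative sketch of adaptive mesh refinement (AMR).
# Function and parameter names are made up for illustration.

def make_levels(domain, coarse_dx, refine_region, ratio=4):
    """Return coarse-grid points for the whole domain plus a refined
    patch, with `ratio` times finer spacing, covering refine_region."""
    x0, x1 = domain
    r0, r1 = refine_region
    coarse = [x0 + i * coarse_dx for i in range(int((x1 - x0) / coarse_dx) + 1)]
    fine_dx = coarse_dx / ratio
    fine = [r0 + i * fine_dx for i in range(int((r1 - r0) / fine_dx) + 1)]
    return coarse, fine

# Coarse grid over [0, 10] with spacing 1.0; a 4x-finer patch over [4, 6],
# say, where a laser pulse meets the plasma.
coarse, fine = make_levels((0.0, 10.0), 1.0, (4.0, 6.0), ratio=4)
```

A production AMR library also handles interpolation between levels and decides dynamically where to refine; this sketch shows only the layered-grid structure.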

“Mesh refinement is particularly difficult to implement in an electromagnetic particle-in-cell code,” Vay says. “There are many numerical issues that need to be addressed. We have studied, and are continuing to study, the various numerical issues at hand independently, and we have developed mitigation strategies. Tests are scheduled to assess the overall performance once all the pieces are in place.”

Bill Cannon
