In scientific simulations, one sure thing is uncertainty. But a researcher at the Department of Energy’s (DOE) Sandia National Laboratories in Albuquerque is studying the nagging nuances of uncertainty to sharpen the predictive capabilities of high-performance computer models.
As a recipient of a DOE Office of Science Early Career Research Program award, Tim Wildey will receive $500,000 a year for five years to improve data-informed, multiphysics and multiscale simulations. He and his Sandia team mainly use the Sky Bridge supercomputer but also expect to enlist Trinity at Los Alamos National Laboratory and two Office of Science user facilities: Titan at the Oak Ridge Leadership Computing Facility and Cori at the National Energy Research Scientific Computing Center.
Uncertainty quantification is a broad science and engineering concept that encompasses a major issue in scientific computing. “There are some kinds of uncertainty that are completely random and can’t be reduced, and there are other types that could be reduced if we actually had some additional knowledge or additional data,” Wildey says.
“From the time researchers began building models decades ago, it became clear that if we put computational models and simulation alongside experiment and theory, then we need to understand and characterize uncertainty.”
Numerically based uncertainties often come from the assumptions researchers make in building models, Wildey says. Researchers often don’t know the precise model conditions, so they make assumptions and let the models produce predictions from there.
Take, for example, hurricane forecasting. Storm models on TV don’t show a single track as they did decades ago but instead an ensemble of potential tracks, thanks to ever-more-sophisticated computations. As the simulations run, initial conditions such as wind and precipitation patterns are perturbed to create multiple paths for where the hurricane might hit and at what force. Taken together, the ensemble of tracks paints a clear picture of the forecast’s uncertainty.
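What that looks like computationally can be sketched in a few lines of Python. The toy storm-track model below is purely illustrative (it is not any real forecast code, and every parameter is an assumption): it perturbs an uncertain initial position and steering wind, runs many simulations, and reads the uncertainty off the spread of outcomes.

```python
import numpy as np

rng = np.random.default_rng(0)

def storm_track(initial_position, steering_wind, n_steps=48, dt=1.0):
    """Advance a toy storm position under a steady steering wind plus small random forcing."""
    track = [np.asarray(initial_position, dtype=float)]
    for _ in range(n_steps):
        drift = steering_wind + rng.normal(scale=0.05, size=2)
        track.append(track[-1] + dt * drift)
    return np.array(track)

# Build an ensemble by perturbing the uncertain initial conditions.
base_position = np.array([0.0, 0.0])
base_wind = np.array([0.3, 0.1])
ensemble = [
    storm_track(base_position + rng.normal(scale=0.2, size=2),
                base_wind + rng.normal(scale=0.02, size=2))
    for _ in range(100)
]

# The spread of final positions is a simple picture of forecast uncertainty.
finals = np.array([track[-1] for track in ensemble])
print("mean final position:", finals.mean(axis=0))
print("spread (std. dev.): ", finals.std(axis=0))
```

Real forecast ensembles perturb far more variables and run full atmospheric models on supercomputers, but the principle is the same: run many perturbed simulations and read the uncertainty off the spread.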
Wildey’s models are often multiphysical, characterized by two or more phenomena that interact over space and time. A wind turbine, for instance, can be modeled both to capture its mechanical behavior and to predict the wind forces acting on it.
In the future, Wildey’s research could be applied to materials science, subsurface flow and mechanics, and magnetohydrodynamics.
For now, he focuses on materials science applications that will employ three-dimensional printing, an additive, bottom-up process that builds an object from thousands of individual pieces that are eventually fused into one whole.
“The goal in combining 3-D printing with modeling and simulation is to understand the 3-D printing process and the effect that this manufacturing has on both fine-scale microstructure and bulk-scale performance,” Wildey says.
The research also focuses on modeling and simulating potential fusion energy devices similar to a tokamak, which uses a potent magnetic field to hold plasma in a torus (or donut) shape.
The tokamak model “has a range of spatial scales that presents a really complicated plasma physics problem,” Wildey says. Additive process modeling, meanwhile, “presents an entirely different spatial landscape.” At the microscale, the 3-D printing technique creates variable behavior in the materials that leads to particular macroscale behavior. “Being able to predict and control that is tremendously important.”
‘Discretizing one simple equation yields millions if not billions of coupled discrete equations.’
Even supercomputers are incapable of precisely solving the partial differential equations that model these physical problems, so Wildey and others apply discretization methods to gain insight into the physical system. A common discretization tool is finite element analysis, which approximates the solution to the differential equation by finding the best fit within a finite-dimensional subspace.
“Discretizing one simple equation yields millions if not billions of coupled discrete equations, which is why we need high-performance computing to solve these problems,” Wildey says.
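A minimal sketch, assuming a deliberately simple one-dimensional Poisson problem and linear finite elements (nothing like the multiphysics systems Wildey works with), shows how that happens: even a toy equation becomes a coupled linear system whose size grows with every mesh refinement.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

def solve_poisson_1d(n_elements):
    """Linear finite elements for -u'' = 1 on (0, 1) with u(0) = u(1) = 0."""
    h = 1.0 / n_elements
    n = n_elements - 1                      # number of interior unknowns
    # Stiffness matrix: tridiagonal, 2/h on the diagonal and -1/h off it.
    K = diags([-1.0 / h, 2.0 / h, -1.0 / h], offsets=[-1, 0, 1],
              shape=(n, n), format="csc")
    b = h * np.ones(n)                      # load vector for f = 1
    x = np.linspace(h, 1.0 - h, n)
    return x, spsolve(K, b)

# Refining the mesh multiplies the number of coupled discrete equations.
for n_elements in (10, 1_000, 100_000):
    x, u = solve_poisson_1d(n_elements)
    print(f"{n_elements:>7} elements -> {n_elements - 1:>7} coupled equations, "
          f"max u = {u.max():.4f}")
```

In three dimensions, with several interacting physical fields and realistic resolution, the same bookkeeping quickly reaches the millions or billions of unknowns Wildey describes.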
To evaluate models’ accuracy, Wildey relies on a posteriori error estimation, which uses the computed numerical approximation to gauge how far it is from the true solution.
“The discretizations have error, and these a posteriori error estimates seek to use the computed numerical approximations to try to assess the accuracy either of the solution itself, or of some quantity of interest that is computed from the solution,” he says.
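One simple flavor of that idea can be sketched as follows. The two-level comparison below is much cruder than the estimators used in research codes, and the model problem (a finite-difference solve with a known exact solution, so the estimate can be checked) is an illustrative assumption.

```python
import numpy as np

def solve_fd(n):
    """Second-order finite differences for -u'' = pi^2 sin(pi x), u(0) = u(1) = 0."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    main = 2.0 * np.ones(n - 1) / h**2
    off = -np.ones(n - 2) / h**2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, np.pi**2 * np.sin(np.pi * x[1:-1]))
    return x, u

def quantity_of_interest(u):
    """A quantity computed from the solution: its value at the midpoint x = 0.5."""
    return float(u[len(u) // 2])   # the grids used here always place a node at x = 0.5

# Estimate the coarse-grid error by comparing against a finer computed solution.
x_c, u_c = solve_fd(20)
x_f, u_f = solve_fd(40)
q_c, q_f = quantity_of_interest(u_c), quantity_of_interest(u_f)
estimated_error = q_f - q_c        # computable without knowing the true solution
true_error = 1.0 - q_c             # exact solution is sin(pi x), so the true midpoint value is 1
print(f"estimated error: {estimated_error:.2e}   true error: {true_error:.2e}")
```

The two-level difference recovers most of the true error while using only computed approximations, which is the essence of the approach; production estimators target specific quantities of interest far more carefully.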
Another uncertainty quantification approach pairs a standard Monte Carlo method, in which random variables predict various outcomes’ odds, with a surrogate that stands in for the expensive model. The surrogate, built from a smaller ensemble of simulations of the full model, serves as a kind of emulator for addressing uncertainty in the whole model.
“Monte Carlo methods are very easy to implement, easy to parallelize and are very robust,” Wildey says. Many of the problems have a structure that can be exploited to build a surrogate model. The surrogate is cheap to evaluate, but it must be cross-validated to assess its accuracy. “The question then becomes: Is performing uncertainty quantification on the surrogate as accurate as performing it on the high-fidelity model? That’s a key part of our research in this early stage of the award.”
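The workflow can be sketched roughly as follows, with a made-up one-parameter “expensive” model standing in for a real simulation (every function and number here is an illustrative assumption): fit a cheap polynomial surrogate to a small ensemble of full-model runs, cross-validate it on held-out points, then hand the large Monte Carlo sampling to the surrogate.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(k):
    """Stand-in for a costly simulation: response to an uncertain input k."""
    return np.exp(-k) + 0.5 * k**2          # pretend each evaluation takes hours

# 1. Run a small ensemble of the real model and fit a cheap polynomial surrogate.
k_train = rng.uniform(0.0, 2.0, size=30)
surrogate = np.polynomial.Polynomial.fit(k_train, expensive_model(k_train), deg=4)

# 2. Cross-validate: check the surrogate against runs it has not seen.
#    (In practice these would be held-out members of the same small ensemble.)
k_test = rng.uniform(0.0, 2.0, size=10)
cv_error = np.max(np.abs(surrogate(k_test) - expensive_model(k_test)))
print(f"max held-out surrogate error: {cv_error:.2e}")

# 3. Monte Carlo on the surrogate: a million samples cost almost nothing.
k_samples = rng.uniform(0.0, 2.0, size=1_000_000)
predictions = surrogate(k_samples)
print(f"predicted mean = {predictions.mean():.4f}, std = {predictions.std():.4f}")
```

Whether the statistics computed this way match those of the high-fidelity model is exactly the accuracy question Wildey raises; the cross-validation step is the cheap first check.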
Even as the team works out “how discretization errors affect what we’re trying to predict,” Wildey says, “we’re trying to assess how surrogate approximation errors affect what we’re trying to predict.”
Over the next five years, Wildey expects to make significant contributions to the DOE science mission by showing “we can effectively use extreme-scale computational resources and build data-informed computational models that incorporate information from multiple spatial and temporal scales,” he says. “Then we will use these models to make credible predictions beyond the scope of current computational power.”
Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration.
Now in its eighth year, the DOE Office of Science’s Early Career Research Program for researchers in universities and DOE national laboratories supports the development of individual research programs of outstanding scientists early in their careers and stimulates research careers in the disciplines supported by the Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.