Plotting a course for applied math
With the Department of Energy facing a range of scientific and technical challenges in the years ahead, there’s a greater need than ever for computer simulations to illuminate and analyze complex systems. A panel of leading applied mathematics researchers assembled in Berkeley, Calif., during the summer of 2007 to assess the applied mathematics advances necessary to develop these simulations and answer the mission-critical questions DOE faces.
The panel’s 2008 report, “Applied Mathematics at the U.S. Department of Energy: Past, Present and a View to the Future,” identified three major areas in which research is needed:
Predictive modeling and simulation of complex systems. Researchers must make modeling and simulation methods more accurate, predictable and sophisticated. That includes developing mathematical tools for multiscale modeling, which spans many length and time scales, and for multiphysics modeling, which unites different kinds of physical processes. New and improved methods also are needed to incorporate data from observations or experiments into the scientific discovery process.
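As an illustration of that last point, the sketch below runs a scalar Kalman filter, one of the simplest ways to fold a stream of noisy observations into a running model forecast. The dynamics, noise levels and synthetic "truth" are invented for this toy and come from neither the report nor any DOE code.

```python
import numpy as np

# A scalar Kalman filter: the simplest example of folding noisy
# observations into a running model forecast (data assimilation).
# The dynamics, noise levels and "truth" are invented for illustration.
rng = np.random.default_rng(4)

a, q, r = 0.95, 0.1, 0.5           # dynamics; model and observation noise variances
x_true, x_est, p = 1.0, 0.0, 1.0   # truth, estimate, estimate variance

for step in range(20):
    # Synthetic truth and a noisy observation of it.
    x_true = a * x_true + rng.normal(0, np.sqrt(q))
    obs = x_true + rng.normal(0, np.sqrt(r))

    # Forecast with the model, then correct toward the observation.
    x_est, p = a * x_est, a * a * p + q
    gain = p / (p + r)             # how much to trust the new data
    x_est += gain * (obs - x_est)
    p *= (1 - gain)

print(f"truth {x_true:+.3f}  estimate {x_est:+.3f}")
```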
“If you look at the roadmap of large DOE missions over the next decade, there are two things they are interested in getting out of large-scale computational simulations,” says David L. Brown of Lawrence Livermore National Laboratory, the panel’s chairman. “One is getting more fidelity, which means putting in more physics.” Climate researchers, for instance, want models that more precisely incorporate the effects of clouds.
The other goal is more resolution – a finer-grained picture of the process or space modeled. In the climate model, for example, higher resolution would let researchers focus on how climate change could affect a locality. That requires multiscale modeling to understand how fast or small processes influence long-term or large processes and vice versa.
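The resolution question can be seen even in a one-dimensional toy. The sketch below, which is illustrative only and vastly simpler than any climate code, solves the heat equation on a coarse grid and a fine one; the fine grid captures a narrow hot spot that the coarse grid cannot represent.

```python
import numpy as np

# Resolution in one dimension: solve the heat equation u_t = u_xx
# with explicit finite differences on coarse and fine grids. The
# fine grid resolves a narrow initial hot spot that the coarse grid
# smears out -- a toy analogue of regional detail in a climate model.
def diffuse(nx, t_final=0.01):
    dx = 1.0 / (nx - 1)
    dt = 0.4 * dx**2                       # respects the explicit stability limit
    x = np.linspace(0.0, 1.0, nx)
    u = np.exp(-((x - 0.5) / 0.02)**2)     # narrow Gaussian "hot spot"
    for _ in range(int(t_final / dt)):
        u[1:-1] += dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return u

for nx in (21, 201):                       # coarse vs. fine grid
    u = diffuse(nx)
    print(f"{nx:>4} points: peak temperature = {u.max():.3f}")
```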
Techniques also are needed to model large stochastic systems, such as those found in fusion reactions and biological processes; to quantify the uncertainties inherent in such models; and to estimate the probability of rare events.
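Rare events are a good example of why specialized techniques matter. In the sketch below, a plain Monte Carlo estimate of a small tail probability fails for lack of samples in the tail, while importance sampling, which draws from a shifted distribution and reweights each sample, recovers it. The Gaussian model is a stand-in chosen purely for illustration.

```python
import numpy as np

# Toy rare event: P(Z > 4) for a standard normal Z (about 3.2e-5).
# This Gaussian stand-in is illustrative, not a DOE model.
rng = np.random.default_rng(0)
threshold, n = 4.0, 100_000

# Naive Monte Carlo: almost no samples land in the tail,
# so the estimate is noisy or exactly zero.
z = rng.standard_normal(n)
p_naive = np.mean(z > threshold)

# Importance sampling: draw from a normal shifted to the threshold,
# then reweight by the density ratio (likelihood ratio).
y = rng.normal(loc=threshold, scale=1.0, size=n)
weights = np.exp(-threshold * y + threshold**2 / 2)  # phi(y) / phi(y - threshold)
p_is = np.mean((y > threshold) * weights)

print(f"naive MC:            {p_naive:.2e}")
print(f"importance sampling: {p_is:.2e}")
```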
Analyzing the behavior of complex systems. Merely being able to model and simulate complex systems isn’t enough, the panel said. Researchers also must develop mathematical approaches for understanding their behavior. Answering questions such as how well a fusion reactor can contain plasma, whether it is safe to sequester carbon dioxide underground, or how fast and by how much the climate is changing requires good mathematical analysis tools.
In fact, simulations alone may not be enough to answer such questions; scientists also need to sift through vast amounts of observational or experimental data to discover essential elements. Accomplishing that goal means developing strategies for using large computers to help collect, organize and analyze potentially huge amounts of scattered, heterogeneous data.
To make such models more useful, the panel said, researchers must improve ways to analyze sensitivity to input changes for complex systems involving many parameters, large data spaces and wide ranges of model behavior. New, efficient uncertainty quantification (UQ) methods also are needed to evaluate inaccuracies that accumulate through numerical error, imprecise data, incomplete understanding of the underlying processes and other sources.
“It’s the question of understanding the range of validity and levels of confidence … for predictions,” Brown says. If policy-makers are to use models to develop policies costing billions of dollars and affecting millions of people, they should know how much confidence to place in them.
UQ research is especially important when modeling complex systems, the panel said, because there are more factors contributing to uncertainty.
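A minimal picture of what forward UQ involves: sample the uncertain inputs, push each sample through the model and summarize the spread of the output. The exponential-decay model and the input distributions below are assumptions for illustration, not any particular DOE simulation.

```python
import numpy as np

# Forward UQ by Monte Carlo: push uncertain inputs through a model
# and summarize the output spread. The model is a stand-in
# (exponential decay), not any specific DOE simulation.
rng = np.random.default_rng(1)
n = 50_000

def model(decay_rate, initial_value, t=2.0):
    return initial_value * np.exp(-decay_rate * t)

# Assumed input uncertainties, for illustration only.
decay_rate = rng.normal(0.5, 0.05, n)      # roughly +/- 10% on the rate
initial_value = rng.normal(100.0, 5.0, n)  # roughly +/- 5% on the state

output = model(decay_rate, initial_value)
print(f"output mean  : {output.mean():.2f}")
print(f"output stddev: {output.std():.2f}")

# A crude sensitivity measure: correlation of each input with the
# output shows which uncertainty dominates the spread.
for name, x in [("decay_rate", decay_rate), ("initial_value", initial_value)]:
    r = np.corrcoef(x, output)[0, 1]
    print(f"sensitivity to {name}: r = {r:+.2f}")
```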
Using complex systems to inform policy-making. Improved algorithms and powerful computers are making truly predictive computer models a reality. But policy-makers need tools to understand these models’ effectiveness before they can base decisions on them. Research is needed in three areas.
First, risk analysis to evaluate the chances of negative consequences. Risk analysis research should focus on integrating many information sources, on addressing the needs of multiple stakeholders and on incorporating uncertainty and change as knowledge about systems evolves. It also must forge multidisciplinary teams and give them the means to communicate. Model calibration and validation must improve, and researchers must find ways to systematically incorporate uncertainty and risk as models are developed.
Second, improved optimization methods for determining and characterizing the “best” solutions. Researchers need more efficient, broadly applicable methods to solve a multitude of complex problems, but also must develop specialized methods that exploit the known structural properties of certain critical problems. Problems with more than one optimization objective (such as an energy solution that is both inexpensive and nonpolluting) or with a “multi-level” character (where different constraints are imposed at different levels) also can be important but are difficult to solve and require further research. Research should find new and improved algorithms for stochastic optimization (the first sketch below gives a toy example) and must develop optimization methods and algorithms that match the memory hierarchies and architectures of current and future parallel computers.
Third, research into the vexing nature of inverse problems, such as deciphering the underground movements of contaminants, locating oil deposits and understanding astrophysical phenomena – all situations where direct observation is impossible, so scientific truth must be inferred. More research is needed into how models can capture the ways small perturbations in observations generate big changes in parameter estimates (the second sketch below shows this instability in miniature). Since solving inverse problems also can depend on understanding the physical nature of a phenomenon, the report advocated greater collaboration between mathematicians and application experts. The panel also recommended research into analytical tools for inverse problems, including approaches to understand the connection between a problem’s mathematical structure and uncertainty in the outcome, and into ways to reduce uncertainty or problem dimensions.
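The first sketch, referenced above, shows stochastic optimization in its simplest form: stochastic gradient descent with a decaying step size, minimizing an objective that can only be evaluated through noisy samples. The quadratic objective and noise model are invented for the example.

```python
import numpy as np

# Stochastic gradient descent on an objective we can only evaluate
# through noisy samples -- a toy stand-in for optimizing a design
# whose performance is judged by a stochastic simulation.
rng = np.random.default_rng(2)

def noisy_grad(x):
    # True objective: f(x) = (x - 3)^2, minimized at x = 3.
    # Each gradient evaluation is corrupted by simulation noise.
    return 2.0 * (x - 3.0) + rng.normal(0.0, 1.0)

x = 0.0
for k in range(1, 5001):
    step = 0.5 / k          # decaying step size averages out the noise
    x -= step * noisy_grad(x)

print(f"estimated minimizer: {x:.3f}  (true value: 3.0)")
```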
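The second sketch illustrates the perturbation sensitivity of inverse problems on a two-parameter linear toy: an ill-conditioned forward operator lets tiny observation noise swing the naive parameter estimates wildly, while Tikhonov regularization, a standard stabilization, trades a little bias for a usable answer. The operator and “true” parameters are fabricated for the example.

```python
import numpy as np

# Toy linear inverse problem: recover parameters m from data d = G m.
# The nearly collinear columns of G make the problem ill-conditioned,
# so tiny observation noise produces wild swings in the estimate --
# the perturbation sensitivity described above. G and m_true are
# fabricated for illustration.
rng = np.random.default_rng(3)

G = np.array([[1.00, 0.99],
              [0.99, 0.98]])              # nearly singular forward operator
m_true = np.array([1.0, 1.0])
d = G @ m_true + rng.normal(0, 1e-3, 2)   # slightly noisy observations

# Naive least squares amplifies the noise enormously.
m_naive = np.linalg.lstsq(G, d, rcond=None)[0]

# Tikhonov regularization trades a little bias for stability:
# minimize ||G m - d||^2 + alpha * ||m||^2.
alpha = 1e-4
m_reg = np.linalg.solve(G.T @ G + alpha * np.eye(2), G.T @ d)

print("true parameters     :", m_true)
print("naive least squares :", m_naive)
print("regularized estimate:", m_reg)
```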