To explore how climate will change decades to centuries from now, scientists need many tools. Chief among them: computation.

“Over the years, two things have happened,” says Warren Washington, senior scientist at the National Center for Atmospheric Research (NCAR). One is simply improving the models’ resolution – the detail and precision. “The other is that the physical processes in the model – for example, the clouds and the land surface and vegetation and aerosols (particles or liquid droplets suspended in the air) – have increased the complexity of the modeling.”

Yet even with these advances in technology, climate researchers still must “make compromises on the resolution and how much detail to include in our models based upon our computing resources,” Washington says.

Exascale computers – about 1,000 times faster than today’s most powerful machines – will mean fewer compromises when they come online later in this decade. But to move climate models to exascale, Washington says researchers “need to change our algorithms and the ways of solving the equations so that the codes are specifically designed for using large numbers of processors.” Those tasks demand hard work and creative thinking.
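
As a rough illustration of what designing codes “for using large numbers of processors” can involve, here is a minimal, hypothetical sketch of one common pattern: split the grid among processors, do most of the arithmetic locally and exchange only a thin halo of boundary values with neighbors. It is a toy written with mpi4py, not code from any climate model, and all of its names and numbers are invented.

```python
# Toy 1-D domain decomposition with halo exchange (illustrative only).
# Run with an MPI launcher, e.g.: mpirun -n 8 python halo_demo.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 100                      # grid points owned by this processor
u = np.zeros(n_local + 2)          # two extra "ghost" cells hold neighbor data
u[1:-1] = rank                     # dummy initial condition

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(10):
    # Communication: swap only the boundary values with neighboring ranks.
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # Computation: a simple local diffusion-like update on this rank's points.
    u[1:-1] += 0.1 * (u[:-2] - 2.0 * u[1:-1] + u[2:])
```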

Computing power keeps bringing more capabilities to climate research, but scientists are still seeking better ways to use that power. For example, including finer-scale features in climate models depends not only on better computers but also on improved algorithms.

And including those small components is vital. “We just don’t have a clear understanding of how very fine-scale processes affect the large-scale climate,” says Ben Kirtman, professor of meteorology and physical oceanography at the University of Miami.

Take, for example, hurricanes. “When we do climate-change research, we use models that don’t produce hurricanes,” Kirtman says. “They produce stuff kind of like hurricanes but not really.” Part of the difference is that hurricanes transport heat upward, and today’s climate-change models “don’t simulate that heat motion correctly.”

Ocean features also could stand some improvement. At present, ocean models “don’t capture how eddies work because the models don’t resolve them,” Kirtman says. This shortcoming creates errors that take large numbers of repeated simulations to untangle.

With exascale computing, Kirtman says, climate-change models can start to resolve physical properties like ocean eddies – and potentially resolve hurricanes. He and colleagues are studying eddies and how they transport heat from the tropics to the United States. These eddies maintain the Gulf Stream. “Until we get exascale,” he says, “we need lots of years of simulations” to model how these eddies affect heat transfer.

Even with exascale power, climate-change models will not replace weather models for simulating individual hurricanes. Still, exascale computing could help researchers forecast an active hurricane period, explain the relationship between sea ice and hurricane intensity, or predict how carbon dioxide levels in 50 years could affect hurricane patterns.

James J. Hack, director of the National Center for Computational Sciences and the Oak Ridge Climate Change Science Institute, says researchers also need a better understanding of the entire climate system. A couple of degrees of increase in global average temperature by 2100 might not seem like much to most people, but it could trigger 10-15 degrees of change in some areas.

“That may move storm tracks,” Hack says. “For example, it could trigger large stationary wave patterns in the atmosphere that could set up and sit there for a long time, leading to heat waves.”

He hopes enhanced computing power and a better understanding of the overall climate system will provide more information on what happens at smaller spatial and shorter temporal scales. “We want to understand the way that the mean change is constructed.”

Researchers are building components of that mean change into new models, Washington says. “We’ve added a lot more complexity into our models – for example, the cloud interactions. We’re adding in ice-sheet modeling, and we’re already doing experiments on that. We’re adding in more complex and realistic land-surface processes, including vegetation. All of these introduced a lot of new physics and even biology into our climate models.”

So instead of climate models, Washington calls these earth-system models. With these models running on exascale computers, he envisions new capabilities, including tracking changes in the Gulf Stream over a century.

Some of these advances started years ago. For example, Mark Taylor, principal member of the technical staff at Sandia National Laboratories, started working on SEAM (Spectral Element Atmospheric Model) in 1994. An atmospheric model is split into physics and dynamics (how air behaves as a fluid), Taylor says. The physics part is easy to scale up, but it’s not so easy with the dynamics. Consequently, atmospheric models include a dynamical core, such as SEAM, which fixes the scalability problems, he says. In short, SEAM breaks the atmosphere into finite elements, each composed of a square and a vertical column. Much of the computation gets done within each element, and the elements need only inform each other occasionally. This way, the computation can be spread over a group of processors – with, say, 1-2 elements per processor – and still run fast. The SEAM code, Taylor says, evolved into NCAR’s High Order Method Modeling Environment (HOMME), now a dynamical core option in the Community Earth System Model (CESM).
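
A loose one-dimensional analogue of that element-local structure, written as a toy sketch (it is not SEAM or HOMME, and the element sizes, operator and values are invented), shows why the approach parallelizes well: most arithmetic stays inside an element, and neighbors only reconcile the values they share.

```python
import numpy as np

# Toy layout: each "element" owns a handful of nodes; neighboring elements
# share only their endpoint values (invented sizes, purely illustrative).
n_elements, nodes_per_elem = 8, 5
elements = [np.sin(np.linspace(e, e + 1, nodes_per_elem)) for e in range(n_elements)]

# Bulk of the work: a local operator applied independently inside each element,
# with no communication, so elements can be spread across processors,
# say one or two elements per processor.
local_op = 0.9 * np.eye(nodes_per_elem)
elements = [local_op @ u for u in elements]

# Occasional coupling: neighboring elements reconcile the single value they
# share at their common boundary (here, by averaging it).
for left, right in zip(elements[:-1], elements[1:]):
    shared = 0.5 * (left[-1] + right[0])
    left[-1], right[0] = shared, shared
```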

The CESM will add to a store of climate simulation data that is already expanding quickly. For instance, the Intergovernmental Panel on Climate Change (IPCC) report in 2005 was based on about 35 terabytes of data.

“That was a lot then but nothing now,” says Dean Williams, computer scientist in the Program for Climate Model Diagnosis and Intercomparison at Lawrence Livermore National Laboratory. The next report will analyze about 10 petabytes. In half a dozen years, Williams says, models will be producing exabytes of data.

When it comes to adding more physical features, though, climate scientists need more than data and increased computational power. Kirtman says current models don’t handle clouds, rainfall and other processes very well because we don’t comprehend them well. “We have lots of basic science understanding that needs to be done.”

To understand simulations, scientists often need to see the results in graphic form. Today’s visualization capabilities for climate modeling, Williams says, rank as “probably fair. Lots of things need to be done.”

One of the key goals is server-side analysis and visualization. “You don’t want to move terabytes of data to the desktop,” Williams says, “because that would saturate the network.” Still, terabytes of data must move between server sites, because the data might be scattered around the world. “To visualize a terabyte here and there, you need to get it all to one location first.”
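
Some back-of-the-envelope arithmetic shows why data movement dominates the discussion. The sketch below uses the 10 gigabit-per-second and 100 gigabit-per-second link speeds cited in the next paragraph and the roughly 10-petabyte archive mentioned above; the numbers are illustrative, not measurements.

```python
# Rough transfer-time arithmetic for moving climate data (illustrative only).
def transfer_hours(n_bytes, link_gbps):
    """Hours to move n_bytes over an ideal link of link_gbps gigabits/second."""
    return (n_bytes * 8) / (link_gbps * 1e9) / 3600

TB, PB = 1e12, 1e15  # bytes

print(f"1 TB at 10 Gbps:   {transfer_hours(TB, 10):.2f} hours")        # ~0.22 h
print(f"1 TB at 100 Gbps:  {transfer_hours(TB, 100):.3f} hours")       # ~0.022 h
print(f"10 PB at 100 Gbps: {transfer_hours(10 * PB, 100):.0f} hours")  # ~222 h, about 9 days
```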

Similarly, some of tomorrow’s advances in climate modeling will come from combining information across models. Someone might want to take a few dozen climate models, run them all on a common grid of computational points and average the results, Williams says. But doing so will require moving a lot of data. So for visualization, comparing and combining models, and other purposes, tomorrow’s systems must also include high-speed networks. A fast network today carries 10 gigabits per second (Gbps). DOE is deploying a 100 Gbps upgrade to its scientific network, known as ESnet, and the next generation should carry terabits.
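
As a concrete picture of that multi-model averaging, here is a small, hypothetical sketch: several toy “model” fields on different grids are interpolated to one common grid and then averaged. The grids, fields and crude nearest-neighbor regridding are all invented for illustration and stand in for the far more careful methods real intercomparison projects use.

```python
import numpy as np

def regrid_nearest(field, nlat, nlon):
    """Crude nearest-neighbor regrid of a 2-D lat-lon field to (nlat, nlon)."""
    lat_idx = np.linspace(0, field.shape[0] - 1, nlat).round().astype(int)
    lon_idx = np.linspace(0, field.shape[1] - 1, nlon).round().astype(int)
    return field[np.ix_(lat_idx, lon_idx)]

# Toy "model output": three surface-temperature-like fields on different grids.
rng = np.random.default_rng(0)
models = [15.0 + rng.standard_normal((n, 2 * n)) for n in (45, 60, 90)]

# Put everything on one common grid, then average across models.
common_grid = [regrid_nearest(f, 90, 180) for f in models]
ensemble_mean = np.mean(common_grid, axis=0)
print(ensemble_mean.shape)  # (90, 180)
```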

As the data grow more complex and get shared around the world, climate researchers must explore advanced ways to see the results. New visualization tools will also have an impact beyond the research community. “The end user of climate-change research is not just scientists or researchers, but also policymakers and students, even in grade school,” Williams says. “This is a big thing for all of us.”

To serve so many kinds of users, the visualization tools must be simpler.

Current models do a pretty good job of predicting “things that society doesn’t care about,” Kirtman says. For example, he believes current models provide accurate predictions of the global mean temperature in 20 or 50 years.

But perhaps the most important question is universal and practical: What will rising temperatures and a changing climate mean to me?

Bill Cannon
