
Sphere of influence

People want to know how climate will change in their backyards. But current climate predictions are based on global models, which are limited by their coarse resolution and by difficulties in capturing detailed climate processes. Improving model and process resolution and figuring out how to use global data to project regional effects will become crucial for predicting agricultural yields, water supplies and other climate-dependent factors.

As computing approaches the exascale – the capacity to do a million trillion calculations per second – it becomes feasible to model regional climate effects using historical data combined with new satellite and weather station information. But to glean realistic results, climate modelers will need a new, efficient approach that brings the key data into focus while minimizing less important components. In other words, climate modeling needs to view earth through a new lens.

Enter Todd Ringler and his colleagues in the Climate, Ocean and Sea Ice Modeling (COSIM) group at the Department of Energy’s Los Alamos National Laboratory. Ringler’s work combines atmospheric science, numerical modeling and more than a passing acquaintance with the work of the mathematician Georgy Voronoi. The Los Alamos group’s task is to find a mathematically simple way to partition the planet uniformly while conforming to its spherical shape.

For historical reasons, global climate modeling has traditionally divided the earth along latitude and longitude. That works fine for navigation but isn’t ideal if the goal is a uniform grid on which to overlay a climate model. “As we all know, longitude lines meet at the poles,” Ringler says. That irregularity can cause problems if a climate forecaster wants to treat each terrestrial unit identically.

The group’s solution to the grid problem employs a geometric algorithm first devised by Voronoi which, by virtue of the simplicity and beauty of its design, also provides an elegant way to focus resolution within chosen regions of a global climate model. The method starts with a set of generating points and defines each grid cell as all the space closer to one point than to any other. If the points are uniformly spaced, the resulting grid covers the domain uniformly. But the method also allows a subset of points to be placed closer together or farther apart, yielding a flexible grid that works well with the atmospheric physics models that define density and flow. The new multiscale model, called the Model for Prediction Across Scales (MPAS), will allow scientists to explore in detail the effect of global climate on regional climate, Ringler says.
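Voronoi’s rule is simple enough to sketch in a few lines of code. Below is a minimal Python illustration – not the MPAS implementation, and flat rather than spherical for brevity – in which each cell is defined by nearest-generator assignment, and crowding extra generator points into one region automatically produces smaller cells there.

```python
import numpy as np

# Toy Voronoi partition on a flat domain (MPAS works on the sphere,
# but the nearest-point rule is the same). A grid cell is the set of
# locations closer to one generator point than to any other.
rng = np.random.default_rng(42)

# Coarse generators everywhere, plus extra generators clustered in a
# region of interest -- denser points yield smaller cells there.
coarse = rng.uniform(0.0, 10.0, size=(40, 2))
fine = rng.uniform(4.0, 6.0, size=(60, 2))
generators = np.vstack([coarse, fine])

def voronoi_cell(location, generators):
    """Index of the generator whose Voronoi cell contains `location`,
    i.e. the nearest generator."""
    d2 = np.sum((generators - location) ** 2, axis=1)
    return int(np.argmin(d2))

print(voronoi_cell(np.array([5.0, 5.0]), generators))  # small cell, focus area
print(voronoi_cell(np.array([1.0, 9.0]), generators))  # large cell, elsewhere
```

Because every location maps to exactly one nearest generator, the cells tile the domain with no gaps or overlaps.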


Just as a photographer might change a camera’s depth of field to focus attention on a single object in the foreground, leaving the background recognizable but blurry, MPAS allows a climate scientist to focus on regional climate while maintaining the global picture at lower resolution. Scientists who previously wanted to use global climate data to predict regional climate variation have been restricted to a uniform resolution that can’t fully account for peaks, valleys and other local geography. It’s like giving a photographer a fixed-focal-length camera: He or she can get a snapshot of an area of interest, but the detail might not be ideal.

“This allows climate scientists to place the resolution where they want it,” Ringler says. “It allows locally enhanced resolution that would not otherwise be available given computing resources.”

What’s more, he says, the MPAS approach applies to all components of the climate system. It’s possible to combine global data from the atmosphere, ocean currents, ice and landforms on this single mesh framework. And by focusing computing resources on, say, the North Atlantic Ocean, resolution in that region goes from about a 100 kilometer (km) mesh to a 10 km mesh, small enough to pick up subtleties in ocean circulation, such as the loop current in the Gulf of Mexico.
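That regional focusing can be pictured as a resolution map over the globe. Here is a hedged Python sketch of such a map; the box boundaries, the 10 km and 100 km targets and the blending width are invented for illustration, and the real MPAS mesh generator defines an analogous density function on the sphere.

```python
import numpy as np

# Hypothetical resolution map in the spirit of MPAS variable-resolution
# meshes: ~10 km target cells inside a crude North Atlantic box,
# blending smoothly to ~100 km elsewhere. All numbers are illustrative.
FINE_KM, COARSE_KM = 10.0, 100.0
BLEND_DEG = 10.0  # width of the fine-to-coarse transition, in degrees

def target_cell_width_km(lat_deg, lon_deg):
    # Degrees outside the focus box (lat 20-60 N, lon 80 W-0); 0 inside.
    dlat = max(0.0, 20.0 - lat_deg, lat_deg - 60.0)
    dlon = max(0.0, -80.0 - lon_deg, lon_deg - 0.0)
    d = np.hypot(dlat, dlon)
    # Ramp linearly from fine to coarse across the blend zone.
    w = min(1.0, d / BLEND_DEG)
    return FINE_KM + w * (COARSE_KM - FINE_KM)

print(target_cell_width_km(45.0, -40.0))   # inside the box: 10 km
print(target_cell_width_km(45.0, -120.0))  # far outside: 100 km
```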

Even today’s most powerful computers can’t effectively simulate, for example, the effect of ocean processes on large ice sheets and massive ice shelves. The scale needed to resolve ocean-ice interactions is just too fine for current global-scale modeling capabilities, so scientists working on the problem rely on local and regional climate systems to supply data for their models. Resolving those interactions well enough to predict events 50 or 100 years from now remains beyond today’s computing capacity.

Because the Los Alamos approach allows each grid component the flexibility to change its size depending on where the research is focused, computational scientists on the MPAS team have had to start from scratch, writing code to accommodate that flexibility. Currently, the New Mexico group is using the approach to develop the ocean model and ice sheet model. Another group, at the National Center for Atmospheric Research in Boulder, Colo., is developing the atmosphere model. In a couple of years, the MPAS team expects to conduct global coupled climate simulations using the MPAS atmosphere, ocean and land ice components with enhanced local resolution.

Recently, Ringler and his colleagues compared the MPAS version of their ocean model against observational data in a series of 20-year simulations. A simulation that used 15 km resolution only in the North Atlantic produced the same representation of Gulf Stream dynamics as a simulation that used 15 km resolution throughout the ocean domain.

“By using 15 km resolution in the North Atlantic and 80 km elsewhere, we found that the model agrees well with observations of the Gulf of Mexico loop current, Gulf Stream separation and ocean eddy activity in the North Atlantic,” he says.

One of the first sets of questions regional climate modelers would like to ask, Ringler says, is how to predict extreme precipitation events, which are sensitive to cloud cover. Resolving cumulus clouds requires a mesh resolution of 1 km to 10 km, he says. Using full global data to attack such a problem will require an adaptive mesh such as MPAS. But to run such a model, Ringler and colleagues must first optimize their system for petascale computing and the even faster exascale computing to come.

To date, most of MPAS’s development support has come from the DOE Office of Science’s Biological and Environmental Research (BER) program. “But right now I would say our biggest challenge, period, is making these models efficient on current and next-generation machines,” Ringler says. That capability will come from advances in projects supported by ASCR, the science office’s Advanced Scientific Computing Research program, and by the BER-ASCR SciDAC (Scientific Discovery through Advanced Computing) partnership, which supports development of the MPAS ocean and ice sheet components.

“The MPAS is a central theme in our exascale computing initiative,” Ringler says. A traditional model running on an inflexible, structured grid “means access to computer memory is uniform and regular. In the MPAS, the price we pay for a variable resolution is that we have to use an unstructured grid, meaning that access to memory is not regular and is not as predictable. This has profound implications for how we write the algorithms.”
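The difference Ringler describes can be made concrete with a small sketch – again illustrative Python rather than MPAS source code. On a structured grid, neighbors are found by index arithmetic; on an unstructured Voronoi mesh, each cell carries an explicit neighbor list, turning the same stencil into an indirect gather.

```python
import numpy as np

# Structured grid: neighbors sit at fixed index offsets, so memory
# access is regular and easy for hardware to prefetch.
def structured_neighbor_sum(field2d, i, j):
    return (field2d[i - 1, j] + field2d[i + 1, j] +
            field2d[i, j - 1] + field2d[i, j + 1])

# Unstructured mesh: each cell stores its neighbors' IDs explicitly
# (MPAS meshes carry a similar connectivity table), and the stencil
# becomes an indirect, less predictable gather.
def unstructured_neighbor_sum(field1d, cells_on_cell, cell):
    return field1d[cells_on_cell[cell]].sum()

field2d = np.arange(25.0).reshape(5, 5)
print(structured_neighbor_sum(field2d, 2, 2))                 # 48.0

field1d = field2d.ravel()
cells_on_cell = {12: np.array([7, 17, 11, 13])}               # same neighbors
print(unstructured_neighbor_sum(field1d, cells_on_cell, 12))  # 48.0
```

With fixed offsets the compiler and hardware can stream data predictably; with a neighbor table, each read may land anywhere in memory, which is exactly the irregularity that shapes how the algorithms are written.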

The big challenge for next-generation climate models will be developing what Ringler calls scale-aware physical algorithms. “MPAS allows us to directly simulate important climate processes, such as eddies in the ocean or clouds in the atmosphere, (but) we can only directly resolve those processes in regions of local mesh refinement. In other regions, those processes will need to be modeled indirectly. Figuring out how to transition from low-resolution regions to high-resolution regions will be a huge task.

“Right now we are asking the climate modelers to think about ways to use this new approach to global climate modeling. We want them to bring their creativity to devising new questions to answer.”

Bill Cannon
