Super shakeup

What scientists don’t know about huge earthquakes can make them tremble.

Despite impressive advances in earthquake monitoring, no one knows how much damage the proverbial Big One will cause when it hits close to a population center in Southern California. That’s because the last huge West Coast earthquakes were more than a century ago, before seismometers were common.

“The problem we face in earthquake science is that we actually don’t have recordings of very large earthquakes (from which) to predict the ground motion of the next one,” says Tom Jordan, director of the Southern California Earthquake Center.

The geological record suggests that huge earthquakes shook the southern part of the San Andreas Fault in 1713, 1614, 1565, 1462 and 1417. Because the intervals between them ranged from roughly 50 to 100 years, seismologists calculate that Southern California is overdue for an enormous shock.
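
A quick check of those dates makes the arithmetic concrete. The sketch below, in Python, uses only the years quoted above plus the 1857 rupture mentioned later in the article; the "present" year is an assumption for illustration:

```python
# Intervals between the large past earthquakes cited above (years from the
# geological record described in the article).
dates = [1417, 1462, 1565, 1614, 1713]

intervals = [later - earlier for earlier, later in zip(dates, dates[1:])]
print(intervals)        # [45, 103, 49, 99] -> roughly 50 to 100 years apart

# More than 150 years have passed since the 1857 rupture discussed below,
# longer than any of those gaps; 2011 is an assumed "present" for this sketch.
print(2011 - 1857)      # 154
```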

The San Andreas Fault is “locked and loaded,” says Jordan, who also is a geological sciences professor at the University of Southern California. It’s been more than 150 years since the last quake of magnitude 7.9 or greater on the fault, he says – the 1857 rupture that began near Parkfield.

Geological research along the San Andreas has expanded the record of past earthquakes along a southern section of the fault. Lidar (light detection and ranging) imaging, together with trenches dug deep into the Carrizo Plain about 100 miles north of Los Angeles, is revealing signs of earth movements that indicate where large earthquakes have happened. Carbon dating narrows down their approximate dates.

Jordan and his team have used millions of processor hours on the nation’s largest supercomputers to model the seismic waves large earthquakes generate. In early December, the team was granted 10 million processor hours on Intrepid, the IBM Blue Gene/P supercomputer at Argonne National Laboratory, through the Department of Energy’s Innovative and Novel Computational Impact on Theory and Experiment program.

The modelers aim to forecast big quakes’ ground motion and answer the practical question of whether fault ruptures tend to run toward or away from population centers.

California, like many earthquake-prone areas of the world, has sedimentary basins filled with soft material that has eroded from mountains. Early settlers tended to establish their largest cities and towns in those flat basins – cities like Los Angeles and San Bernardino.

The soft basins “act like big bowls of jelly” during large earthquakes, Jordan says. Seismic energy in the form of large-amplitude waves is injected into them, rattling around and causing enormous motion in the very areas where people and buildings are most concentrated.

In a recent simulation on DOE’s Jaguar supercomputer at Oak Ridge National Laboratory, 436 billion computational elements represented Southern California and the structure of these sedimentary basins.
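
For a sense of what 436 billion elements means, the count is roughly what a uniform mesh over a regional slab of crust produces. The dimensions and 40-meter spacing in the sketch below are illustrative assumptions chosen to be consistent with that figure, not values reported in the article:

```python
# Illustrative only: a uniform-mesh estimate showing how a regional domain
# reaches hundreds of billions of grid points. The dimensions and 40-meter
# spacing are assumptions chosen to match the reported count, not figures
# taken from the article.
dx = 40.0                        # grid spacing in meters (assumed)
lx, ly, lz = 810e3, 405e3, 85e3  # domain extent in meters (assumed)

nx, ny, nz = (int(l / dx) for l in (lx, ly, lz))
points = nx * ny * nz
print(f"{points / 1e9:.0f} billion grid points")  # ~436 billion
```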

“We calculated the waves excited by a magnitude 8 earthquake rupture of the entire southern San Andreas Fault,” Jordan says.

The project ran for 24 hours on 223,000 of Jaguar’s processors, simulating several minutes of seismic activity. Jordan and his team presented the results in November, at the SC10 supercomputing conference in New Orleans.

“Based on our calculations, we are finding that the basin regions, including Los Angeles, are getting larger shaking than is predicted by the standard methods,” Jordan says. “By improving the predictions, making them more realistic, we can help engineers make new buildings safer.”

The chance of a magnitude-8-or-greater earthquake on the southern San Andreas Fault in the next 30 years is just 2 percent, but the impact would be dramatic, especially if ground motion followed the basins toward high-population areas.
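
Thirty-year probabilities like that one are often expressed with a simple time-independent (Poisson) model. The sketch below shows only that generic arithmetic, not the earthquake center’s actual forecasting method:

```python
import math

# Generic Poisson hazard arithmetic (illustrative; not the earthquake
# center's forecasting method). If ruptures occur at an average annual
# rate r, the chance of at least one in t years is 1 - exp(-r * t).
p_30yr = 0.02   # the 2 percent, 30-year chance quoted above
t = 30.0

annual_rate = -math.log(1.0 - p_30yr) / t
print(f"implied annual rate: {annual_rate:.5f} per year")               # ~0.00067
print(f"implied mean recurrence: about {1.0 / annual_rate:.0f} years")  # ~1,485
```

Under this simple model, a magnitude-8 rupture of the entire southern fault works out to be a far rarer event than the segment-scale earthquakes dated in the geological record above.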

“You certainly wouldn’t want to be in that earthquake,” Jordan says. In the sedimentary basins especially, it would cause huge ground motions that last a long time.

Computer science and earthquake science have reached a technological juncture that makes it possible to model those basin effects – and to predict how much damage to cities, buildings and humans a huge earthquake might cause.

Jordan notes that big earthquakes “involve ruptures of hundreds of kilometers of faults. We do the simulations to better understand that rupture process and to model it on these big computers.” Waves propagate outward in complicated ways, influenced by the region’s three-dimensional geological structure. “That’s especially true in the upper part of the earth’s crust – the upper 10 kilometers or so.”

When a fault ruptures, it tends to do so in a particular direction, and energy pulses along the fault ahead of the rupture front. Recent work by Jordan’s group shows how those directed pulses of energy get funneled through the sedimentary basins.

How much energy? How much damage? And how close to population centers? To answer these vexing questions, Jordan and colleagues must model phenomena on the scale of the smallest seismic waves and over very large regions.

At the supercomputing conference, Jordan reminded his colleagues that it’s unwise to draw too much from a single earthquake scenario. No one can predict, for example, whether a huge San Andreas earthquake will start from the southern terminus, the northern terminus or somewhere between.

It could start in the south, near the Salton Sea, about 100 miles northeast of San Diego. Bombay Beach, on the eastern shore of the inland sea, lies directly above the fault.

“If that’s where the fault were to rupture, and if it propagated to the northwest along the fault, the ground motion would funnel into the sedimentary basin between San Bernardino and Los Angeles,” Jordan says. “That strong ground motion can cause real problems. That’s the type of thing we’re trying to model.”

And that’s just one possibility. For reasonable probabilistic estimates of the site, direction and magnitude of the next hugely damaging earthquake, researchers must run many simulations.

“Earthquakes have a lot of variability,” Jordan says. “We have one particular guess of what an earthquake would look like. We’re trying to model many such scenarios.”

Indeed, the Southern California Earthquake Center used more than 400,000 earthquake scenarios to make probabilistic calculations of how the largest future earthquakes are apt to behave, in terms of ground shaking and potential property damage.

To make some sense out of such a huge number of variables, seismologists use probabilistic seismic hazard analysis.

It’s the same technique engineers use to understand how they should design their structures to withstand 50 or 100 years of earthquakes. “It’s couched in probabilistic terms because we don’t know for sure what will happen. We can only guess. But we have to characterize the possibility of it occurring.”

The earthquake center has been generating large suites of earthquake simulations to estimate ground-shaking, taking into account three-dimensional effects, the directivity effect and the sedimentary basin effect.

“We model an ensemble of such earthquakes and attach probabilities to each. We start with an earthquake rupture forecast to select a large number of events that represent the possibilities for future earthquakes. We don’t know which will happen next.”

The seismologists expect to represent earthquakes on the San Andreas Fault with many thousands of realizations. “We do that for every fault, and there are hundreds of faults in Southern California. We’ve done calculations using ensembles of over 600,000 simulations.”
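
The bookkeeping behind such an ensemble is conceptually simple, even though the wave simulations feeding it are not: each scenario contributes the shaking it produces at a site, weighted by how often that rupture is expected to occur. The sketch below is a toy illustration of that probabilistic aggregation with made-up numbers, not the earthquake center’s code:

```python
# Toy probabilistic-seismic-hazard aggregation with made-up numbers
# (illustrative only; not the earthquake center's code). Each scenario
# pairs an assumed annual occurrence rate with the peak shaking it
# produces at one hypothetical site.
scenarios = [
    # (annual rate of the rupture, simulated peak ground acceleration in g)
    (1 / 150.0, 0.45),
    (1 / 300.0, 0.70),
    (1 / 50.0,  0.15),
]

def exceedance_rate(threshold_g):
    """Annual rate at which shaking at the site exceeds threshold_g."""
    return sum(rate for rate, pga in scenarios if pga > threshold_g)

for level in (0.1, 0.3, 0.6):
    years = 1.0 / exceedance_rate(level)
    print(f"shaking above {level:.1f} g: about once every {years:.0f} years")
```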

How is it possible to simulate hundreds of thousands of earthquakes when a single simulation requires so many hours of number crunching?

It’s not easy, Jordan says. “It’s going to require a rather huge amount of computer time. We need petascale computers (capable of 1,000 trillion calculations per second) and the best algorithms.”

The frequency of earthquake waves is measured in hertz, a unit equal to one cycle per second. Simulations get tougher at higher frequencies. Calculations spanning 600,000 sources can only go up to 0.3 hertz, or roughly a 3-second period. That can reveal what happens in downtown Los Angeles or in San Bernardino, but only at low frequency.

Jaguar reached 2 hertz. It takes 10,000 times longer to run a 2-hertz simulation than a 0.3-hertz simulation. That’s why “trying to move these calculations to higher and higher frequencies requires bigger and bigger computers. The aim is to do many simulations of all possible earthquakes up to a frequency of 2 hertz or higher.”
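
The relationship between frequency, period and cost can be made concrete with a common rule of thumb: in a three-dimensional wave simulation, resolving a higher frequency means finer grid spacing in all three directions plus smaller time steps, so the work grows roughly as the fourth power of frequency. The sketch below applies that rule to the figures above; it is only an approximation, and the 10,000-fold factor quoted above reflects costs beyond this simple estimate:

```python
# Rule-of-thumb cost scaling for 3-D wave simulations (an approximation,
# not the earthquake center's exact accounting): resolving frequency f
# requires finer spacing in three dimensions plus smaller time steps,
# so the work grows roughly as f**4.
f_low, f_high = 0.3, 2.0   # hertz, the frequencies discussed above

print(f"period at {f_low} Hz: {1 / f_low:.1f} seconds")    # ~3.3 s
print(f"period at {f_high} Hz: {1 / f_high:.1f} seconds")  # 0.5 s

# The fourth-power rule alone gives roughly 2,000x; the 10,000x figure
# quoted above includes costs beyond this simple estimate.
print(f"cost ratio, rule of thumb: {(f_high / f_low) ** 4:,.0f}x")
```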

Jordan’s team also is working on predicting hazards that may follow high seismic activity. Suppose a magnitude-5 earthquake knocks dishes off shelves on a Monday. Can supercomputers predict the probability of major earthquakes hitting close by in the days or weeks to follow?

“We’d like to be able to translate events into seismic hazards. What is it going to mean in terms of the shaking we can expect?”

The approach also can be used to better understand earthquakes of the past. “To know where the earthquake was the strongest, we’ve needed to get equipment to it. We have seismometers, but they’re limited in their distribution. We’re hoping to get a better regional map of the shaking that occurs right after a big earthquake.”

Jordan and Po Chen, an assistant professor at the University of Wyoming, are using Intrepid, the Argonne supercomputer, to develop three-dimensional models that forecast the shaking of large earthquakes.

The models come from the analysis of past seismic data. Waves that travel through the ground are sensitive to subterranean structures. “By recording those, we can improve our 3-D models of the region,” Jordan says.
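
The core of that improvement loop is comparing recorded waveforms with waveforms simulated through the current 3-D model and adjusting the model to shrink the mismatch. Below is a minimal sketch of one such misfit measure, a plain least-squares difference chosen purely for illustration; real full-3D tomography uses far richer measures:

```python
# Illustrative waveform misfit: the quantity a tomographic model update
# tries to reduce. A plain least-squares difference is used here purely
# for illustration; real full-3D tomography uses richer measures.
def misfit(observed, simulated):
    """Sum of squared differences between recorded and simulated seismograms."""
    return sum((o - s) ** 2 for o, s in zip(observed, simulated))

obs = [0.0, 0.8, 1.0, 0.3, -0.4]   # hypothetical recorded ground-motion samples
sim = [0.0, 0.6, 1.1, 0.2, -0.5]   # the same waveform predicted by a 3-D model
print(misfit(obs, sim))            # smaller values mean the model fits better
```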

The Southern California Earthquake Center coordinates research among more than 600 scientists and engineers representing a wide variety of fields and talents – code writers, geologists, seismologists and applied mathematicians, including specialists from the San Diego Supercomputer Center and the Pittsburgh Supercomputing Center.

Supercomputing is “one part but a very important part of what we do,” Jordan says. “It lets us gather new understandings of earthquakes, how active they are, how often they occur. By putting it on computer models, we can come up with bottom-line statements of what kind of ground motion we can expect.”

Bill Cannon
