
Power play

Raymond L. Orbach had been director of the Department of Energy Office of Science just a few weeks when news from Japan further heightened his interest in expanding the role of computational science in all the office’s mission areas.

On April 20, 2002, reports surfaced that a new Japanese supercomputer, the Earth Simulator, had posted a top speed five times that of the most powerful U.S. computing system.

Dr. Raymond L. Orbach, Under Secretary for Science, Department of Energy.

The announcement came like an early-morning wakeup call to the American scientific community: Everyone knew it was coming, but it still was a jolt.

The Earth Simulator, designed to model global climate change, had run a benchmarking program at 35.6 teraflops — 35.6 trillion calculations per second. That was as fast as the combined performance of the United States’ top 20 computers, the New York Times reported.

At the time, the most powerful U.S. computer was the ASCI White at DOE’s Lawrence Livermore National Laboratory — a 7-teraflops machine that occupied the No. 1 spot on the TOP500 list of the world’s fastest supercomputers — until the Earth Simulator bumped it.

What shocked scientists and government officials was the Earth Simulator’s efficiency: It had run at more than 85 percent of its 40-teraflops theoretical peak speed — a far greater percentage than U.S. supercomputers.
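For readers who want to check the arithmetic, the efficiency figure is simply the benchmark speed divided by the theoretical peak. The short Python sketch below uses the numbers quoted in this article; published figures vary slightly by source.

```python
# Sustained-vs-peak efficiency, using the figures quoted in this article
# (published benchmark and peak values differ slightly by source).

def efficiency(sustained_tflops: float, peak_tflops: float) -> float:
    """Fraction of theoretical peak achieved on a benchmark run."""
    return sustained_tflops / peak_tflops

earth_simulator = efficiency(sustained_tflops=35.6, peak_tflops=40.0)
print(f"Earth Simulator ran at {earth_simulator:.0%} of peak")  # ~89% of peak
```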

In practical terms, the Earth Simulator could, in an acceptable amount of time, model climate at 10-kilometer intervals. The best U.S. computers operated at a scale of 100 kilometers; running at the finer scale on those machines would have required years of computer time to obtain results.
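Why a tenfold finer grid costs so much more: refining the horizontal spacing multiplies the number of grid cells, and the simulation time step usually must shrink along with the cell size. The sketch below is only a back-of-the-envelope estimate under those common assumptions, not a description of any particular climate code.

```python
# Rough cost of refining a climate model's horizontal grid. Assumes total
# work scales as (number of horizontal cells) x (number of time steps),
# with the time step shrinking in proportion to the cell size — a common
# stability constraint. Real climate codes differ in detail.

def relative_cost(coarse_km: float, fine_km: float) -> float:
    refine = coarse_km / fine_km   # 100 km -> 10 km gives a factor of 10
    cells = refine ** 2            # 2-D horizontal grid: ~100x more cells
    steps = refine                 # shorter time step: ~10x more steps
    return cells * steps

print(relative_cost(100, 10))  # ~1,000x the work of the 100-km run
```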

“These guys are blowing us out of the water, and we need to sit up and take notice,” Thomas Sterling, a computer designer at the California Institute of Technology, told The New York Times.

DOE’s Office of Advanced Scientific Computing Research (ASCR) was in the midst of reviewing the department’s top computing facilities when the Earth Simulator news broke. In fact, a subcommittee of ASCR’s Advanced Scientific Computing Advisory Committee (ASCAC) was just finishing a report on the subject.

Plans changed when the draft report was presented to the full committee on May 2, 2002 — less than two weeks after the Earth Simulator announcement.

Orbach was at the meeting. He had already recognized computing’s central role in DOE and the Earth Simulator’s challenge to American science.


Orbach asked ASCAC for a quick response to the Japanese machine.

Less than two weeks later, committee members met with representatives from DOE laboratories, government agencies, universities, and Office of Science program offices.

A report on that meeting noted: “The Earth Simulator allows science that is currently beyond the reach of American scientists. … The challenge that it poses goes well beyond climate and related science: The challenge is to American leadership in computational science.”

It added: “In short, without a robust response to the Earth Simulator challenge, the United States is open to losing its leadership” in a range of pioneering science arenas. With a second-class computational science enterprise, the country also stood to lose the brightest science and technology students.

Scientific computing in the 1990s had grown significantly in DOE’s defense-related programs, such as classified weapons research, but the ASCR budget “has lagged behind in both absolute and relative terms,” the report said. ASCR “has thus been put in the position of attempting to implement its plans to serve a need central to all aspects of the Office of Science missions with severely inadequate funds.”

The report concluded: “The Office of Science should embark on an aggressive, integrated program to regain and to sustain leadership in the areas of computational science important to the DOE mission.”

Besides building fast machines, however, the full ASCAC recommended that DOE also focus on the scientific computing research needed to make the best use of them.

In a May 29, 2002 memo to Orbach, the group recommended a computational science program to foster collaboration between researchers who use the machines and the computer scientists and mathematicians who make them work.

“Since science is the ultimate driver, the initiative should have a set of urgent, challenging, and exciting scientific goals that build from the strengths of the base programs,” the memo said.

Orbach embraced the recommendations. In speeches and congressional testimony, he touted computer simulation as the new “third pillar” of scientific discovery, joining experimentation and theory.

In 2003, he told the House of Representatives Committee on Science, “Advanced scientific computing is indispensable to DOE’s missions.” Every DOE office, Orbach said, had research that only scientific computing could address.

“This will require significant enhancements to the Office of Science’s scientific computing programs,” he testified. “These include both more capable computing platforms and the development of the sophisticated mathematical and software tools required for large-scale simulations.”

The five-year Scientific Discovery through Advanced Computing (SciDAC) program would handle the mathematical and software research to create efficient simulation programs. The $60 million-per-year effort was deemed successful, and with Orbach’s backing SciDAC awarded a second set of 30 five-year grants starting in 2006.

At the same hearing, Orbach announced the new Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program.

INCITE represented a different approach to computational science. Instead of parceling out small amounts of computer time to many projects, a peer-review process grants millions of supercomputer processor hours to just a few projects — including ones originating with industrial and academic researchers — that hold promise for major scientific advances.

For jetliner company Boeing, an INCITE award provided the chance to test and validate computer codes for aircraft simulations.

“Once this has been successfully completed Boeing then makes the investment in hardware so that these new tools and processes can be used to design products,” researcher Moeljo Hong wrote to program managers.

The response from researchers was so strong that requests for computer time under INCITE outstripped the resources available, driving home the need for additional machines.

In 2003, the Office of Science made increasing computing capability second only to ITER, the international fusion energy reactor, on its Facilities for the Future of Science list. The list prioritized large-scale science projects over the next 20 years, with top-ranked projects scheduled for near-term attention.

In May 2004, the department announced its Leadership Class Computing program. Its goal: build the world’s fastest computer for open science — capable of a sustained 50 teraflops.

The five-year program is nearing completion and already has exceeded that benchmark. Jaguar, a combined Cray XT3/XT4, was installed at Oak Ridge in 2006. In June 2007 it ranked No. 2 on the TOP500 list at 101.7 teraflops.

A 2007 assessment of Facilities for the Future priorities found high-performance computing had made significant progress and was on schedule.

But even bigger and faster computers are on the way.

The results have won Orbach and the Office of Science accolades.

At a March 2007 hearing, Rep. Dave Hobson (R-Ohio) recalled initial estimates that the United States wouldn’t catch up with Japan’s computing lead until 2010. By 2006, American computers had already more than doubled the Earth Simulator’s speed.

“This wouldn’t have happened without your vision and your persuasiveness,” Hobson told Orbach.

He added: “You’ve probably left a lasting position in this country that wouldn’t have been there for the future had you not come forward and said we’ve got to do this. … Probably 20 years from now people are going to understand what you did.”

But other countries aren’t standing still, either. Japanese research agency RIKEN, for example, wants to build a 10-petaflops computer by 2012.

To maintain the U.S. edge, DOE has launched Simulation and Modeling at the Exascale for Energy, Ecological Sustainability and Global Security. It will prepare for the advent of exaflops computers — capable of 1 quintillion calculations per second — in the next decade. A quintillion is a 1 followed by 18 zeroes.
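To put the units in perspective, each step from teraflops to petaflops to exaflops is a factor of 1,000. The rough comparison below, using the Earth Simulator figure quoted earlier, is purely illustrative.

```python
# Orders of magnitude: tera = 10**12, peta = 10**15, exa = 10**18 calculations/second.
TERA, PETA, EXA = 10**12, 10**15, 10**18

earth_simulator = 35.6 * TERA   # sustained speed quoted earlier in this article
exascale_machine = 1 * EXA      # one quintillion calculations per second

# How many Earth Simulators would a 1-exaflops machine equal?
print(f"{exascale_machine / earth_simulator:,.0f}x the Earth Simulator")  # ~28,090x
```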

INCITE and SciDAC will investigate ways to maximize the performance of these new, powerful computers. They’ll allow scientists to run bigger, more accurate models — and position the United States to sustain its computing leadership and economic competitiveness.
