February 2011

Gathering ponders applying extreme computing to physics tasks

Participants in the workshop “Scientific Challenges for Understanding the Quantum Universe and the Role of Computing at the Extreme Scale” identified five broad areas in which exascale computing can significantly advance discovery.

Cosmology and Astrophysics Simulation. Researchers build huge simulations to help decipher the nature of the cosmos and the physics governing it.

One major mystery is the dark universe – the invisible dark matter and dark energy that comprise more than 96 percent of the cosmos. Dark energy is behind the accelerating expansion of the universe; dark matter is believed to act gravitationally to keep galaxies from flying apart.

DOE backs astrophysical simulation through its INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program, which allocates millions of processor hours on the world’s fastest computers. One project models the structure and distribution of dark matter in and around a galaxy similar to the Milky Way.
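
The production runs behind such projects track billions of particles on leadership-class machines, with tree or mesh force solvers and expanding cosmological coordinates. As a rough sketch of the underlying technique only, the toy Python example below evolves a small cloud of self-gravitating particles by direct summation; the particle count, units and time step are arbitrary assumptions, not parameters of any INCITE code.

```python
# Toy direct-summation N-body sketch (illustrative only; real cosmology
# codes use billions of particles and tree/mesh force solvers).
import numpy as np

G = 1.0           # gravitational constant in arbitrary simulation units (assumption)
SOFTENING = 0.05  # force softening to avoid singular close encounters

def accelerations(pos, mass):
    """O(N^2) gravitational acceleration on every particle."""
    diff = pos[None, :, :] - pos[:, None, :]      # pairwise separation vectors
    dist2 = (diff ** 2).sum(-1) + SOFTENING ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                 # no self-force
    return G * (diff * inv_r3[:, :, None] * mass[None, :, None]).sum(axis=1)

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick leapfrog time integration."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc
    return pos, vel

rng = np.random.default_rng(0)
n = 512                                 # toy particle count (assumption)
pos = rng.normal(size=(n, 3))           # random initial positions
vel = np.zeros((n, 3))
mass = np.full(n, 1.0 / n)              # equal-mass particles
pos, vel = leapfrog(pos, vel, mass, dt=0.01, steps=100)
```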

Even so, cosmological and astrophysical simulations often are hamstrung by computers’ inability to simultaneously portray large-scale and small-scale phenomena. “The data are, frankly, outstripping the simulations,” says workshop co-chair and Stanford professor Roger Blandford.

Astrophysics Data Handling, Archiving and Mining. Simulations and experiments generate an overwhelming amount of data for storage, dissemination and analysis.

“In the 21st Century much of scientific computing is increasingly revolving around data,” says Johns Hopkins astronomer Alex Szalay, who headed a workshop session on the subject. His research group, for example, has 1.2 petabytes of storage – more than a million gigabytes – and can analyze datasets of around 100 terabytes. That’s enough for some simulations and for the Sloan Digital Sky Survey (SDSS), a multiyear effort to map the sky visible from North America in exacting detail.

“For the next generation of cosmology, we would need datasets of about 500 terabytes to really maximize the impact” of simulations, Szalay says. He’s building a system with 5.5 petabytes of capacity.

Even that’s puny compared to what’s to come from the Large Synoptic Survey Telescope (LSST), just one of several planned instruments. The telescope will image the entire visible sky in tremendous detail once every few nights, quickly accumulating some 100 petabytes of data.
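
A back-of-envelope estimate shows how survey data volumes of that order arise. All of the inputs below are round, illustrative assumptions rather than official LSST specifications; the point is only the arithmetic.

```python
# Back-of-envelope sky-survey data estimate (inputs are round illustrative
# assumptions, not official instrument specifications).
GIGAPIXELS_PER_IMAGE = 3.2   # assumed camera size
BYTES_PER_PIXEL = 2          # assumed 16-bit raw pixels
IMAGES_PER_NIGHT = 2000      # assumed nightly exposure count
NIGHTS_PER_YEAR = 300        # assumed observing nights
SURVEY_YEARS = 10            # assumed survey length

bytes_per_image = GIGAPIXELS_PER_IMAGE * 1e9 * BYTES_PER_PIXEL
raw_per_night_tb = bytes_per_image * IMAGES_PER_NIGHT / 1e12
raw_survey_pb = raw_per_night_tb * NIGHTS_PER_YEAR * SURVEY_YEARS / 1000

print(f"~{raw_per_night_tb:.0f} TB of raw images per night")
print(f"~{raw_survey_pb:.0f} PB of raw images over the survey")
# Calibrated images, catalogs and repeated reprocessings multiply the raw total.
```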

DOE’s Scientific Discovery through Advanced Computing (SciDAC) program already is addressing data storage and dissemination, but challenges still abound. “If we don’t have this computing capability, we’ll not exploit the full gamut of results from this telescope” or other facilities and simulations, Blandford says.

Accelerator Simulation. Colliders like the LHC can cost billions of dollars, involve complex physics and must be precisely tuned. Computer simulation lets engineers and physicists tweak designs without building expensive and time-consuming prototypes. High-performance computers also control the instruments’ operations.

One example is the International Linear Collider. Scientists around the world, including a team at SLAC, are collaborating on designing the facility, which is expected to stretch more than 19 miles and will slam electrons and positrons into each other. “All the preliminary work will be done computationally,” Blandford says. “It’s a lot of hard work and the design effort requires cutting-edge computation.”
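
Much of that computational design work comes down to tracking particles through candidate arrangements of magnets. The sketch below shows the simplest form of the idea, linear transfer-matrix tracking of a single particle through drifts and thin-lens quadrupoles; the element lengths and focal lengths are made-up values, and production design codes model much more, from nonlinear fields to wakefields and misalignments.

```python
# Minimal linear beam-optics sketch: track (x, x') through drifts and
# thin-lens quadrupoles. Element parameters are illustrative assumptions.
import numpy as np

def drift(length):
    """Transfer matrix of a field-free drift of the given length (m)."""
    return np.array([[1.0, length],
                     [0.0, 1.0]])

def thin_quad(focal_length):
    """Thin-lens quadrupole: focusing if focal_length > 0."""
    return np.array([[1.0, 0.0],
                     [-1.0 / focal_length, 1.0]])

# A toy FODO-like cell, rightmost element traversed first (assumed values).
cell = drift(2.0) @ thin_quad(-5.0) @ drift(2.0) @ thin_quad(5.0)

# Track a particle with a 1 mm offset and zero angle through 100 cells.
state = np.array([1e-3, 0.0])   # (x in meters, x' in radians)
for _ in range(100):
    state = cell @ state
print("x after 100 cells: %.3e m" % state[0])
```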

High Energy Theoretical Physics. High-performance computers are letting physicists directly explore the validity and range of the Standard Model. These calculations breathe new life into experimental findings, allowing increasingly precise tests of Standard Model predictions. But perhaps the most exciting goal is to explore new theories Beyond the Standard Model (BSM), Columbia University’s Norman Christ says – “strongly coupled field theories where one dreams of seeing potentially new phenomena or certainly understanding in a different context” already familiar ones.

“There are a lot of ideas – so many that it’s very hard” to commit to studying a particular one, Christ says, since a single study can take multiple researchers years of work. Exascale computing would trim that to weeks, making it easier to test many hypotheses.

DOE already has granted more than 300 million processor hours for lattice quantum chromodynamics (QCD) calculations on supercomputers at Argonne and Oak Ridge national laboratories under INCITE. In fact, Christ says, the USQCD collaboration, an effort joining most U.S. lattice QCD researchers, was one of the first users of Intrepid, Argonne’s IBM Blue Gene/P machine.
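
Lattice QCD evaluates the theory’s path integral on a four-dimensional space-time grid, far beyond a short snippet. As a stand-in only, the sketch below runs Metropolis Monte Carlo updates on a two-dimensional Ising lattice, a far simpler statistical system that shares the sweep-the-lattice, propose-and-accept structure of those calculations; the lattice size, temperature and sweep count are arbitrary choices.

```python
# Metropolis Monte Carlo on a 2-D Ising lattice: a greatly simplified
# analogue of lattice field-theory updates. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
L = 32                 # lattice sites per side (assumption)
beta = 0.44            # inverse temperature, near the 2-D Ising critical point
spins = rng.choice([-1, 1], size=(L, L))

def sweep(spins):
    """One Metropolis sweep: propose flipping every site once."""
    for i in range(L):
        for j in range(L):
            # Sum of the four nearest neighbours with periodic boundaries.
            nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
                  spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nn        # energy change if this spin flips
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1              # accept the flip

for _ in range(200):
    sweep(spins)
print("magnetization per site:", spins.mean())
```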

Experimental Particle Physics. The realm of the LHC and other huge accelerators, experimental particle physics seeks to understand matter’s building blocks and how they interact. Satellites high above Earth and detectors deep underground also seek evidence scientists hope will provide the full picture of nature’s fundamental forces and particles.

The LHC’s main goal is to detect the Higgs particle, or Higgs boson, the only constituent posited in the Standard Model that’s yet to be observed. But it also opens the door to exploring BSM phenomena – including the nature of dark matter.

That’s the main interest of University of California, San Diego, physicist Frank Würthwein. Having the LHC, he says, “by no means guarantees that we will actually be able to produce dark matter.” Yet even if dark matter doesn’t surface, something else, unexpected, might.

When the LHC is running at full power, bunches of particles – protons or lead ions – will collide millions of times every second, while devices like the Compact Muon Solenoid (CMS), the experiment Würthwein works on, detect the debris of those collisions. The detectors will discard most collision outcomes as routine but still will record around 10 petabytes – 10 million gigabytes – each year. By the end of the decade, physicists will face datasets on the order of an exabyte – 1,000 petabytes.
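
The petabyte figures follow from simple rate arithmetic. The trigger rate, event size and running time below are round illustrative assumptions, not published CMS parameters.

```python
# Rough detector data-rate arithmetic (round illustrative assumptions,
# not published experiment parameters).
EVENTS_RECORDED_PER_SECOND = 300   # events kept after the trigger (assumption)
BYTES_PER_EVENT = 1.5e6            # raw event size of ~1.5 MB (assumption)
LIVE_SECONDS_PER_YEAR = 1e7        # typical accelerator running time (assumption)

raw_pb_per_year = (EVENTS_RECORDED_PER_SECOND * BYTES_PER_EVENT *
                   LIVE_SECONDS_PER_YEAR) / 1e15
print(f"~{raw_pb_per_year:.1f} PB of raw events per year")
# Reconstructed data, simulated events and multiple analysis copies push the
# yearly total well beyond the raw figure.
```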

That’s “ferocious amounts of data that have to be husbanded and made available for ongoing study,” Blandford says. “This is a huge computational exercise.”