Paul Fischer is plugging in something that one day could help millions of others tap into safe, clean nuclear energy.

Fischer, a researcher at the Department of Energy’s Argonne National Laboratory near Chicago, is creating computer codes that will help simulate the next generation of nuclear reactors.

What Fischer and his fellow researchers simulate is “just one small part of the entire reactor,” he says.  “Ultimately, designers want to be able to simulate every part.” To do that, however, designers must make many assumptions.

Fischer wants to eliminate some of those assumptions.  It’s important, because most simulations are based on experimental data dating back to nuclear power’s heyday 30 years ago.

Most of the simulation codes also weren’t designed for parallel computing, which splits up a task among multiple processors to reach an answer more quickly.

Concerns about global climate change and fossil fuel supplies have led to greater interest in a new kind of nuclear power plant.  These plants are expected to be safer, to produce more power, and to generate as much as 100 times less waste than today’s reactors.

What’s more, proposed “fast” reactors will use waste recycled from today’s nuclear plants.  They’ll economically produce power and consume waste from current-generation reactors, thus reducing the burden on repositories where waste would be held deep underground.  These recycling reactors rely on different cooling mechanisms than current-generation reactors.

That’s where Fischer comes in.  With a grant of 1 million processor hours from DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, he and his fellow researchers are developing computer simulations of heat transfer in coolant flowing around tightly packed nuclear fuel rods.

If that isn’t difficult enough, the proposed design throws in a few added twists – literally.

Fischer’s research will contribute to the Global Nuclear Energy Partnership (GNEP), a DOE-led international collaboration to create small, safe and secure nuclear power plants.

The fast-burning reactor designs that are most promising for GNEP’s goals will put more than 200 “pins” of nuclear fuel in hexagonal containers.  A reactor may include hundreds of these containers.

This visualizes a simulation of temperature variation in turbulent coolant flowing around a single wire-wrapped nuclear fuel pin. Red is hot; blue is relatively cool.

Each pin, about a centimeter in diameter and about 1.2 meters long, will have a wire twisted around it in a gentle spiral to separate the pins and promote coolant flow around them.

The new generation of nuclear plants will operate at much higher temperatures than today’s reactors.  And instead of water, they’ll be cooled by liquid metals such as sodium or by a liquid fluoride salt.

The computer algorithms Fischer and fellow researchers Carlos Pantano of the University of Illinois and Argonne’s Andrew Siegel are designing simulate reactor core cooling.

As you might guess, it’s tricky.

“For various reasons, the temperature might not be uniform across the subassembly” of fuel rods in a container, Fischer says.  Subassemblies near the reactor’s outside edge tend to be a little cooler, for instance.

Stirring the coolant within each subassembly will lessen that difference.  “Ideally you’d like that to be very well stirred, and that’s the kind of question we can answer” with simulation, Fischer adds.

Computer simulation also can give designers clues about how many times the wire should twist around the pins and how the pins should be spaced for effective cooling, Fischer says.

Fischer says the code his group is devising demands too much computational power for practical use in design programs.  Instead, it will set standards for less demanding codes and will be compared against reactor experiments.

“With these high-fidelity simulations, we can with great confidence explore regions that have not been explored experimentally,” Fischer says.  “We could get to data points that are different from the current designs.”

This is a simulation of coolant flow velocity over a single wire-wrapped nuclear fuel pin in a hexagonally packed array. It shows the presence of low-speed streaks induced by turbulence and extending away from the surface into the main flow.

The unusual geometry created by the wire wrap, however, makes the simulation difficult.  On top of that, the fluid moving past the rods typically is highly turbulent, and the large number of rods is hard to simulate without demanding enormous computer resources.

“We have this wire and the flow is skewed with respect to the wire,” Fischer says.  “What we’re doing is setting up some rather simple calculations that basically look like a channel, plus a wire.”

To calculate what’s happening, Fischer’s group uses a spectral element algorithm.  In essence, it sets up a grid of data points in the area being modeled and calculates what’s happening at each point at a particular time.
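The grid-of-points idea can be illustrated with a toy example.  The sketch below is a drastic simplification — the actual research uses a 3-D spectral element method with turbulence, not the simple finite differences shown here — but it captures the basic scheme: hold a value at every grid point and update each point at every time step.

```python
def step_heat(temps, alpha=0.1):
    """Advance one explicit time step of 1-D heat diffusion.

    temps: temperatures at evenly spaced grid points.
    alpha: diffusion number dt*k/dx^2 (<= 0.5 for stability).
    """
    new = temps[:]  # endpoints held fixed (boundary conditions)
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + alpha * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
    return new

# A hot spot in a cool channel gradually spreads out over time.
temps = [0.0] * 5 + [100.0] + [0.0] * 5
for _ in range(50):
    temps = step_heat(temps)
```

Each pass over the grid corresponds to one snapshot of "what's happening at each point at a particular time"; the real simulations do the analogous update for velocity and temperature at millions of points per pin.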

Researchers will compare the detailed calculations with other simulations that eliminate small phenomena in the fluid to cut the demand for computer resources.  The comparisons will show whether the less demanding codes accurately depict heat transfer and fluid flow.

Conserving computer power is important.  It takes 10 million data points to simulate heat transfer for one pin.  Simulating a container of 200 pins would require 2 billion grid points – an enormous job, even for a supercomputer.
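The scaling behind those numbers is simple multiplication, using only the figures stated in the article:

```python
# Grid-point scaling from the article's figures: roughly 10 million
# grid points per fuel pin, so a full 200-pin container multiplies
# the problem into the billions of points.
points_per_pin = 10_000_000
pins_per_container = 200
total_points = points_per_pin * pins_per_container  # 2 billion
```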

Fischer figures seven pins will require calculating about 100 million data points.  That should be possible with the 1 million processor hours INCITE has allotted to the project.

It’s possible to run a program using a million processor hours in just a few days or weeks, rather than more than a century, because parallel computing parcels out the job to thousands or tens of thousands of processors in one high-performance computer.
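The arithmetic behind that trade-off is straightforward.  The processor count below is an assumed round number for illustration, not a figure from the project:

```python
# Why parallelism matters: 1 million processor-hours on a single
# processor is more than a century of wall-clock time, but spread
# across 10,000 processors (an assumed count) it is only a few days.
processor_hours = 1_000_000
hours_per_year = 24 * 365
years_serial = processor_hours / hours_per_year   # roughly 114 years
days_parallel = processor_hours / 10_000 / 24     # roughly 4 days
```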

This visualizes a simulation of a nuclear fuel assembly made of seven “pins,” each with a wire spiraling around it counterclockwise as it comes out of the page. The colors show the distribution of fluid velocity in the coolant surrounding the pins, with red indicating the highest velocity and yellow, green or blue a lower velocity.

As a warm-up, Fischer’s group used 1,000 processors to simulate seven pins.  The calculations used fewer processors and less time because the simulation involved fewer grid points and a modest level of turbulent coolant flow.

“You always start with the cheap calculations and work your way up,” Fischer says.  The less demanding simulations will be a baseline, giving a head start to more detailed and turbulent simulations.

The researchers plan to use 40,000 processors to simulate 19 pins by the end of summer 2007.  Next year, they’ll use Argonne’s Blue Gene/L computer, with 140,000 processors, giving them even more capacity.

“One pin today, 100 pins next year,” Fischer says.  Ultimately, the simulation could model thermal transfer among 217 or 271 pins, depending on the configuration.  That’s comparable to the state of the art in experimental data, Fischer says.

“We never will study in detail all the pins in a reactor,” given that there are thousands of pins housed in hundreds of containers, Fischer says.  Nonetheless, results indicate the group’s algorithms are sufficient to accurately characterize heat transfer with 10 million grid points per pin.

“That’s why I’m fairly confident we’re going to be able to deliver for them accurate … flow predictions – as good as they can get from the experiments,” Fischer adds.  “I don’t think we’re going to get that out of this INCITE proposal, but as this program progresses we will.”

Bill Cannon
