Fluid Dynamics
January 2011

Dynamic adaptations

A University of Colorado researcher has modeled blood, wind and other flowing fluids by adapting computational grids to areas where they’re needed most.

The University of Colorado’s Kenneth Jansen is enlisting innovative computational methods to explore fluid movement over and through complex structures. His work provides new ways to study wind turbine and nuclear reactor design and to build 3-D models of blood flow.

Wind turbines, blood vessels and nuclear power plants might seem to have little in common, but their interactions with fluids – air, blood and coolant – present problems that computational research can address.

For 20 years, Kenneth Jansen, professor of aerospace engineering sciences at the University of Colorado, has blended methods that look at the complex structures of solids with those that explore the complex movement of fluids.

The result: new computational methods Jansen has been adapting to take on increasingly intricate problems. He’s also invested in the newest wave of near-exascale computing, finding ways to harness the power of nearly a million parallel processors as efficiently as possible.

Jansen was first exposed to computational solid dynamics and computational fluid dynamics as a University of Missouri undergraduate. At that time, in the 1980s, computational solid dynamics methods allowed researchers to model the nuances of geometrically complex domains, while computational fluid dynamics could model the complex movement of liquids and gases in simple domains. Real systems, though, typically have both complex shapes and complex flow. As he moved to graduate work at Stanford University, Jansen latched on to the idea of integrating these two sets of methods and using computation to solve challenging problems that couldn’t be approached in other ways. As he built computational code that could solve a growing variety of problems, he began to think about specific applications.

Some of these applications, such as aeronautics, emerged naturally from Jansen’s postdoctoral work at Stanford and NASA’s Ames Research Center and from his later position at Rensselaer Polytechnic Institute (RPI). But he also moved into less obvious areas such as cardiovascular flow, a problem that combines the structural complexity of a vascular network with the shifting flow patterns of blood pumped by a heart. A tool that predicts problems in these systems before they become critical could be a boon to surgeons and their patients – for instance, someone with an aortic aneurysm, a stretched area of the large arteries that often develops without symptoms and whose rupture often is fatal.

Using data from MRI images of a patient, Jansen can construct a 3-D model of an individual set of blood vessels. A catheter inside the blood vessels provides basic information about flow. From there, he divides the 3-D geometry into small regular shapes – tetrahedrons – whose points and edges form a computational mesh for tracking pressure buildup at locations throughout the blood vessel.
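To make that concrete, here is a minimal sketch of what such an unstructured mesh might look like in code – a toy NumPy representation written for illustration, not Jansen’s actual patient-specific pipeline. Nodes are 3-D coordinates, each tetrahedron is four node indices, and a scalar field such as pressure lives at the nodes:

import numpy as np

# Nodes: one 3-D coordinate per mesh point (toy data, not from a real scan).
nodes = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [1.0, 1.0, 1.0],
])

# Connectivity: each row lists the four node indices of one tetrahedron.
tets = np.array([
    [0, 1, 2, 3],
    [1, 2, 3, 4],
])

# A scalar field (e.g., pressure) stored at the nodes.
pressure = np.zeros(len(nodes))

def tet_volume(p0, p1, p2, p3):
    """Volume of a tetrahedron from its four corner points."""
    return abs(np.dot(p1 - p0, np.cross(p2 - p0, p3 - p0))) / 6.0

volumes = np.array([tet_volume(*nodes[t]) for t in tets])
print(volumes)  # element sizes can vary freely -- the mesh is unstructured

Because nothing forces the tetrahedrons to share a size, the mesh can be coarse in some places and fine in others – the property the next section exploits.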

Spot high resolution

Jansen’s innovation, though, is an adaptive, unstructured grid. In his simulations, the tetrahedrons are different sizes at different points in the domain, depending on which parts of the vessel require higher resolution to capture the blood flow. As a result, long straight vessel walls might be represented by larger tetrahedrons, whereas curves and branching vessels require a finer grid to calculate details. The system adapts as it runs: it starts from an initial guess, then computes and adjusts the mesh through several rounds of simulated heartbeats to ensure that the finest regions of the mesh land where detail is most necessary.
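In outline, that adapt-solve cycle can be demonstrated on a toy problem. The sketch below is a 1-D analogue written for illustration, not Jansen’s CFD code: it refines an interval mesh only where linear interpolation misses a sharp feature, mimicking how his method concentrates tetrahedrons where the flow demands them:

import numpy as np

def f(x):
    # Invented stand-in for a flow quantity with a sharp feature.
    return np.tanh(50.0 * (x - 0.5))

def adapt(points, n_cycles=5, tol=1e-3):
    """Refine cells whose midpoint is poorly predicted by linear
    interpolation between the cell endpoints."""
    for _ in range(n_cycles):
        mids = 0.5 * (points[:-1] + points[1:])
        err = np.abs(f(mids) - 0.5 * (f(points[:-1]) + f(points[1:])))
        flagged = err > tol              # cells that need more resolution
        if not flagged.any():
            break
        points = np.sort(np.concatenate([points, mids[flagged]]))
    return points

points = adapt(np.linspace(0.0, 1.0, 11))
print(len(points))  # points cluster near x = 0.5; smooth regions stay coarse

After a few cycles the points pack tightly around the sharp feature at x = 0.5 while the smooth regions stay coarse – the 1-D version of a fine mesh at vessel branches and a coarse one along straight walls.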

By providing high resolution only where it’s needed, Jansen’s method delivers accuracy without wasted effort. Uniformly high resolution throughout the system would be just as accurate, he says, but it could take far more time or computing power to get the same answer.

“What we’ve proven over and over in our publications is that it’s a net win,” Jansen says. “Adaptivity takes time, but it saves overall time and uses fewer points. You’re fine where you need to be but not fine everywhere.”
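A back-of-envelope calculation, with made-up numbers, shows why it’s a net win. In 3-D, halving the mesh spacing everywhere multiplies the point count by eight (2 × 2 × 2); refining only the small fraction of elements that actually need it grows the mesh far more slowly:

n = 1_000_000          # hypothetical starting mesh size (points)
uniform = 8 * n        # halve spacing everywhere: 2**3 = 8x the points
f_refined = 0.05       # suppose only 5% of the mesh needs refining
adaptive = int((1 - f_refined) * n + 8 * f_refined * n)
print(uniform, adaptive)   # 8,000,000 vs. 1,350,000 points

Under these assumptions, the adaptive mesh reaches the needed resolution with roughly a sixth of the points – fine where you need to be, but not fine everywhere.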

The same computational methods could help researchers design the next generation of wind turbines. One problem the giant pinwheels face is an environment with unpredictable breezes, gusts and eddies. These constantly changing conditions can strain the turbine’s gearbox and cause damage.

But by carving a slit in the turbine blades and adding a disk that adjusts up and down, engineers have developed so-called synthetic jets that modulate the lift and could also alleviate gearbox strain. The computational methods in Jansen’s toolkit allow him to calculate how and why those adjustments work and provide the optimal lift under different wind scenarios. With this added information, it may be possible to program next-generation wind turbines to adjust these jets to even out lift based on information from a wind sensor.
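That last idea – jets trimmed on the fly from a wind sensor – amounts to a feedback loop. The sketch below is a hypothetical proportional controller written for illustration; the sensor model and function names are invented, not drawn from any real turbine:

import random

TARGET_LIFT = 1.0   # normalized lift the controller tries to hold (made up)
GAIN = 0.5          # proportional gain (made up)

def lift_from_wind(wind_speed):
    # Invented stand-in: baseline lift rises with wind speed.
    return 0.1 * wind_speed

jet_amplitude = 0.0
for step in range(10):
    wind = 8.0 + random.uniform(-2.0, 2.0)   # simulated gusty wind sensor
    lift = lift_from_wind(wind) + jet_amplitude
    error = TARGET_LIFT - lift
    jet_amplitude += GAIN * error            # nudge the jets toward the target
    print(f"wind={wind:5.2f}  lift={lift:5.2f}  jet={jet_amplitude:5.2f}")

Even this crude loop evens out the lift swings caused by gusts; simulations like Jansen’s would supply the real relationship between jet settings and lift.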

The focus on efficiency is not just about making maximal use of computational resources. With some questions, such as when an aneurysm might burst, time is of the essence. Another application that requires a quick answer: safety scenarios in nuclear power plants. There, Jansen is modeling the interaction of liquids with gases. For example, if a fuel rod develops a crack, researchers need to understand how the uranium fuel might interact with high-temperature gases and coolant at that interface.

Applying computational heft

The complexity of his research questions means Jansen’s work can brush against computational limits. He has been awarded millions of processor hours through the Department of Energy’s INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. As one of the first groups with access to Argonne National Laboratory’s next-generation IBM Blue Gene/Q system, Jansen and his colleagues will study methods that allow researchers to enlist exascale resources – computers capable of a million trillion calculations per second.

One important part of that work, he says, will be in-situ processing. As parallel computers near 1 million processors, they will generate data faster than researchers can process and interpret it. So instead of working sequentially, simulation code will have to be intrinsically connected with visualization code. Jansen and his colleagues already have a demonstration of these methods, and they’re working hard to prepare it for use.
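Stripped to its essentials, in-situ processing inverts the usual write-now, analyze-later workflow: the simulation hands each new state to an analysis routine while it is still in memory. The sketch below is schematic – advance and extract_view are invented placeholders, not Jansen’s pipeline:

import numpy as np

def advance(state, dt=0.01):
    # Placeholder solver step: decay plus noise stands in for a CFD update.
    return state * (1.0 - dt) + dt * np.random.randn(state.size)

def extract_view(state, step):
    # In-situ analysis: reduce the full state to a few numbers (or an image)
    # while it is in memory, instead of writing every snapshot to disk.
    print(f"step {step:3d}  max={state.max():6.3f}  mean={state.mean():7.4f}")

state = np.random.randn(1_000_000)   # toy stand-in for a flow field
for step in range(100):
    state = advance(state)
    if step % 10 == 0:               # monitor and mine the data on the fly
        extract_view(state, step)

At near-exascale sizes, the same pattern would let researchers steer a running simulation – change the view, compute a new statistic – without ever storing the full data stream.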

“The idea,” Jansen says, “is to interact with the simulation and mine the data on the fly, monitor it, change the view and learn about it as you go.”