February 2009

Power from plants

Deploying massive computer resources to convert stems and leaves to sugars for fuel.

Jeremy Smith and his fellow researchers have new fuel for understanding the barriers that keep biomass from becoming an economical source of ethanol.

Smith, director of the Center for Molecular Biophysics (CMB), an Oak Ridge National Laboratory-University of Tennessee (ORNL-UT) joint project, and his colleagues create computational models of lignocellulose. Those models could help us understand what makes this tough biomass component so difficult to break down into sugars for conversion to ethanol.

If researchers can overcome lignocellulose recalcitrance, plant stems and leaves and wood chips could become a major feedstock for the ethanol America needs to cut greenhouse gas emissions and foreign oil imports. Currently, most ethanol is produced from food crops like corn, which net smaller greenhouse gas reductions than biomass conversion and interfere with food supplies.

To run its models, the group has an allocation of 6 million processor hours through the Department of Energy (DOE) 2009 Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. It’s a renewal of a 2008 INCITE grant of 3.5 million hours.

The CMB has projects in DOE’s BioEnergy Science Center (BESC), an interdisciplinary coalition of experts from ORNL, UT and other universities, corporations and the National Renewable Energy Laboratory (NREL). BESC, which is supported by the department’s Office of Biological and Environmental Research, is designed to achieve breakthroughs in biofuels from lignocellulosic biomass.

Both grants have provided time on what is now the world’s most powerful computer for unclassified research, ORNL’s Jaguar. The computer is a Cray XT that was upgraded in 2008 to a peak processing speed of 1.6 petaflops – 1.6 quadrillion calculations a second. Now Smith’s group has snagged another 20 million hours on Jaguar as part of an “early user” program to run high-impact science projects in the supercomputer’s first six months of operation.

With that much time on such a powerful computer, “We can simulate the systems we’re interested in in more exquisite detail, with higher accuracy,” says Smith, who also is the first UT-ORNL Governor’s Chair. “At the same time we can simulate larger systems and for longer times.”

‘Actually running the calculations – that’s the easy part.’

Running a single processor for 1 million hours would take about 114 years. But massively parallel computers like Jaguar break big tasks into smaller parts and parcel them out to thousands of computer processors. The processors work on the pieces simultaneously and their output is assembled into a solution.

Jaguar has more than 181,000 processing cores. When all are operating at once, it can burn through 1 million processor hours in about six hours. Using 20 million hours would be “like having the machine to yourself for a week,” Smith says – although that’s unusual because Jaguar typically runs several jobs at once.
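The processor-hour arithmetic above checks out with a quick back-of-the-envelope script. The core count and allocation sizes are the figures quoted in the article; perfect parallel scaling is an idealizing assumption, since real jobs share the machine and never scale flawlessly:

```python
CORES = 181_000                 # Jaguar's core count, per the article
HOURS_PER_YEAR = 24 * 365.25    # hours in an average year

def serial_years(cpu_hours):
    """Years a single processor would need to deliver cpu_hours of work."""
    return cpu_hours / HOURS_PER_YEAR

def wallclock_hours(cpu_hours, cores=CORES):
    """Ideal wall-clock time if the work parallelizes perfectly."""
    return cpu_hours / cores

print(serial_years(1_000_000))          # ~114 years on one processor
print(wallclock_hours(1_000_000))       # ~5.5 hours on all of Jaguar
print(wallclock_hours(20_000_000) / 24) # ~4.6 days: "the machine to yourself for a week"
```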

With less time on less powerful computers, the biomass researchers were able to model processes involving just a few thousand atoms for just a few nanoseconds – billionths of a second. The new grants of computer time will enable models involving millions of atoms for up to a microsecond, or a millionth of a second.
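A rough sense of why a microsecond of simulated time is so expensive: classical molecular dynamics advances in femtosecond-scale steps, so the step count grows linearly with simulated time, and the cost of each step grows roughly linearly with atom count. A minimal sketch, assuming a typical 2-femtosecond timestep (a common choice in MD practice, not a figure from the article):

```python
FS = 1e-15      # one femtosecond, in seconds
DT = 2 * FS     # assumed integration timestep, typical for classical MD

def md_steps(simulated_seconds, dt=DT):
    """Number of integration steps needed to cover the simulated time."""
    return simulated_seconds / dt

ns_steps = md_steps(1e-9)   # one nanosecond: ~500,000 steps
us_steps = md_steps(1e-6)   # one microsecond: ~500 million steps

# Going from thousands of atoms for nanoseconds to millions of atoms
# for a microsecond multiplies the work by roughly a million:
# ~1,000x more steps times ~1,000x more atoms per step.
work_ratio = (us_steps / ns_steps) * (1_000_000 / 1_000)
```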

“We need to understand the recalcitrance of biomass,” Smith says. “That involves physical processes on different time and length scales, and we don’t yet know what these processes are. Basically, the larger the systems we can simulate in finer detail, the more chance we have for capturing the processes that give rise to biomass resistance.”

As it stands, beating that resistance is costly. To release its sugars, lignocellulose must be “cooked” at high temperatures or treated with chemicals. Some enzymes can help hydrolyze cellulose and convert it into sugars more efficiently, but only if they can get to it. Sugars in cellulose are packed in compact, partially crystalline fibrils that resist enzymes’ efforts to break them down.

On top of that, the fibrils are coated with polymers such as lignin – a complex substance that helps make plant cell walls stiff.

Besides presenting a barrier, lignin also may inhibit enzymes by attaching to their cellulose-binding components. Removing lignin from biomass actually increases the cellulose-hydrolysis yield from 20 percent to 98 percent. CMB postdoctoral researcher Loukas Petridis has been performing extensive lignin simulations.

The models Smith’s group is building will examine the molecular basis of lignocellulose recalcitrance and test how enzymes interact with the substance and each other. Two of Smith’s postdoctoral research associates are studying the enzyme systems. One of them, Moumita Saharay, is working with Hong Guo, a UT associate professor, to calculate cellulose hydrolysis reaction mechanisms. These calculations, based on molecular dynamics and quantum mechanics, are revealing how enzymes break down cellulose into sugars.

Another postdoctoral researcher, Jiancong Xu, is collaborating with Mike Crowley of NREL to perform molecular dynamics simulations of cellulosomes – enzymatic protein complexes bacteria produce to break down cellulose.

The researchers also are comparing their models against neutron scattering experiments conducted at two ORNL facilities: the Spallation Neutron Source (SNS), which generates the world’s most intense pulsed neutron beams, and the High Flux Isotope Reactor (HFIR), a nuclear reactor-based neutron source. DOE’s Office of Basic Energy Sciences supports both.

How neutrons scatter when they strike protein molecules provides clues about how atoms are arranged in those molecules. Instruments that detect neutron scattering “are designed for looking at different aspects of materials,” Smith says. Working with members of the ORNL Center for Structural Molecular Biology, “We’re applying several of them to look at biomass, pretreated biomass and enzymes that interact with biomass.”

The computer models help interpret those experimental results. The experimental results, in turn, are used to tweak and improve the models.

The researchers have had to overcome some obstacles, including making their simulation codes run well on massively parallel computers. That’s taken a big time investment, Smith says, with much of the work carried out by Benjamin Lindner and Roland Schulz, graduate students in genome sciences and technology, a joint UT-ORNL program.

The group’s work has yielded interesting results, Smith says. For instance, Xu and Crowley’s project already has provided data on how the various modules of cellulosomes interact with each other. The information should help guide protein engineering to design more efficient systems.

More results are in the offing, awaiting publication as the researchers analyze data from their first INCITE grant of 3.5 million hours and prepare to use their latest allocations.

“Actually running the calculations – that’s the easy part,” Smith says. “Then you have this mass of data that comes off the computers that you have to analyze.”

How big must a simulation be to provide a solid, accurate portrayal of lignocellulose, enzymes and their interactions? “There’s no hard and fast answer,” Smith says. “There’s going to be interconnected processes which influence the recalcitrance of biomass reaching back to processes that our simulations cannot model,” such as how biomass molecules are synthesized.

Smith adds: “The basic mechanical process is how the cellulose chains are accessed and pulled apart. This is the kind of thing we hope to understand.”

If they succeed, the researchers will bring us one step closer to the day your car routinely runs on grass and sawdust.