When IBM’s Deep Blue famously won its 1997 tournament rematch with world chess champion Garry Kasparov, the public’s imagination was set aflame. The computer relied on artificial intelligence algorithms to learn from experience, adjust to new information and perform human-like tasks. But when AI’s impact seemed to end at the chess board’s edge, the frenzy fizzled. Similar cycles of hype and letdown followed AI’s victories on Jeopardy! in 2011 and over a Go world champion in 2016.
Yet through every “AI winter,” as experts came to call these periods of apparent dormancy, researchers plugged away, improving algorithms and extending uses. At every step, AI has advanced scientific knowledge, perhaps nowhere more than at the Department of Energy (DOE) national laboratories.
“DOE’s investment in the research, development and application of artificial intelligence has really tracked the field for decades, dating back to the 1980s,” says David Womble, AI program director at Oak Ridge National Laboratory (ORNL).
One major driver of DOE’s AI interest: the vast amount of data generated by its Office of Science user facilities, including particle accelerators and X-ray light sources. “Improvements in technology have made sensors of all sorts ubiquitous and, as a result, we’re awash in data,” Womble says. “Something like AI as an advanced data-analytics tool is absolutely critical to convert those bits into information that can be used to advance science and national security missions.”
The other major push for AI’s use comes from the growing capability of DOE’s supercomputers, including the nation’s most powerful machine, Summit, based at the Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility at ORNL. With hardware optimized for such applications, Summit provides an ideal platform for AI-enabling breakthroughs, through tools known as machine-learning algorithms – which let computers learn from data and predict outcomes – and deep-learning algorithms, or human-brain-inspired neural networks that uncover patterns in large datasets.
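The machine-learning idea described above — letting a computer learn from data and predict outcomes — can be illustrated in miniature. This sketch fits a straight line to toy data by gradient descent and then predicts an unseen value; the data and parameters are invented for illustration and have nothing to do with any DOE workload:

```python
# Minimal machine-learning sketch: learn y = w*x + b from examples by
# gradient descent, then predict an outcome for an unseen input.
# Toy data only -- purely illustrative.

data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]  # follows y = 2x + 1

w, b = 0.0, 0.0   # model parameters, to be learned from the data
lr = 0.05         # learning rate: how far each correction step moves

for _ in range(2000):  # repeatedly nudge parameters to reduce prediction error
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # converges to roughly 2.0 and 1.0
print(round(w * 4.0 + b, 1))     # prediction at unseen x = 4.0, roughly 9.0
```

Deep-learning systems replace this two-parameter line with neural networks holding millions of parameters, but the learn-from-error loop is the same in spirit.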
Though DOE researchers have applied AI expertise and powerful computing resources to investigate a range of scientific disciplines, three fields are primed to benefit the most from this approach: basic science, materials science and drug discovery.
Among the earliest AI beneficiaries within DOE: large user facilities, such as the ORNL-based Spallation Neutron Source, which provides the world’s most intense pulsed neutron beams for scientific research and industrial development. Unlike other computer-vision approaches that rely on hand-engineered features, neural networks learn directly from data the features needed for image processing. “We actually get fairly sparse data out of a neutron source compared to our light sources,” Womble says. “But thanks to AI we can still produce very high-resolution images.”
Another project that takes advantage of AI’s capacity to improve image processing is Multinode Evolutionary Neural Networks for Deep Learning (MENNDL), which is supported through DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. Open to researchers from academia, government laboratories and industry, INCITE is the primary way the scientific community gains access to the nation’s fastest supercomputers, including Summit, Theta, and, starting in 2023, Aurora and Frontier.
Scientists at Fermilab, near Chicago, use MENNDL as part of a neutrino-scattering experiment called MINERvA to better understand how these extremely low-mass particles interact with ordinary matter. MENNDL optimizes neural networks that analyze images and pinpoint where neutrinos interact with one of many targets – a job akin to identifying the source of each twinkling ember in a starburst of fireworks. In only 24 hours, MENNDL produced optimized networks that outperformed any previously hand-built program, an achievement that could have otherwise taken scientists months to accomplish.
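The evolutionary idea behind MENNDL can be sketched in a few lines. This toy, single-process version uses an invented fitness function as a stand-in for actually training and scoring networks; MENNDL itself evolves full network topologies in parallel across thousands of GPU-equipped nodes:

```python
import random

random.seed(0)

# Schematic evolutionary search over network hyperparameters
# (number of layers, layer width). The fitness function below is an
# invented surrogate for validation accuracy, peaking at 4 layers of
# width 64 -- real systems would train and evaluate each candidate.

def fitness(genome):
    layers, width = genome
    return -((layers - 4) ** 2) - ((width - 64) / 16.0) ** 2

def mutate(genome):
    layers, width = genome
    return (max(1, layers + random.choice((-1, 0, 1))),
            max(8, width + random.choice((-16, 0, 16))))

# Start from a random population of candidate designs.
population = [(random.randint(1, 8), random.choice((16, 32, 128)))
              for _ in range(10)]

for generation in range(30):
    population.sort(key=fitness, reverse=True)
    survivors = population[:5]  # keep the fittest half
    population = survivors + [mutate(random.choice(survivors)) for _ in range(5)]

best = max(population, key=fitness)
print(best)  # tends toward the surrogate's optimum, (4, 64)
```

Because survivors carry over unchanged, the best design found never gets worse from one generation to the next; mutation supplies the variations that selection then filters.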
But AI isn’t only giving researchers a better way to interrogate their data. It’s also helping automate and streamline data-collection operations at a range of experimental facilities. For example, the Spallation Neutron Source is harnessing AI to maximize the efficiency of the accelerator generating the ions that produce the neutrons.
One of the more common AI applications is in materials science, to answer questions about conductivity, melting points, grain structure and other fundamental properties. Whereas traditional trial-and-error methods to answer these questions are inefficient and time-consuming, machine learning can spark insights by learning rules from data sets and building predictive models. “Machine learning is really ideal for looking at experimental results from hundreds of materials and finding the complex correlations between the materials’ properties,” Womble says. “Our scientists can then select a small number of candidates and processes for creating prototype materials” for lab testing.
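The workflow Womble describes — learn from measured materials, then shortlist untested candidates — can be sketched as fitting a simple property model and ranking predictions. All features, values and names here are invented for illustration; real materials models are far richer than a linear fit:

```python
import numpy as np

# Toy materials-screening sketch: fit a property model to measured
# samples, then rank hypothetical untested candidates by predicted value.

# Each row: (density, bond_strength) -> measured melting point (toy units)
X_train = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0], [4.0, 2.5]])
y_train = np.array([300.0, 250.0, 550.0, 600.0])

# Fit a linear model with an intercept via least squares.
A = np.hstack([X_train, np.ones((len(X_train), 1))])
coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

# Predict the property for untested candidates and shortlist the top two.
candidates = {"cand-A": [2.5, 2.0], "cand-B": [3.5, 3.0], "cand-C": [1.5, 1.0]}
preds = {name: float(np.array(feats + [1.0]) @ coef)
         for name, feats in candidates.items()}
shortlist = sorted(preds, key=preds.get, reverse=True)[:2]
print(shortlist)  # -> ['cand-B', 'cand-A']
```

The shortlist, not the full candidate pool, is what would go on to prototype synthesis and lab testing.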
AI-driven models are expected to lead to new catalysts, alloys, superconductors, and chemical-separation and materials-synthesis methods, with potential impact on energy production, delivery and use.
Another process DOE scientists are tackling with AI is drug discovery, which traditionally consumes five years and about a third of all research and development dollars spent to bring a drug to market, notes Marti Head, director of the Joint Institute for Biological Sciences at ORNL and lead on Accelerating Therapeutics for Opportunities in Medicine (ATOM). “Improving this process is of paramount importance to the industry, and to society,” she says. “Artificial intelligence is really baked into what we’re trying to do.”
ATOM is a public-private partnership of DOE (acting through Lawrence Livermore National Laboratory), the National Cancer Institute (acting through Frederick National Laboratory), the University of California, San Francisco, and GlaxoSmithKline (GSK). The ATOM team is combining GSK’s chemical and in vitro biological data for more than 2 million compounds to generate new dynamic models that better predict how molecules will behave in the body. ATOM will also use GSK’s preclinical and clinical information on 500 molecules that have failed in development, with data available publicly and from future partners. In this effort, LLNL will contribute its best-in-class supercomputers, including its next-generation system Sierra, plus its expertise in modeling and simulation, cognitive computing, machine learning and algorithm development.
“To get to a drug, you need a molecule that will work, be safe and make a difference against the disease, and artificial intelligence is really well placed to optimize for many of these parameters at once,” Head says. “Eventually, we’ll want to include ease of manufacturing and cost of manufacturing as features that you would incorporate in the optimization, but we have to walk before we run.”
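Optimizing many drug properties at once, as Head describes, is often framed as a multi-objective search. One common technique — not necessarily ATOM’s own method — is Pareto filtering: keep only the candidates that no other candidate beats on every property. The molecule names and scores below are invented:

```python
# Minimal multi-objective sketch: keep candidate molecules that are not
# dominated on both potency and safety (higher is better for each).
# All names and scores are invented for illustration.

candidates = {
    "mol-1": {"potency": 0.9, "safety": 0.3},
    "mol-2": {"potency": 0.6, "safety": 0.8},
    "mol-3": {"potency": 0.5, "safety": 0.5},
    "mol-4": {"potency": 0.4, "safety": 0.9},
}

def dominates(a, b):
    """True if candidate a is at least as good as b on every property
    and strictly better on at least one."""
    return (all(a[k] >= b[k] for k in a)
            and any(a[k] > b[k] for k in a))

pareto = [name for name, props in candidates.items()
          if not any(dominates(other, props)
                     for o, other in candidates.items() if o != name)]
print(pareto)  # mol-3 is dominated by mol-2; the rest survive
```

Each surviving molecule represents a different trade-off between the properties; further criteria — such as the manufacturability and cost Head mentions — would add more dimensions to the same filter.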