Quantum computers hold the promise of processing certain kinds of information exponentially faster than traditional machines, by some estimates 100 million times the speed of your laptop on select tasks. They would do this by replacing the either-or 1s and 0s that direct today’s computers with quantum bits, or qubits, which can be a 1 and a 0 at the same time thanks to a quantum physics concept called superposition.

Such machines would find many uses, from optimizing flight paths and long-haul trucking routes to helping physicians triage patients and answering fundamental science questions by supercharging artificial intelligence.

One big barrier to a quantum future: manufacturing quantum chips. Chips today are mass-produced, but no factory can grind out the parts that would go inside quantum computers.

Aiichiro Nakano, a professor of computer science at the University of Southern California’s Viterbi School of Engineering, wants to change that by producing quantum materials at scale. Nakano, with his colleagues Rajiv Kalia, Ken-ichi Nomura and Priya Vashishta, has a Department of Energy INCITE (Innovative and Novel Computational Impact on Theory and Experiment) award to use AI and supercomputers to simulate the manufacturing and performance of quantum materials. The project, Nakano says, also ties into the National Science Foundation’s Future Manufacturing program.

“In the past, the U.S. has offloaded manufacturing like chip fabrication overseas,” Nakano says. “Suddenly, we realize we are empty.” The NSF program is part of a strategy to secure a domestic quantum chip supply by mastering manufacturing processes.

As part of his INCITE allocation, Nakano and his colleagues will work with the DOE Argonne Leadership Computing Facility to seek manufacturing materials suitable for quantum computers.

The chips Nakano and his team are computationally manufacturing are built from two-dimensional materials: atomically thin sheets. An established process for reliably making 2D materials does not exist, Nakano notes. In fact, the first 2D sheet was isolated by accident in 2004, when a pair of researchers used Scotch tape to peel ever-thinner flakes from graphite, the material in pencil lead, and extracted a single layer of carbon: graphene. The discovery earned them the Nobel Prize in Physics in 2010.

Although their discovery opened the door to new, innovative materials, serendipity doesn’t scale, Nakano says.

With an earlier INCITE award, Nakano and his USC team created a neural network algorithm that enables them to study their candidate materials’ molecular dynamics. They adapted the Allegro model, initially developed by computational scientist Boris Kozinsky at Harvard University, to fit their needs.

“Allegro was a breakthrough, achieving quantum-mechanical accuracy and computational speed,” Nakano says. “Before that, people were saying that AI and neural networks could not achieve those accuracies.”

Nakano’s group quickly discovered that Allegro couldn’t scale to the simulations they wanted to perform (about a billion atoms over millions of time steps), so they added a training step called sharpness-aware minimization. The resulting model, which the group named Allegro-Legato, runs stably at that scale while keeping the speed and accuracy they need, letting Nakano’s team run far larger simulations than were previously possible.
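Sharpness-aware minimization steers training toward flat regions of the loss landscape, which tends to make the learned interatomic potential smoother and more robust over long molecular-dynamics runs. The sketch below shows the core two-pass update in PyTorch; it is a minimal illustration of the general technique, not the Allegro-Legato code, and the model, loss function and neighborhood radius rho are assumptions.

import torch

def sam_step(model, loss_fn, batch, optimizer, rho=0.05):
    # One sharpness-aware minimization step (illustrative; rho is an assumed radius).
    inputs, targets = batch

    # Pass 1: gradient at the current weights.
    loss = loss_fn(model(inputs), targets)
    loss.backward()

    params = [p for p in model.parameters() if p.grad is not None]
    grad_norm = torch.sqrt(sum((p.grad ** 2).sum() for p in params)) + 1e-12

    # Climb toward the nearby worst-case weights within a ball of radius rho.
    eps = []
    with torch.no_grad():
        for p in params:
            e = rho * p.grad / grad_norm
            p.add_(e)
            eps.append(e)
    optimizer.zero_grad()

    # Pass 2: the gradient at the perturbed weights captures the local sharpness.
    loss_fn(model(inputs), targets).backward()

    # Return to the original weights, then update with the sharpness-aware gradient.
    with torch.no_grad():
        for p, e in zip(params, eps):
            p.sub_(e)
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

Updates like this favor parameters whose loss stays low even under small perturbations, the kind of robustness that helps keep a long simulation from drifting off course.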

Nakano and his colleagues are now studying the properties of 2D transition metal dichalcogenides, which are natural semiconductors. These compounds are composed of a transition metal atom, such as molybdenum or tungsten, bonded to chalcogen atoms: sulfur, selenium or tellurium. These materials, Nakano says, could be used to build transistors for ultralow-power computer chips or quantum emitters, light-producing quantum devices, for quantum computers. His team is also modeling the assembly of single-layer oxide ferroelectrics, which could be used to build memory chips.
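A transition metal dichalcogenide monolayer like this can be specified with little more than a chemical formula and a lattice constant. The snippet below is a minimal sketch that builds a molybdenum disulfide sheet with the Atomic Simulation Environment’s mx2 builder; the lattice constant, sheet size and vacuum padding are illustrative values, and this is just one convenient way to set up such a structure, not the team’s workflow.

from ase.build import mx2

# Build a 10 x 10 supercell of monolayer MoS2 in the semiconducting 2H phase.
# a = in-plane lattice constant, thickness = S-S separation (angstroms; assumed values).
monolayer = mx2(formula="MoS2", kind="2H", a=3.18, thickness=3.19,
                size=(10, 10, 1), vacuum=10.0)

print(len(monolayer), "atoms")           # 300 atoms: 100 Mo + 200 S
print(monolayer.get_chemical_formula())  # Mo100S200

Each metal atom sits between two chalcogen atoms, which is why the sheet contains twice as many sulfur atoms as molybdenum.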

With the simulations, Nakano says, “we have the exact position of all the atoms or electrons all the time.”

The computing tools allow Nakano and his team not only to grow their materials in a single layer but also to layer other materials on top of the metal dichalcogenides. This step of the simulation, he says, is critical.

“Making the semiconductor is fast, but the semiconductor itself is not a device: you need an insulator,” he says, to prevent electrical leakage. “Nobody knows how to oxidize the 2D transition metal dichalcogenides in a scalable and precise manner.” Using AI and neural networks, he and his colleagues learned how to introduce oxygen at different pressures to grow an oxide layer that insulates their materials. Afterward, they can study the structure of the whole device by stimulating it with light or by applying an electric field.
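In a molecular-dynamics setting, “different pressures” largely comes down to how many oxygen molecules to place in the gas region above the sheet, which the ideal gas law relates to pressure and temperature. Here is a quick sketch of that bookkeeping; the pressures, temperature and box dimensions below are illustrative assumptions, not the team’s simulation parameters.

# How many O2 molecules correspond to a target pressure in a given gas volume?
# Ideal-gas estimate; all numerical values here are assumptions for illustration.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def o2_count_for_pressure(pressure_pa, temperature_k, volume_m3):
    number_density = pressure_pa / (K_B * temperature_k)  # molecules per cubic meter
    return number_density * volume_m3

# Example: a 20 nm x 20 nm x 10 nm gas region above the monolayer at 300 K.
volume = 20e-9 * 20e-9 * 10e-9
for p_atm in (0.1, 1.0, 10.0):
    n = o2_count_for_pressure(p_atm * 101325.0, 300.0, volume)
    print(f"{p_atm:>4} atm -> about {n:.0f} O2 molecules")

Sweeping that count, running the reactive dynamics and measuring the resulting oxide layer is, in broad outline, one way such pressure comparisons can be set up in simulation.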

To validate their results using advanced X-ray and neutron experimental facilities, Nakano collaborates with researchers at Stanford University and Oak Ridge National Laboratory. His team also works with Rafael Jaramillo, a materials scientist at the Massachusetts Institute of Technology, who checks experimental methods against Nakano’s computational findings.

With the current INCITE project, Nakano is excited to generate new materials by manipulating the existing ones into different shapes. Even rotating the sheet creates a different material, he explains. “We are rotating this to see what kind of materials we can predict.”
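If the rotation in question is a relative twist between stacked layers (an assumption here; the article doesn’t spell out the geometry), even a fraction of a degree changes the period of the moiré pattern the two lattices form, and with it the material’s behavior. A small numerical illustration, using the standard approximation L ≈ a / (2 sin(θ/2)) for two identical lattices with lattice constant a:

import math

# Moire superlattice period for two identical 2D lattices twisted by an angle:
# L = a / (2 * sin(theta / 2)). The lattice constant below is illustrative (graphene-like).

def moire_period(lattice_constant_nm, twist_deg):
    theta = math.radians(twist_deg)
    return lattice_constant_nm / (2.0 * math.sin(theta / 2.0))

a = 0.246  # nm; graphene lattice constant, used only as an example
for twist in (5.0, 2.0, 1.1, 0.5):
    print(f"twist {twist:>4} deg -> moire period {moire_period(a, twist):6.1f} nm")

Twists of a degree or two yield superlattices tens of nanometers across, effectively new materials built from the same atoms, which is the kind of design space these simulations can explore.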

They also hope to find a configuration of the material that reduces energy consumption, for sustainability’s sake. Supercomputers use an immense amount of energy, and the fact that a single machine can consume as much electricity as a small city concerns Nakano; he sees his work on energy-efficient quantum materials as part of the solution.

“There’s no doubt that AI is very useful,” he adds. “But it is becoming more and more energy hungry, so it’s not sustainable.”

Bill Cannon
