Building a chemical ‘GPT’ to help design a key battery component
Taking inspiration from word-predicting large language models, a U-M team is kickstarting an atom-predicting model with 200,000 node hours on Argonne's Polaris
Now that ChatGPT has revealed connections in meaning that can emerge from the simple premise of predicting the next word, a team of researchers led by the University of Michigan aims to do the same for atoms strung together to build molecules.
The research is supported with a one-year grant from the Department of Energy providing 200,000 node hours on Polaris, a 34-petaflop supercomputer at Argonne National Laboratory. The team will build a foundation model for molecules, similar to the GPT models that underpin applications like ChatGPT. The new model will focus on small organic molecules relevant to energy storage and conversion, composed mainly of carbon, hydrogen, oxygen and nitrogen.
“What we’ve learned from language models is that size matters. The interesting behavior happens when you make it very big,” said Venkat Viswanathan, U-M associate professor of aerospace engineering and principal investigator of the award. “If you train on a small amount of data and ask it to write Shakespeare, it’s not good. But a larger data set is better, and when it’s big enough, text that sounds like Shakespeare emerges.”
The team is planning to use their model to predict better battery electrolytes—the medium through which ions shuttle from one electrode to the other during charge and discharge cycles. Many electrode pairs offer the potential for higher energy densities than today's lithium-based batteries can deliver, but the electrolyte needs to work with both electrodes. That's often a tall order, and the team is betting AI can help.
Like GPT, which trains on a steady diet of unannotated text, the chemical model will be fed text-based representations of atomic structures without additional information such as chemical properties.
“There are several billions of molecules that are possible to make, and we have the text-based representation for them,” Viswanathan said. “Our slice of that will be synthesizable small molecules similar to those used in pharmaceuticals and electrolytes.”
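The article doesn't name the representation, but SMILES strings are the standard text encoding for small organic molecules, so the minimal Python sketch below assumes them. It shows how such strings can be turned into next-token prediction examples—the same objective GPT uses for words. The three example molecules and the character-level tokenization are illustrative choices, not details from the team.

```python
# Sketch: SMILES strings (an assumed representation) as language-model
# training data, where each prefix predicts its next character.

smiles = ["CCO", "CC(=O)O", "c1ccccc1"]  # ethanol, acetic acid, benzene

# Build a character-level vocabulary from the corpus.
vocab = sorted({ch for s in smiles for ch in s})
stoi = {ch: i for i, ch in enumerate(vocab)}

def to_training_pairs(s):
    """Turn one molecule into (context, next token) examples."""
    ids = [stoi[ch] for ch in s]
    return [(ids[:i], ids[i]) for i in range(1, len(ids))]

for context, target in to_training_pairs("CCO"):
    print(context, "->", target)
```

In practice a model at this scale would use a learned tokenizer and billions of molecules rather than three, but the structure of the task is the same.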
When the model is able to predict missing atoms in small organic compounds, the team will move on to fine-tuning—feeding it the properties of some compounds and asking it to predict the properties of others. Through iterative feedback, they intend to build an AI that can master small organic molecule chemistry.
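As a rough sketch of what that fine-tuning step can look like, the PyTorch snippet below attaches a small regression head to a stand-in backbone and trains it on known property values. The backbone, dimensions, and scalar property target are all assumptions for illustration; the article doesn't specify the team's architecture or which properties they will use.

```python
# Hedged sketch of supervised fine-tuning: a pretrained molecular language
# model (stood in for here by a toy module) gains a small head that maps
# its representations to a hypothetical scalar property.

import torch
import torch.nn as nn

class PropertyHead(nn.Module):
    def __init__(self, backbone: nn.Module, hidden_dim: int):
        super().__init__()
        self.backbone = backbone               # pretrained next-atom predictor
        self.head = nn.Linear(hidden_dim, 1)   # predicts one scalar property

    def forward(self, token_ids):
        h = self.backbone(token_ids)           # (batch, seq, hidden)
        pooled = h.mean(dim=1)                 # average over sequence positions
        return self.head(pooled).squeeze(-1)   # one value per molecule

# Toy backbone standing in for the pretrained model.
backbone = nn.Sequential(nn.Embedding(64, 32), nn.Linear(32, 32))
model = PropertyHead(backbone, hidden_dim=32)

tokens = torch.randint(0, 64, (8, 20))   # 8 molecules, 20 tokens each
labels = torch.rand(8)                   # known property values
loss = nn.functional.mse_loss(model(tokens), labels)
loss.backward()                          # standard supervised fine-tuning
```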
“Deep Forest Sciences has built considerable expertise in applying molecular foundation models to drug discovery, and we are excited to apply that knowledge to batteries,” said Bharath Ramsundar, founder and CEO of Deep Forest Sciences, and a key collaborator in building the model.
Once the model is up and running, the team will ask it to predict electrolytes suited to a particular pair of electrodes. The team will then test each candidate in the lab with Clio, a robotic setup developed by Viswanathan and collaborator Jay Whitacre, professor of materials science, engineering and public policy at Carnegie Mellon University.
They are also looking for new design rules that may emerge from the model—rules that humans haven’t been able to consider.
“When we learn chemistry, we learn each rule, and then we learn about a dozen exceptions,” Viswanathan said. “Can this now help us learn better rules or be able to design with more sophisticated combined rules?”
As an example, he said it is easy for humans to add one atom to a structure and see whether it works. Adding pairs of atoms at different locations, however, creates too many possibilities for humans to parse easily. But AI might be able to do it.
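A back-of-envelope count shows how quickly the paired case outgrows the single case. The numbers of attachment sites and candidate substituents below are illustrative assumptions, not figures from the team:

```python
# Illustrative count of single vs. paired substitutions on a scaffold.
from math import comb

sites = 20          # candidate attachment sites (assumed)
substituents = 10   # candidate atoms/groups per site (assumed)

single = sites * substituents                # one substitution: 200 options
paired = comb(sites, 2) * substituents**2    # two at once: 19,000 options
print(single, paired)
```

With these assumed numbers, a single substitution yields 200 candidates while substituting at two sites at once yields 19,000, and the gap widens rapidly as the scaffold grows.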
The DOE’s INCITE program, which provides the funding for this study, was conceived to seek out computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering.