Foundation Models for Discovery and Exploration in Chemical Space

📅 2025-10-20
🤖 AI Summary
Existing methods struggle to simultaneously achieve scalability in chemical space exploration and high accuracy in predicting atomistic, thermodynamic, and kinetic molecular properties, hindering materials innovation. To address this, we propose the MIST family of molecular foundation models, built on a novel tokenization scheme that jointly encodes nuclear, electronic, and geometric information. We further develop hyperparameter-penalized Bayesian neural scaling laws that reduce model-development cost by an order of magnitude. Leveraging self-supervised learning, MIST matches or exceeds state-of-the-art performance across 400+ tasks spanning physiology, electrochemistry, and quantum chemistry, and mechanistic interpretability analysis reveals spontaneously emergent, scientifically interpretable regularities within the models. MIST demonstrates practical utility in real-world applications including electrolyte screening, odor modeling, and isotope half-life prediction.

📝 Abstract
Accurate prediction of atomistic, thermodynamic, and kinetic properties from molecular structures underpins materials innovation. Existing computational and experimental approaches lack the scalability required to efficiently navigate chemical space. Scientific foundation models trained on large unlabeled datasets offer a path toward exploring chemical space across diverse application domains. Here we develop MIST, a family of molecular foundation models with up to an order of magnitude more parameters and data than prior works. Trained using a novel tokenization scheme that comprehensively captures nuclear, electronic, and geometric information, MIST learns from a diverse range of molecules. MIST models have been fine-tuned to predict more than 400 structure–property relationships and match or exceed state-of-the-art performance across benchmarks spanning physiology, electrochemistry, and quantum chemistry. We demonstrate the ability of these models to solve real-world problems across chemical space, including multiobjective electrolyte solvent screening, olfactory perception mapping, isotope half-life prediction, stereochemical reasoning for chiral organometallic compounds, and binary and multi-component mixture property prediction. Probing MIST models using mechanistic interpretability methods reveals identifiable patterns and trends not explicitly present in the training data, suggesting that the models learn generalizable scientific concepts. We formulate hyperparameter-penalized Bayesian neural scaling laws and use them to reduce the computational cost of model development by an order of magnitude. The methods and findings presented here represent a significant step toward accelerating materials discovery, design, and optimization using foundation models and provide valuable guidance for training compute-optimal scientific foundation models.
Problem

Research questions and friction points this paper is trying to address.

Developing scalable foundation models for chemical space exploration
Predicting diverse molecular properties across multiple scientific domains
Accelerating materials discovery through compute-optimal model training
Innovation

Methods, ideas, or system contributions that make the work stand out.

Developed MIST, a family of large-scale molecular foundation models
Introduced a novel tokenization scheme jointly capturing nuclear, electronic, and geometric information
Formulated hyperparameter-penalized Bayesian neural scaling laws to cut model-development cost by an order of magnitude
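The scaling-law idea above can be illustrated with a generic Bayesian fit of a power-law loss curve. This sketch is purely illustrative: the functional form L(N) = a·N⁻ᵇ + c, the synthetic data, and the grid posterior are assumptions, not MIST's actual hyperparameter-penalized formulation.

```python
import numpy as np

# Synthetic illustration (NOT the paper's data): validation loss vs. model
# size following a power law L(N) = a * N**-b + c with Gaussian noise.
rng = np.random.default_rng(0)
N = np.logspace(6, 9, 8)                      # parameter counts, 1e6 .. 1e9
true_a, true_b, true_c = 50.0, 0.30, 1.5
loss = true_a * N**-true_b + true_c + rng.normal(0.0, 0.01, N.size)

# Coarse grid posterior over amplitude a and scaling exponent b, assuming
# known noise scale; the offset c is held fixed for brevity.
a_grid = np.linspace(20.0, 80.0, 121)
b_grid = np.linspace(0.10, 0.50, 121)
A, B = np.meshgrid(a_grid, b_grid, indexing="ij")
pred = A[..., None] * N ** (-B[..., None])    # shape (n_a, n_b, n_data)
log_lik = -0.5 * np.sum((loss - true_c - pred) ** 2, axis=-1) / 0.01**2
post = np.exp(log_lik - log_lik.max())        # unnormalized posterior
post /= post.sum()

b_mean = np.sum(post * B)                     # posterior mean of the exponent
print(f"posterior mean exponent b ~ {b_mean:.2f}")
```

A fit like this lets one extrapolate loss to untrained model sizes and pick a compute-optimal configuration before committing to a full training run, which is the spirit (though not the mechanics) of the paper's approach.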
Authors

Alexius Wadell
Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI, USA
Anoushka Bhutani
Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI, USA
Victor Azumah
Department of Chemical Engineering, University of Michigan, Ann Arbor, MI, USA
Austin R. Ellis-Mohr
Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
Celia Kelly
Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI, USA
Hancheng Zhao
University of Michigan
Anuj K. Nayak
Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
Kareem Hegazy
Postdoc, UC Berkeley
Alexander Brace
Department of Computer Science, University of Chicago, Chicago, IL, USA
Hongyi Lin
Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI, USA
Murali Emani
Argonne National Laboratory, Lemont, IL, USA
Venkatram Vishwanath
Computer Scientist, Argonne National Laboratory
Kevin Gering
Idaho National Laboratory, Idaho Falls, ID, USA
Melisa Alkan
NVIDIA Corporation, Santa Clara, CA, USA
Tom Gibbs
NVIDIA Corporation, Santa Clara, CA, USA
Jack Wells
NVIDIA Corporation, Santa Clara, CA, USA
Lav R. Varshney
Stony Brook University
Bharath Ramsundar
Deep Forest Sciences, DeepChem, previously Computable, previously Stanford
Karthik Duraisamy
University of Michigan
Michael W. Mahoney
International Computer Science Institute, Berkeley, CA, USA
Arvind Ramanathan
Argonne National Laboratory
Venkatasubramanian Viswanathan
Associate Professor, University of Michigan