A Foundation Model for Chemical Design and Property Prediction

📅 2024-10-28
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Chemical AI models suffer from poor generalization, high task-specific adaptation costs, and limited cross-task transferability. To address these challenges, we introduce ChemFM, a general-purpose large language model for chemistry with 3 billion parameters, pretrained via self-supervised causal language modeling on 178 million molecular SMILES/SELFIES sequences. ChemFM supports both full-parameter fine-tuning and parameter-efficient methods (e.g., LoRA), enabling unified adaptation to three core chemical AI tasks: molecular property prediction, conditional molecular generation, and reaction prediction. It achieves up to a 67.48% performance improvement across 34 property prediction benchmarks, reduces conditional-generation bias by up to 33.80%, and improves top-1 reaction prediction accuracy by up to 3.7%. Notably, ChemFM significantly enhances modeling of critical biological properties, including antibiotic activity and cytotoxicity, establishing a scalable foundation model for novel antibiotic discovery.
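The parameter-efficient adaptation described above can be sketched with a minimal LoRA-style layer: a frozen pretrained weight plus a trainable low-rank update. This is a generic illustration of the LoRA technique under stated assumptions, not ChemFM's actual implementation; all names and dimensions are hypothetical.

```python
import numpy as np

class LoRALinear:
    """Frozen base weight W plus a trainable low-rank update (alpha/r) * B @ A.

    Generic LoRA sketch for illustration; not ChemFM's actual code.
    """
    def __init__(self, d_in, d_out, r=8, alpha=16, seed=0):
        rng = np.random.default_rng(seed)
        # Pretrained weight: frozen during fine-tuning.
        self.W = rng.standard_normal((d_out, d_in)) * 0.02
        # Low-rank adapter: A is randomly initialized, B starts at zero,
        # so the adapted layer initially reproduces the frozen model exactly.
        self.A = rng.standard_normal((r, d_in)) * 0.02
        self.B = np.zeros((d_out, r))
        self.scale = alpha / r

    def __call__(self, x):
        # x: (batch, d_in) -> (batch, d_out)
        return x @ self.W.T + self.scale * (x @ self.A.T) @ self.B.T

layer = LoRALinear(d_in=4, d_out=3)
x = np.ones((2, 4))
out = layer(x)
```

Only A and B, i.e. r * (d_in + d_out) parameters per adapted layer, are updated during fine-tuning, which is why LoRA-style adaptation is far cheaper than full-parameter fine-tuning of a 3-billion-parameter model.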

📝 Abstract
Artificial intelligence (AI) has significantly advanced computational chemistry research in various tasks. However, traditional AI methods often rely on task-specific model designs and training, which constrain both the scalability of model size and generalization across different tasks. Here, we introduce ChemFM, a large foundation model specifically developed for chemicals. ChemFM comprises 3 billion parameters and is pre-trained on 178 million molecules using self-supervised causal language modeling to extract generalizable molecular representations. This model can be adapted to diverse downstream chemical applications using either full-parameter or parameter-efficient fine-tuning methods. ChemFM consistently outperforms state-of-the-art task-specific AI models across all tested tasks. Notably, it achieves up to 67.48% performance improvement across 34 property prediction benchmarks, up to 33.80% reduction in mean average deviation between conditioned and actual properties of generated molecules in conditional molecular generation tasks, and up to 3.7% top-1 accuracy improvement across 4 reaction prediction datasets. Moreover, ChemFM demonstrates its superior performance in predicting antibiotic activity and cytotoxicity, highlighting its potential to advance the discovery of novel antibiotics. We anticipate that ChemFM will significantly advance chemistry research by providing a foundation model capable of effectively generalizing across a broad range of tasks with minimal additional training.
Problem

Research questions and friction points this paper is trying to address.

AI Model Limitations
Chemical Research
Flexibility and Efficiency

Innovation

Methods, ideas, or system contributions that make the work stand out.

ChemFM
Molecular Property Prediction
Drug Discovery Acceleration
Feiyang Cai
School of Computing, Clemson University, Clemson, SC 29634, USA.
Tianyu Zhu
Beihang University
Tzuen-Rong Tzeng
Department of Biological Sciences, Clemson University, Clemson, SC 29634, USA.
Yongping Duan
Horticultural Research Laboratory, USDA, Fort Pierce, FL 34945, USA.
Ling Liu
College of Computing, Georgia Institute of Technology, Atlanta, GA 30332, USA.
Srikanth Pilla
Professor and Director, Center for Composite Materials, University of Delaware
Gang Li
Department of Mechanical Engineering, Clemson University, Clemson, SC 29634, USA.
Feng Luo
School of Computing, Clemson University, Clemson, SC 29634, USA.