ChemBART: A Pre-trained BART Model Assisting Organic Chemistry Analysis

📅 2026-01-06
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the limitation of existing chemical language models, which are typically confined to single tasks and thus ill-suited for the multi-task coordination required in organic synthesis planning. The authors propose a unified pretraining paradigm for chemical reactions based on the BART architecture, employing masked SMILES sequence reconstruction to enable a versatile framework capable of precursor/reagent generation, temperature–yield regression, molecular property classification, and multi-step retrosynthetic pathway planning. By integrating Monte Carlo tree search with reinforcement learning, the model generates concise, high-yielding synthetic routes that were successfully validated in wet-lab experiments, achieving approximately 30% higher yields than literature benchmarks. This study represents the first demonstration of a “pretrain once, apply to many tasks” approach that directly guides experimental synthesis.

📝 Abstract
Recent advances in large language models (LLMs) have demonstrated transformative potential across diverse fields. While LLMs have been applied to simplified molecular input line entry system (SMILES) representations in computer-aided synthesis planning (CASP), existing methodologies typically address single tasks, such as precursor prediction. We introduce ChemBART, a SMILES-based LLM pre-trained on chemical reactions, which enables a unified model for multiple downstream chemical tasks, achieving the paradigm of "one model, one pre-training, multiple tasks." By leveraging outputs from a mask-filling pre-training task on reaction expressions, ChemBART effectively solves a variety of chemical problems, including precursor/reagent generation, temperature-yield regression, molecular property classification, and optimization of the policy and value functions within a reinforcement learning framework integrated with Monte Carlo tree search for multi-step synthesis route design. Unlike single-molecule pre-trained LLMs constrained to specific applications, ChemBART addresses broader chemical challenges and integrates them for comprehensive synthesis planning. Crucially, ChemBART-designed multi-step synthesis routes and reaction conditions directly inspired wet-lab validation, which confirmed shorter pathways with ~30% yield improvement over literature benchmarks. Our work validates the power of reaction-focused pre-training and showcases the broad utility of ChemBART in advancing the complete synthesis planning cycle.
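The mask-filling pre-training objective described in the abstract can be illustrated with a minimal sketch. Note the assumptions: the naive tokenizer, the `<mask>` token, and the 15% masking rate below are illustrative choices for demonstration, not details taken from the paper.

```python
import random
import re

def tokenize_smiles(smiles: str) -> list[str]:
    """Naive SMILES tokenizer (an illustrative assumption, not the paper's):
    bracketed atoms, two-letter halogens, and the '>>' reaction arrow are
    kept as single tokens; everything else is one character."""
    pattern = r"(\[[^\]]+\]|Br|Cl|>>|.)"
    return re.findall(pattern, smiles)

def mask_tokens(tokens: list[str], mask_frac: float = 0.15,
                mask_token: str = "<mask>", seed: int = 0) -> list[str]:
    """Corrupt a token sequence by replacing a random subset with a mask
    token; a BART-style model is trained to reconstruct the original
    sequence from this corrupted input."""
    rng = random.Random(seed)
    n_mask = max(1, int(len(tokens) * mask_frac))
    masked_idx = set(rng.sample(range(len(tokens)), n_mask))
    return [mask_token if i in masked_idx else t
            for i, t in enumerate(tokens)]

# Toy reaction SMILES: acetyl chloride + ethanol -> ethyl acetate
reaction = "CC(=O)Cl.OCC>>CC(=O)OCC"
tokens = tokenize_smiles(reaction)
corrupted = mask_tokens(tokens)
print("input :", "".join(corrupted))   # corrupted sequence fed to the encoder
print("target:", "".join(tokens))      # original sequence the decoder must emit
```

Because the corrupted input and the original reaction string form an (input, target) pair, the same pre-trained encoder-decoder can later be fine-tuned for the downstream tasks the abstract lists, such as precursor generation.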
Problem

Research questions and friction points this paper is trying to address.

large language models
organic chemistry analysis
computer-aided synthesis planning
multi-task learning
chemical reaction prediction
Innovation

Methods, ideas, or system contributions that make the work stand out.

ChemBART
reaction-based pre-training
multi-task chemical modeling
SMILES-based LLM
synthesis planning
🔎 Similar Papers
Kenan Li
Assistant Professor, Saint Louis University
public health · GIS · spatial statistics · system dynamics · geo-AI
Yijian Zhang
State Key Laboratory of Analytical Chemistry for Life Science, Kuang Yaming Honors School, Chemistry and Biomedicine Innovation Centre (ChemBIC), ChemBioMed Interdisciplinary Research Centre at Nanjing University, Engineering Research Centre of Protein and Peptide Medicine of the Ministry of Education, Institute for Brain Sciences, Nanjing University, Nanjing 210023, China.
Jin Wang
Beijing National Laboratory for Molecular Sciences, Key Laboratory of Bioorganic Chemistry and Molecular Engineering of Ministry of Education, College of Chemistry and Molecular Engineering, and Peking-Tsinghua Center for Life Sciences, Peking University, Beijing 100871, China
Haipeng Gan
State Key Laboratory of Analytical Chemistry for Life Science, Kuang Yaming Honors School, Chemistry and Biomedicine Innovation Centre (ChemBIC), ChemBioMed Interdisciplinary Research Centre at Nanjing University, Engineering Research Centre of Protein and Peptide Medicine of the Ministry of Education, Institute for Brain Sciences, Nanjing University, Nanjing 210023, China.
Zeying Sun
Beijing National Laboratory for Molecular Sciences, Key Laboratory of Bioorganic Chemistry and Molecular Engineering of Ministry of Education, College of Chemistry and Molecular Engineering, and Peking-Tsinghua Center for Life Sciences, Peking University, Beijing 100871, China
Xiaoguang Lei
Beijing National Laboratory for Molecular Sciences, Key Laboratory of Bioorganic Chemistry and Molecular Engineering of Ministry of Education, College of Chemistry and Molecular Engineering, and Peking-Tsinghua Center for Life Sciences, Peking University, Beijing 100871, China
Hao Dong
Nanjing University
computational chemistry · multi-scale modeling · machine learning