mCLM: A Function-Infused and Synthesis-Friendly Modular Chemical Language Model

📅 2025-05-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
Current large language models (LLMs) face a dual bottleneck in molecular generation: atomic-level representations make it difficult to jointly optimize pharmacological efficacy and synthetic feasibility. To address this, we propose the modular Chemical Language Model (mCLM), which introduces a molecular representation paradigm grounded in functional building blocks as fundamental semantic units, enabling bilingual modeling of natural-language function descriptions and chemical building blocks. Methodologically, mCLM combines modular tokenization, joint text–building-block pretraining, building-block-based generative modeling, and function-driven reinforcement fine-tuning, explicitly embedding synthesis constraints and ADMET property-optimization objectives. Evaluated on 430 FDA-approved drugs, mCLM significantly improves 5 of 6 key pharmaceutical chemistry metrics. Moreover, it successfully performs multi-round iterative optimization on several FDA-rejected compounds, systematically correcting their drug-likeness deficiencies.

📝 Abstract
Despite their ability to understand chemical knowledge and accurately generate sequential representations, large language models (LLMs) remain limited in their capacity to propose novel molecules with drug-like properties. In addition, the molecules that LLMs propose can often be challenging to make in the lab. To more effectively enable the discovery of functional small molecules, LLMs need to learn a molecular language. However, LLMs currently encode molecules only at the level of atoms. In this paper, we argue that, just as texts are tokenized into (sub-)word tokens instead of characters, molecules should be decomposed and reassembled at the level of functional building blocks, i.e., parts of molecules that confer unique functions and serve as effective building blocks for real-world automated laboratory synthesis. This motivates us to propose mCLM, a modular Chemical-Language Model that tokenizes molecules into building blocks and learns a bilingual language model over both natural-language descriptions of functions and molecular building blocks. By reasoning over such functional building blocks, mCLM is guaranteed to generate efficiently synthesizable molecules thanks to recent progress in block-based chemistry, while also improving the functions of molecules in a principled manner. In experiments on 430 FDA-approved drugs, we find mCLM capable of significantly improving 5 out of 6 chemical functions critical to determining drug potential. More importantly, mCLM can reason on multiple functions and improve FDA-rejected drugs ("fallen angels") over multiple iterations to greatly remedy their shortcomings.
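The abstract's analogy between sub-word tokenization of text and building-block tokenization of molecules can be illustrated with a toy sketch. Everything here is invented for illustration: the block vocabulary, the fragment SMILES strings, and the token names are hypothetical and are not the paper's actual tokenizer or building-block set.

```python
# Toy sketch (NOT mCLM's actual tokenizer): contrast atom-level tokenization
# of a SMILES string with block-level tokenization over a hypothetical
# vocabulary of named functional building blocks.
import re

# Hypothetical building-block vocabulary: fragment SMILES -> block token.
BLOCK_VOCAB = {
    "c1ccccc1": "<phenyl>",
    "C(=O)N": "<amide>",
    "C(=O)O": "<carboxyl>",
}

def atom_tokenize(smiles: str) -> list[str]:
    """Atom-level tokens: two-letter elements, bracketed atoms, then single chars."""
    return re.findall(r"Cl|Br|\[[^\]]+\]|.", smiles)

def block_tokenize(smiles: str) -> list[str]:
    """Greedy longest-match over the block vocabulary, falling back to atoms."""
    tokens, i = [], 0
    frags = sorted(BLOCK_VOCAB, key=len, reverse=True)  # try longest fragments first
    while i < len(smiles):
        for frag in frags:
            if smiles.startswith(frag, i):
                tokens.append(BLOCK_VOCAB[frag])
                i += len(frag)
                break
        else:
            tokens.append(smiles[i])  # no block matched: emit a single character
            i += 1
    return tokens

smiles = "c1ccccc1C(=O)O"  # benzoic acid
print(atom_tokenize(smiles))   # 14 atom-level tokens
print(block_tokenize(smiles))  # ['<phenyl>', '<carboxyl>']
```

The point of the sketch is the vocabulary shift: the same molecule collapses from 14 character-level tokens into 2 function-bearing units, which is the kind of semantic compression the paper argues a chemical language model should reason over.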
Problem

Research questions and friction points this paper is trying to address.

LLMs struggle to propose novel drug-like molecules
Molecules proposed by LLMs are hard to synthesize
Current LLMs encode molecules only at atomic level
Innovation

Methods, ideas, or system contributions that make the work stand out.

Tokenizes molecules into functional building blocks
Learns bilingual model for descriptions and blocks
Guarantees synthesizable molecule generation
Carl Edwards
Senior AI Scientist, Genentech
natural language processing, information extraction, chemistry, drug discovery, AI4Science
Chi Han
University of Illinois at Urbana-Champaign
Natural Language Processing, Science of Language Models
Gawon Lee
Department of Chemistry, University of Illinois Urbana-Champaign
Thao Nguyen
Siebel School of Computing and Data Science, University of Illinois Urbana-Champaign
Bowen Jin
University of Illinois, Urbana Champaign
large language models, agents, RL
Chetan Kumar Prasad
Department of Chemistry, University of Illinois Urbana-Champaign
Sara Szymkuć
Allchemy Inc.
Bartosz A. Grzybowski
Ulsan National Institute of Science and Technology, Institute of Organic Chemistry, Polish Academy of Sciences
Ying Diao
Chemical and Biomolecular Engineering, University of Illinois at Urbana-Champaign
Crystallization, Organic Semiconductors, Molecular Assembly, X-ray scattering techniques, Pharmaceuticals
Jiawei Han
Abel Bliss Professor of Computer Science, University of Illinois
data mining, database systems, data warehousing, information networks
Ge Liu
PhD in CSAIL, MIT; Assistant Professor @ CS, UIUC; Postdoc at IPD, UW
Machine learning, computational biology, artificial intelligence
Hao Peng
Siebel School of Computing and Data Science, University of Illinois Urbana-Champaign
Martin D. Burke
Professor of Chemistry, University of Illinois and Associate Dean for Research, CIMED
Lego-like Synthesis, Molecular Prosthetics and Non-toxic fungicidals
Heng Ji
Professor of Computer Science, AICE Director, ASKS Director, UIUC, Amazon Scholar
Natural Language Processing, Large Language Models