🤖 AI Summary
Current large language models (LLMs) face a dual bottleneck in molecular generation: atomic-level representations make it difficult to jointly optimize pharmacological efficacy and synthetic feasibility. To address this, we propose the modular Chemical Language Model (mCLM), which introduces a novel molecular representation paradigm that treats functional building blocks as the fundamental semantic units, enabling bilingual modeling of natural-language function descriptions and chemical building blocks. Methodologically, mCLM integrates modular tokenization, joint text–building-block pretraining, building-block–based generative modeling, and function-driven reinforcement fine-tuning, explicitly embedding synthetic constraints and ADMET property-optimization objectives. Evaluated on 430 FDA-approved drugs, mCLM significantly improves 5 out of 6 key medicinal chemistry metrics. Moreover, it successfully performs multi-round iterative optimization on several FDA-rejected compounds, systematically rectifying their drug-likeness deficiencies.
📝 Abstract
Despite their ability to understand chemical knowledge and accurately generate sequential representations, large language models (LLMs) remain limited in their capacity to propose novel molecules with drug-like properties. In addition, the molecules that LLMs propose can often be challenging to make in the lab. To more effectively enable the discovery of functional small molecules, LLMs need to learn a molecular language. However, current LLMs are limited to encoding molecules atom by atom. In this paper, we argue that, just as text is tokenized into (sub-)word tokens rather than characters, molecules should be decomposed and reassembled at the level of functional building blocks, i.e., parts of molecules that confer distinct functions and serve as effective building blocks for real-world automated laboratory synthesis. This motivates us to propose mCLM, a modular Chemical-Language Model that tokenizes molecules into building blocks and learns a bilingual language model over both natural-language descriptions of functions and molecular building blocks. By reasoning over such functional building blocks, mCLM is guaranteed to generate efficiently synthesizable molecules thanks to recent progress in block-based chemistry, while also improving the functions of molecules in a principled manner. In experiments on 430 FDA-approved drugs, we find mCLM capable of significantly improving 5 out of 6 chemical functions critical to determining drug potential. More importantly, mCLM can reason over multiple functions and iteratively refine FDA-rejected drugs ("fallen angels") to greatly remedy their shortcomings.
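To make the tokenization idea concrete, here is a minimal, purely illustrative sketch of block-level molecular tokenization interleaved with text tokens. The vocabularies, block names, and the `tokenize` function are all hypothetical stand-ins, not the paper's actual implementation; they only show the contrast with atom- or character-level encoding.

```python
# Hypothetical sketch of modular (building-block) tokenization.
# Vocabularies and block names are illustrative, not from the paper.

# Toy shared vocabulary: each functional building block is one token,
# living alongside ordinary natural-language tokens.
BLOCK_VOCAB = {
    "<mol>": 0, "</mol>": 1,          # delimiters around a molecule
    "[BLOCK:aryl-bromide]": 2,
    "[BLOCK:boronic-acid]": 3,
    "[BLOCK:amide-linker]": 4,
}
TEXT_VOCAB = {"improve": 100, "solubility": 101, ":": 102}

def tokenize(text_words, blocks):
    """Interleave text tokens with a block-level molecule encoding."""
    ids = [TEXT_VOCAB[w] for w in text_words]
    ids.append(BLOCK_VOCAB["<mol>"])
    ids += [BLOCK_VOCAB[f"[BLOCK:{b}]"] for b in blocks]
    ids.append(BLOCK_VOCAB["</mol>"])
    return ids

ids = tokenize(["improve", "solubility", ":"],
               ["aryl-bromide", "amide-linker", "boronic-acid"])
print(ids)  # [100, 101, 102, 0, 2, 4, 3, 1]
```

A single building block thus occupies one token position, so the model can swap, insert, or delete whole functional units during generation, rather than editing individual atoms or SMILES characters.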