Pretraining Generative Flow Networks with Inexpensive Rewards for Molecular Graph Generation

📅 2025-03-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional GFlowNets rely on predefined molecular fragments as building blocks, constraining exploration of chemical space. To address this, the authors propose Atomic GFlowNets (A-GFNs), which use individual atoms rather than fragments, removing those structural constraints. A-GFN introduces an unsupervised pretraining scheme that jointly models multiple inexpensive, drug-relevant proxy rewards, including drug-likeness, topological polar surface area, and synthetic accessibility, followed by goal-conditioned finetuning for property-directed generation. Pretrained on a subset of the ZINC dataset, A-GFN compares favorably to existing GFlowNets and mainstream generative models across a range of drug design benchmarks. Generated molecules exhibit high structural diversity, strong drug-likeness, and good synthetic feasibility. The authors present A-GFN as, to their knowledge, the first method to enable efficient, broad exploration of drug-like chemical space via atom-level operations.

📝 Abstract
Generative Flow Networks (GFlowNets) have recently emerged as a suitable framework for generating diverse and high-quality molecular structures by learning from rewards treated as unnormalized distributions. Previous works in this framework often restrict exploration by using predefined molecular fragments as building blocks, limiting the chemical space that can be accessed. In this work, we introduce Atomic GFlowNets (A-GFNs), a foundational generative model leveraging individual atoms as building blocks to explore drug-like chemical space more comprehensively. We propose an unsupervised pre-training approach using drug-like molecule datasets, which teaches A-GFNs about inexpensive yet informative molecular descriptors such as drug-likeness, topological polar surface area, and synthetic accessibility scores. These properties serve as proxy rewards, guiding A-GFNs towards regions of chemical space that exhibit desirable pharmacological properties. We further implement a goal-conditioned finetuning process, which adapts A-GFNs to optimize for specific target properties. In this work, we pretrain A-GFN on a subset of the ZINC dataset, and by employing robust evaluation metrics we show the effectiveness of our approach when compared to other relevant baseline methods for a wide range of drug design tasks.
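The pretraining idea above, which uses several inexpensive descriptors as proxy rewards, can be sketched as a single unnormalized reward that multiplies per-property terms. The scorer functions and the temperature parameter `beta` below are illustrative assumptions, not the paper's exact formulation; in practice descriptors such as QED, TPSA, and the SA score would come from a cheminformatics toolkit like RDKit.

```python
# Sketch (assumed formulation): combine cheap molecular descriptors into
# one unnormalized proxy reward for GFlowNet pretraining.

def drug_likeness(x):
    # Hypothetical drug-likeness proxy already scaled to [0, 1] (e.g. QED).
    return x["qed"]

def tpsa_preference(x):
    # Prefer topological polar surface area inside a drug-like window
    # (window bounds are illustrative).
    return 1.0 if 20.0 <= x["tpsa"] <= 140.0 else 0.1

def sa_preference(x):
    # SA score runs roughly 1 (easy) to 10 (hard); map lower scores
    # to higher reward.
    return max(0.0, (10.0 - x["sa"]) / 9.0)

def proxy_reward(x, beta=2.0):
    """Product of per-property terms, sharpened by an exponent beta so
    sampling concentrates on high-reward regions."""
    r = drug_likeness(x) * tpsa_preference(x) * sa_preference(x)
    return r ** beta

# Example with made-up descriptor values for one molecule:
mol = {"qed": 0.8, "tpsa": 75.0, "sa": 3.0}
print(proxy_reward(mol))
```

Because every term is cheap to compute, this reward can be evaluated on millions of candidate molecules during pretraining, which is the point of using proxy rewards instead of expensive docking or assay-based objectives.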
Problem

Research questions and friction points this paper is trying to address.

Fragment-based GFlowNets restrict generation to a predefined region of chemical space.
Expensive reward functions make large-scale pretraining of molecular generators impractical.
Pretrained generative models need a way to adapt to specific target properties.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Atomic GFlowNets use individual atoms as building blocks
Unsupervised pretraining with drug-like molecule datasets
Goal-conditioned finetuning optimizes specific target properties
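The goal-conditioned finetuning listed above can be illustrated by a reward that is conditioned on a target range for a property: full reward inside the range, smooth falloff outside. The function name, the Gaussian falloff, and the `sigma` width are assumptions for illustration, not the paper's exact reward.

```python
import math

def goal_conditioned_reward(prop_value, goal_lo, goal_hi, sigma=0.5):
    """Assumed sketch of a goal-conditioned reward: 1.0 when the
    molecule's property lands inside the target range [goal_lo, goal_hi],
    decaying smoothly with distance from the range otherwise."""
    if goal_lo <= prop_value <= goal_hi:
        return 1.0
    # Gaussian falloff outside the target range.
    d = min(abs(prop_value - goal_lo), abs(prop_value - goal_hi))
    return math.exp(-((d / sigma) ** 2))

# A property value inside the target range earns full reward:
print(goal_conditioned_reward(2.5, goal_lo=2.0, goal_hi=3.0))  # → 1.0
```

Conditioning the policy on `(goal_lo, goal_hi)` rather than retraining per target is what lets one pretrained model be finetuned for many different property objectives.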