🤖 AI Summary
To address the low sampling efficiency of continuous normalizing flows (CNFs) in equilibrium distribution sampling for molecular systems, this work proposes a path-gradient-based fine-tuning strategy. Building upon a Flow Matching pre-trained CNF, we incorporate target-energy-guided path gradient optimization—requiring no architectural modification or additional sampling. This is the first approach to integrate path gradients with Flow Matching, enabling efficient gradient estimation via importance weighting while preserving both the geometric structure of flow trajectories and the stability of learned representations. Experiments across multiple molecular systems demonstrate up to a threefold improvement in sampling efficiency; computational overhead and trajectory length variations remain negligible. The results validate both the effectiveness and structural fidelity of the proposed method.
📝 Abstract
Boltzmann Generators have emerged as a promising machine learning tool for generating samples from equilibrium distributions of molecular systems using Normalizing Flows and importance weighting. Recently, Flow Matching has helped speed up Continuous Normalizing Flows (CNFs), scale them to more complex molecular systems, and minimize the length of the flow integration trajectories. We investigate the benefits of using path gradients to fine-tune CNFs initially trained by Flow Matching, in the setting where a target energy is known. Our experiments show that this hybrid approach yields up to a threefold increase in sampling efficiency for molecular systems, all while using the same model, a similar computational budget and without the need for additional sampling. Furthermore, by measuring the length of the flow trajectories during fine-tuning, we show that path gradients largely preserve the learned structure of the flow.