🤖 AI Summary
In molecular Boltzmann sampling, continuous normalizing flow (CNF)-based Boltzmann generators suffer from prohibitively expensive likelihood computation, requiring thousands of function evaluations per sample, which hinders practical deployment. To address this, we propose a continuous flow modeling framework that jointly optimizes flow matching and a reversibility regularization term via an explicit hybrid objective, enabling accurate likelihoods and efficient sampling with few integration steps. Our method overcomes the CNF likelihood computation bottleneck and enables reliable importance sampling. Experiments on molecular Boltzmann sampling demonstrate that our approach samples two orders of magnitude faster than current state-of-the-art flow models while preserving the accuracy of importance-weight estimation.
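The hybrid objective described above can be sketched in a toy form: a conditional flow matching term regresses the velocity field onto a straight-line interpolation target, while a reversibility penalty integrates the ODE forward and backward with a few Euler steps and penalizes the reconstruction gap. This is a minimal illustration of the general idea, not the paper's exact FALCON loss; the linear vector field, the Euler scheme, and the weight `lam` are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear vector field v(x, t) = x @ W + t * b, standing in for a neural net.
# W and b are the "parameters" that would be trained; here they are fixed.
dim = 2
W = 0.1 * rng.standard_normal((dim, dim))
b = 0.1 * rng.standard_normal(dim)

def v(x, t):
    """Velocity field at positions x (batch, dim) and scalar time t."""
    return x @ W + t * b

def flow_matching_loss(x0, x1, t):
    """Conditional flow matching: regress v on the straight-line target x1 - x0."""
    xt = (1.0 - t) * x0 + t * x1        # point on the linear interpolation path
    target = x1 - x0                    # constant velocity of that path
    return np.mean(np.sum((v(xt, t) - target) ** 2, axis=-1))

def reversibility_loss(x, n_steps=4):
    """Euler-integrate forward then backward; penalize the reconstruction gap.

    Explicit Euler is not exactly self-inverse, so this term directly measures
    (and, when minimized, suppresses) the non-invertibility of the few-step scheme.
    """
    dt = 1.0 / n_steps
    y = x.copy()
    for i in range(n_steps):            # forward pass, t: 0 -> 1
        y = y + dt * v(y, i * dt)
    for i in reversed(range(n_steps)):  # backward pass, t: 1 -> 0
        y = y - dt * v(y, i * dt)
    return np.mean(np.sum((y - x) ** 2, axis=-1))

# Hybrid objective: flow matching plus a reversibility penalty (weight lam assumed).
x0 = rng.standard_normal((64, dim))        # base (prior) samples
x1 = rng.standard_normal((64, dim)) + 2.0  # stand-in "data" samples
t = rng.uniform()
lam = 0.1
loss = flow_matching_loss(x0, x1, t) + lam * reversibility_loss(x0)
print(f"hybrid loss: {loss:.4f}")
```

In a real training loop both terms would be backpropagated through a neural velocity field; the key design choice is that invertibility is enforced at the few-step integrator actually used for sampling, not only in the continuous-time limit.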
📝 Abstract
Scalable sampling of molecular states in thermodynamic equilibrium is a long-standing challenge in statistical physics. Boltzmann Generators tackle this problem by pairing a generative model, capable of exact likelihood computation, with importance sampling to obtain consistent samples under the target distribution. Current Boltzmann Generators primarily use continuous normalizing flows (CNFs) trained with flow matching for efficient training of powerful models. However, likelihood calculation for these models is extremely costly, requiring thousands of function evaluations per sample, which severely limits their adoption. In this work, we propose Few-step Accurate Likelihoods for Continuous Flows (FALCON), a method that enables few-step sampling with likelihoods accurate enough for importance sampling by introducing a hybrid training objective that encourages invertibility. We show that FALCON outperforms state-of-the-art normalizing flow models for molecular Boltzmann sampling and is two orders of magnitude faster than a CNF model of equivalent performance.
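The importance-sampling step that motivates accurate likelihoods can be sketched as follows: given samples from a proposal with a known log-density (the role the flow's likelihood plays), the self-normalized weights log w = -U(x) - log q(x) reweight them toward the Boltzmann target. The 1-D double-well energy, the Gaussian proposal, and the effective-sample-size diagnostic below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: unnormalized Boltzmann density exp(-U(x)) for a 1-D double well (assumed).
def U(x):
    return (x ** 2 - 1.0) ** 2

# Proposal q: a broad Gaussian standing in for the flow model. Its log-density is
# known exactly, which is what an accurate flow likelihood provides in practice.
mu, sigma = 0.0, 1.5
x = rng.normal(mu, sigma, size=10_000)
log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))

# Self-normalized importance weights: log w = -U(x) - log q(x).
log_w = -U(x) - log_q
log_w -= log_w.max()        # subtract max for numerical stability
w = np.exp(log_w)
w /= w.sum()

# Reweighted estimate of an observable, E[x^2] under the Boltzmann distribution,
# and the effective sample size, a standard diagnostic of weight degeneracy.
est = np.sum(w * x ** 2)
ess = 1.0 / np.sum(w ** 2)
print(f"E[x^2] estimate: {est:.3f}, ESS: {ess:.0f} of {x.size}")
```

A biased likelihood shifts log q and hence every weight, which is why few-step samplers are only usable for this reweighting when their likelihoods remain accurate.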