AI Summary
This work addresses the high computational cost of evaluating Jacobian determinants in continuous normalizing flows for high-dimensional Boltzmann sampling, where existing stochastic estimators often suffer from bias or excessive variance. The authors propose a multi-step unbiased Jacobian estimation framework that discretizes the probability flow ordinary differential equation and performs unbiased incremental estimation at each integration step. By integrating this approach with sequential Monte Carlo (SMC), the method substantially reduces estimator variance while preserving unbiasedness, overcoming the longstanding trade-off between computational efficiency and statistical accuracy in high-dimensional sampling. Empirical results demonstrate significant performance gains over both Hutchinson's estimator and single-step Flow Perturbation baselines on a 1000-dimensional Gaussian mixture model and an all-atom Chignolin protein system.
Abstract
The scalability of continuous normalizing flows (CNFs) for unbiased Boltzmann sampling remains limited in high-dimensional systems due to the cost of Jacobian-determinant evaluation, which requires $D$ backpropagation passes through the flow layers. Existing stochastic Jacobian estimators such as the Hutchinson trace estimator reduce this cost but introduce bias, while the recently proposed Flow Perturbation method is unbiased yet suffers from high variance. We present \textbf{Flow Perturbation++}, a variance-reduced extension of Flow Perturbation that discretizes the probability-flow ODE and performs unbiased stepwise Jacobian estimation at each integration step. This multi-step construction retains the unbiasedness of Flow Perturbation while achieving substantially lower estimator variance. Integrated into a Sequential Monte Carlo framework, Flow Perturbation++ achieves significantly improved equilibrium sampling on a 1000D Gaussian Mixture Model and the all-atom Chignolin protein compared with Hutchinson-based and single-step Flow Perturbation baselines.
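To make the bias/variance trade-off discussed above concrete, the following is a minimal sketch of the standard Hutchinson trace estimator, which approximates $\operatorname{tr}(J) \approx \mathbb{E}[v^\top J v]$ with random Rademacher probe vectors $v$. The dense stand-in matrix `J` and the probe count are illustrative assumptions; in a CNF one would access $J$ only through vector-Jacobian products rather than materializing it.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 50
# Stand-in Jacobian; in a CNF this would only be touched via
# vector-Jacobian products, never formed explicitly.
J = rng.standard_normal((D, D))

def hutchinson_trace(J, n_probes, rng):
    """Unbiased trace estimate: average of v^T J v over Rademacher probes v."""
    total = 0.0
    for _ in range(n_probes):
        v = rng.integers(0, 2, size=J.shape[0]) * 2 - 1  # entries in {-1, +1}
        total += v @ J @ v
    return total / n_probes

exact = np.trace(J)
approx = hutchinson_trace(J, 10_000, rng)
print(f"exact trace:  {exact:.3f}")
print(f"estimate:     {approx:.3f}")
```

Each probe is unbiased for the trace, but the per-probe variance grows with the off-diagonal mass of $J$, which is why many probes (or variance-reduction schemes such as the multi-step construction proposed here) are needed in high dimensions.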