HollowFlow: Efficient Sample Likelihood Evaluation using Hollow Message Passing

📅 2025-10-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
A key bottleneck in flow- and diffusion-based generative models—such as Boltzmann generators—is the cubic scaling O(n³) of likelihood evaluation with system dimension n, which severely limits scalability to large scientific systems. To address this, the authors propose HollowFlow, an efficient likelihood estimation framework built on non-backtracking graph neural networks (NoBGNNs). Its core innovations are a block-diagonal Jacobian constraint and a non-backtracking message-passing scheme, which together decouple the number of backward passes from system size and reduce the theoretical complexity to O(n). HollowFlow is architecture-agnostic: any equivariant GNN or attention-based backbone can be adapted into a NoBGNN, giving the framework strong generality. Experiments on two systems of increasing size show up to 100× acceleration in both sampling and likelihood evaluation, substantially advancing the practical applicability of Boltzmann generators to high-dimensional physical and chemical systems.

📝 Abstract
Flow and diffusion-based models have emerged as powerful tools for scientific applications, particularly for sampling non-normalized probability distributions, as exemplified by Boltzmann Generators (BGs). A critical challenge in deploying these models is their reliance on sample likelihood computations, which scale prohibitively with system size $n$, often rendering them infeasible for large-scale problems. To address this, we introduce $\textit{HollowFlow}$, a flow-based generative model leveraging a novel non-backtracking graph neural network (NoBGNN). By enforcing a block-diagonal Jacobian structure, HollowFlow likelihoods are evaluated with a constant number of backward passes in $n$, yielding speed-ups of up to $\mathcal{O}(n^2)$: a significant step towards scaling BGs to larger systems. Crucially, our framework generalizes: $\textbf{any equivariant GNN or attention-based architecture}$ can be adapted into a NoBGNN. We validate HollowFlow by training BGs on two different systems of increasing size. For both systems, the sampling and likelihood evaluation time decreases dramatically, following our theoretical scaling laws. For the larger system we obtain a $10^2\times$ speed-up, clearly illustrating the potential of HollowFlow-based approaches for high-dimensional scientific problems previously hindered by computational bottlenecks.
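The key counting argument behind the abstract can be illustrated without the paper's code: if the Jacobian is block-diagonal with a fixed block size $k$, its full diagonal (and hence the trace entering the flow's change-of-variables log-likelihood) is recoverable from $k$ Jacobian-vector products, independent of the total dimension $n$. The numpy sketch below is illustrative only (it uses an explicit random block-diagonal matrix in place of an autodiff backward pass, and all names are hypothetical, not from the paper):

```python
import numpy as np

# Illustrative sketch: with a block-diagonal Jacobian of block size k,
# the full diagonal -- and hence the trace needed for a flow's
# log-likelihood -- is recovered with k matrix-vector products,
# regardless of the total dimension n = k * n_blocks.
rng = np.random.default_rng(0)
k, n_blocks = 3, 5
n = k * n_blocks

# Assemble a random block-diagonal "Jacobian" J.
J = np.zeros((n, n))
for b in range(n_blocks):
    J[b * k:(b + 1) * k, b * k:(b + 1) * k] = rng.normal(size=(k, k))

# k probe vectors: the j-th probe places a 1 at offset j of every block.
# Because J is block-diagonal, row i of J @ v only sees its own block,
# so (J @ v)[i] == J[i, i] exactly when i's offset within its block is j.
diag = np.zeros(n)
for j in range(k):
    v = np.zeros(n)
    v[j::k] = 1.0
    Jv = J @ v                # stands in for one backward/JVP pass
    diag[j::k] = Jv[j::k]

print(np.isclose(diag.sum(), np.trace(J)))  # → True
```

With autodiff, each `J @ v` would be one vector-Jacobian or Jacobian-vector product, so the backward-pass count is $k$ (a constant), not $n$ — which is the source of the quoted $\mathcal{O}(n^2)$ speed-up.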
Problem

Research questions and friction points this paper is trying to address.

Addresses computational bottlenecks in flow-based generative models for scientific applications
Reduces sample likelihood evaluation to a constant number of backward passes, independent of system size n
Enables scaling Boltzmann Generators to larger systems via hollow message passing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses non-backtracking graph neural network architecture
Enforces block-diagonal Jacobian for constant backward passes
Achieves speed-ups of up to O(n²) in likelihood evaluation
Johann Flemming Gloy
Department of Computer Science and Engineering, Chalmers University of Technology and University of Gothenburg, SE-41296 Gothenburg, Sweden.
Simon Olsson
Chalmers University of Technology
Machine Learning · AI for Science · Molecular Simulations · Inverse Molecular Design