🤖 AI Summary
Existing geometric generative models on Riemannian manifolds rely on multi-step numerical integration, incurring high inference overhead and limiting their use in practical domains such as protein backbone generation, computational chemistry, and geospatial modeling. To address this, we propose the Generalised Flow Maps (GFM) framework: the first systematic extension of few-step Euclidean generative models (e.g., consistency models) to arbitrary Riemannian manifolds. GFM unifies implicit probability flows with manifold differential geometry via self-distillation training, combining generalised Lagrangian and Eulerian formulations with progressive flow maps to enable efficient single-step or few-step sampling in non-Euclidean spaces. Experiments demonstrate that GFM achieves state-of-the-art sample quality across diverse geometric datasets and superior or competitive log-likelihoods.
📄 Abstract
Geometric data and purpose-built generative models on them have become ubiquitous in high-impact deep learning application domains, ranging from protein backbone generation and computational chemistry to geospatial data. Current geometric generative models remain computationally expensive at inference -- requiring many steps of complex numerical simulation -- as they are derived from dynamical measure transport frameworks such as diffusion and flow-matching on Riemannian manifolds. In this paper, we propose Generalised Flow Maps (GFM), a new class of few-step generative models that generalises the Flow Map framework in Euclidean spaces to arbitrary Riemannian manifolds. We instantiate GFMs with three self-distillation-based training methods: Generalised Lagrangian Flow Maps, Generalised Eulerian Flow Maps, and Generalised Progressive Flow Maps. We theoretically show that GFMs, under specific design decisions, unify and elevate existing Euclidean few-step generative models, such as consistency models, shortcut models, and meanflows, to the Riemannian setting. We benchmark GFMs against other geometric generative models on a suite of geometric datasets, including geospatial data, RNA torsion angles, and hyperbolic manifolds, and achieve state-of-the-art sample quality for single- and few-step evaluations, and superior or competitive log-likelihoods using the implicit probability flow.
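The core idea, few-step sampling on a manifold, can be sketched in miniature: instead of integrating an ODE over many small steps, a learned flow map predicts a tangent vector that is applied in a single exponential-map update, keeping the sample on the manifold. The sketch below is illustrative only and not the paper's method: `toy_flow_map` is a hypothetical stand-in for the trained network, and the sphere `S^2` stands in for an arbitrary Riemannian manifold.

```python
import numpy as np

def exp_map_sphere(x, v):
    """Exponential map on the unit sphere: move from x along tangent vector v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def toy_flow_map(x, t_start, t_end):
    """Placeholder for a learned flow map F_theta(x, t_start, t_end).

    Here it just returns a fixed tangent direction scaled by the time gap;
    a real model would be a neural network outputting a tangent vector.
    """
    v = np.cross(x, np.array([0.0, 0.0, 1.0]))  # tangent to the sphere at x
    return (t_end - t_start) * v

def one_step_sample(x0):
    """Single-step generation: one exponential-map update replaces an ODE solve."""
    v = toy_flow_map(x0, 0.0, 1.0)
    return exp_map_sphere(x0, v)

x0 = np.array([1.0, 0.0, 0.0])  # a point on the sphere (e.g., a noise sample)
x1 = one_step_sample(x0)
print(np.linalg.norm(x1))  # the sample stays on the manifold (norm stays 1)
```

Few-step sampling would simply chain a handful of such updates over a time grid, which is the source of the inference savings the paper reports over many-step simulation.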