Riemannian Consistency Model

📅 2025-10-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
Consistency modeling with few steps on Riemannian manifolds remains challenging, as Euclidean methods fail to preserve intrinsic geometric constraints. Method: This work introduces the first extension of consistency models to non-Euclidean spaces, proposing a framework grounded in covariant derivatives and an exponential-map-based parameterization. It formulates two provably equivalent variants, Riemannian consistency distillation (RCD) and Riemannian consistency training (RCT), and designs a simplified training objective, accompanied by a novel kinematics interpretation. Results: The method supports both continuous- and discrete-time training as well as conditional vector field modeling, and achieves high-fidelity, few-step (≤10 steps) generation on canonical manifolds (the flat torus, the sphere, and SO(3)), significantly outperforming existing Riemannian generative models.

📝 Abstract
Consistency models are a class of generative models that enable few-step generation for diffusion and flow matching models. While consistency models have achieved promising results on Euclidean domains like images, their application to Riemannian manifolds remains challenging due to the curved geometry. In this work, we propose the Riemannian Consistency Model (RCM), which, for the first time, enables few-step consistency modeling while respecting the intrinsic manifold constraint imposed by the Riemannian geometry. Leveraging the covariant derivative and exponential-map-based parameterization, we derive the closed-form solutions for both discrete- and continuous-time training objectives for RCM. We then demonstrate theoretical equivalence between the two variants of RCM: Riemannian consistency distillation (RCD), which relies on a teacher model to approximate the marginal vector field, and Riemannian consistency training (RCT), which utilizes the conditional vector field for training. We further propose a simplified training objective that eliminates the need for the complicated differential calculation. Finally, we provide a unique kinematics perspective for interpreting the RCM objective, offering new theoretical angles. Through extensive experiments, we demonstrate the superior generative quality of RCM in few-step generation on various non-Euclidean manifolds, including flat tori, spheres, and the 3D rotation group SO(3).
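To make the exponential-map-based parameterization concrete, the sketch below shows how a few-step sampler can stay exactly on the unit sphere S²: the model predicts a tangent vector at the current point, and the exponential map turns that vector into a move along a geodesic. This is a minimal illustration of the general idea, not the paper's actual RCM sampler; the model `f_toy`, the function names, and the timestep schedule are all hypothetical.

```python
import numpy as np

def exp_map_sphere(x, v):
    """Exponential map on the unit sphere: move from point x along
    tangent vector v while staying exactly on the manifold."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def project_to_tangent(x, u):
    """Project an ambient vector u onto the tangent space at x."""
    return u - np.dot(u, x) * x

def few_step_sample(f_theta, x_init, timesteps):
    """Toy few-step sampling loop (not the paper's exact sampler):
    each step maps a predicted tangent vector through exp_map_sphere,
    so the iterate never leaves the sphere."""
    x = x_init
    for t in timesteps:
        v = project_to_tangent(x, f_theta(x, t))
        x = exp_map_sphere(x, v)
    return x

# Toy "model": a tangent field pulling points toward the north pole.
target = np.array([0.0, 0.0, 1.0])
f_toy = lambda x, t: 0.5 * (target - np.dot(target, x) * x)

x0 = np.array([1.0, 0.0, 0.0])
x_final = few_step_sample(f_toy, x0, timesteps=[1.0, 0.5, 0.0])
assert abs(np.linalg.norm(x_final) - 1.0) < 1e-9  # still on the sphere
```

A Euclidean update `x + v` would leave the sphere after every step; the exponential map is what enforces the intrinsic constraint that the abstract emphasizes.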
Problem

Research questions and friction points this paper is trying to address.

Extends consistency models to Riemannian manifolds with curved geometry
Enables few-step generative modeling while respecting manifold constraints
Develops training objectives for non-Euclidean domains like spheres and SO(3)
Innovation

Methods, ideas, or system contributions that make the work stand out.

Riemannian Consistency Model for curved manifold generation
Closed-form training objectives using covariant derivatives
Simplified training eliminating complex differential calculations
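As a companion to the point about geometry-aware objectives: a consistency-style loss on a manifold compares predictions by geodesic distance rather than Euclidean distance. The sketch below computes squared geodesic distance on S² via the logarithm map. It is a generic illustration under that assumption, not the paper's closed-form covariant-derivative objective; all names are hypothetical.

```python
import numpy as np

def log_map_sphere(x, y):
    """Logarithm map on the unit sphere: the tangent vector at x that
    points toward y, with length equal to the geodesic distance."""
    c = np.clip(np.dot(x, y), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(x)
    u = y - c * x  # component of y tangent at x
    return theta * u / np.linalg.norm(u)

def geodesic_loss(x_pred, x_target):
    """Squared geodesic distance between two points on the sphere,
    a natural intrinsic analogue of the Euclidean squared error."""
    v = log_map_sphere(x_pred, x_target)
    return float(np.dot(v, v))

# The geodesic distance from the north pole to a point on the
# equator is a quarter great circle, i.e. pi/2.
north = np.array([0.0, 0.0, 1.0])
east = np.array([1.0, 0.0, 0.0])
assert abs(geodesic_loss(north, east) - (np.pi / 2) ** 2) < 1e-9
```

Using the chord length `|x_pred - x_target|` instead would underestimate distances on the curved surface, which is one reason Euclidean consistency losses transfer poorly to manifolds.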