The Geometry of Noise: Why Diffusion Models Don't Need Noise Conditioning

📅 2026-02-20
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses a fundamental tension in diffusion models that operate without noise conditioning: gradients diverge near the data manifold, yet such models remain stable in practice. It introduces the concept of “marginal energy” and formulates unconditional diffusion as a Riemannian gradient flow induced by this energy, thereby uncovering its intrinsic geometric structure. Through a decomposition into marginal and relative energies, the study reveals that neural networks achieve stability by leveraging a local conformal metric to counteract geometric singularities, elucidating the theoretical origin of the inherent stability of velocity parameterization. Integrating tools from differential geometry, probability density integration, and vector-field learning, the paper establishes structural stability for autonomous generative models, explains the failure mechanism of noise-prediction parameterizations, and derives theoretical conditions guaranteeing stable sampling.
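
The summary's central objects can be made concrete in a few lines. The sketch below is a minimal 1-D illustration under assumptions of our own choosing, not the paper's construction: data concentrated at a single point, Gaussian corruption p(u|t) = N(0, t²), a log-uniform prior over unknown noise levels, and |u| as a toy stand-in for the local conformal factor.

```python
# Minimal numerical sketch of the marginal-energy idea on a 1-D toy problem.
# All concrete choices here are illustrative assumptions, not the paper's
# setup: data concentrated at x = 0, Gaussian corruption p(u|t) = N(0, t^2),
# and a log-uniform prior p(t) over unknown noise levels.
import numpy as np

t_grid = np.geomspace(1e-3, 1.0, 512)                  # grid of noise levels
p_t = 1.0 / (t_grid * np.log(t_grid[-1] / t_grid[0]))  # log-uniform prior density

def marginal_energy(u):
    """E_marg(u) = -log p(u), with p(u) = integral of p(u|t) p(t) dt (trapezoid rule)."""
    cond = np.exp(-u**2 / (2 * t_grid**2)) / (np.sqrt(2 * np.pi) * t_grid)
    f = cond * p_t
    p_u = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t_grid))
    return -np.log(p_u + 1e-300)

# Gradient flow on E_marg. The landscape steepens without bound toward the
# data manifold (u -> 0); rescaling each step by the local scale |u| (a toy
# stand-in for the conformal metric the paper attributes to learned
# time-invariant fields) keeps the update bounded, turning the singular
# potential well into a stable attractor.
u, lr, eps = 0.8, 1e-2, 1e-4
for _ in range(300):
    grad = (marginal_energy(u + eps) - marginal_energy(u - eps)) / (2 * eps)
    u -= lr * abs(u) * grad       # conformal factor |u| tames the ~1/u gradient
print("final u:", u)              # settles near the data point at u = 0
```

Dropping the abs(u) factor makes the Euclidean step scale like 1/|u| near the manifold, and the same loop overshoots wildly; that blow-up is the geometric singularity the conformal metric is said to counteract.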

πŸ“ Abstract
Autonomous (noise-agnostic) generative models, such as Equilibrium Matching and blind diffusion, challenge the standard paradigm by learning a single, time-invariant vector field that operates without explicit noise-level conditioning. While recent work suggests that high-dimensional concentration allows these models to implicitly estimate noise levels from corrupted observations, a fundamental paradox remains: what is the underlying landscape being optimized when the noise level is treated as a random variable, and how can a bounded, noise-agnostic network remain stable near the data manifold where gradients typically diverge? We resolve this paradox by formalizing Marginal Energy, $E_{\text{marg}}(\mathbf{u}) = -\log p(\mathbf{u})$, where $p(\mathbf{u}) = \int p(\mathbf{u}|t)p(t)dt$ is the marginal density of the noisy data integrated over a prior distribution of unknown noise levels. We prove that generation using autonomous models is not merely blind denoising, but a specific form of Riemannian gradient flow on this Marginal Energy. Through a novel relative energy decomposition, we demonstrate that while the raw Marginal Energy landscape possesses a $1/t^p$ singularity normal to the data manifold, the learned time-invariant field implicitly incorporates a local conformal metric that perfectly counteracts the geometric singularity, converting an infinitely deep potential well into a stable attractor. We also establish the structural stability conditions for sampling with autonomous models. We identify a “Jensen Gap” in noise-prediction parameterizations that acts as a high-gain amplifier for estimation errors, explaining the catastrophic failure observed in deterministic blind models. Conversely, we prove that velocity-based parameterizations are inherently stable because they satisfy a bounded-gain condition that absorbs posterior uncertainty into a smooth geometric drift.
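
The abstract's “Jensen Gap” argument can be illustrated with a toy Monte-Carlo computation. The numbers below (a 10% posterior spread over the noise level) and the map 1/(1 + t), used as a stand-in for a bounded-gain velocity-style conversion, are assumptions for illustration, not the paper's derivation.

```python
# Toy illustration of the "Jensen Gap": a noise-agnostic model can only act
# through its posterior belief p(t|u) over the unknown noise level t.
import numpy as np

rng = np.random.default_rng(0)
rel_spread = 0.10  # assumed 10% posterior uncertainty about t

for t_mean in (0.5, 0.1, 0.02):
    t_post = np.clip(rng.normal(t_mean, rel_spread * t_mean, 200_000), 1e-6, None)

    # Noise-prediction route: converting a noise estimate into a score divides
    # by t, and g(t) = 1/t is sharply convex near the manifold (t -> 0), so the
    # Jensen gap E[1/t] - 1/E[t] grows like 1/t: a high-gain error amplifier.
    gap_score = np.mean(1 / t_post) - 1 / np.mean(t_post)

    # Bounded-gain stand-in for a velocity-style drift, h(t) = 1/(1 + t): the
    # same posterior spread produces only a tiny Jensen gap.
    gap_vel = np.mean(1 / (1 + t_post)) - 1 / (1 + np.mean(t_post))

    print(f"t~{t_mean:4.2f}:  gap for 1/t = {gap_score:8.4f}   "
          f"gap for 1/(1+t) = {gap_vel:8.6f}")
```

On this toy posterior the gap for 1/t grows roughly like 1/t as the manifold is approached, while the bounded map's gap shrinks, mirroring the bounded-gain condition the abstract credits for the stability of velocity parameterizations.
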
Problem

Research questions and friction points this paper is trying to address.

diffusion models
noise conditioning
autonomous generative models
marginal energy
geometric stability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Marginal Energy
Autonomous Diffusion Models
Riemannian Gradient Flow
Geometric Singularity
Velocity Parameterization