Energy Matching: Unifying Flow Matching and Energy-Based Models for Generative Modeling

📅 2025-04-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
Generative models often face limitations in incorporating prior knowledge or handling partial observations due to the conceptual and technical disconnect between flow matching and energy-based models (EBMs). Method: This paper introduces Energy Matching, a unified framework that jointly optimizes flow dynamics and energy potentials. It evolves distributions along optimal transport paths far from the data manifold, while an entropic energy term near the manifold induces convergence to a Boltzmann equilibrium distribution. Crucially, it relies solely on a single time-independent scalar potential field, eliminating the need for time conditioning, auxiliary networks, or separate generators. Contribution/Results: By unifying Wasserstein gradient flows with explicit energy-driven dynamics, Energy Matching enables zero-shot inverse problem solving, controllable generation diversity, and multimodal exploration. On CIFAR-10, it achieves an FID of 3.97, improving on the previous best EBM result (FID 8.61) by 4.64 points, while retaining the simulation-free training of transport-based approaches away from the data manifold.

📝 Abstract
Generative models often map noise to data by matching flows or scores, but these approaches become cumbersome for incorporating partial observations or additional priors. Inspired by recent advances in Wasserstein gradient flows, we propose Energy Matching, a framework that unifies flow-based approaches with the flexibility of energy-based models (EBMs). Far from the data manifold, samples move along curl-free, optimal transport paths from noise to data. As they approach the data manifold, an entropic energy term guides the system into a Boltzmann equilibrium distribution, explicitly capturing the underlying likelihood structure of the data. We parameterize this dynamic with a single time-independent scalar field, which serves as both a powerful generator and a flexible prior for effective regularization of inverse problems. Our method substantially outperforms existing EBMs on CIFAR-10 generation (FID 3.97 compared to 8.61), while retaining the simulation-free training of transport-based approaches away from the data manifold. Additionally, we exploit the flexibility of our method and introduce an interaction energy for diverse mode exploration. Our approach focuses on learning a static scalar potential energy -- without time conditioning, auxiliary generators, or additional networks -- marking a significant departure from recent EBM methods. We believe this simplified framework significantly advances EBM capabilities and paves the way for their broader adoption in generative modeling across diverse domains.
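The abstract's key idea near the data manifold is that a single time-independent scalar potential defines a Boltzmann distribution, from which samples can be drawn by noisy gradient descent (Langevin dynamics). The toy sketch below illustrates that mechanism only; it is not the paper's implementation, and the hand-written double-well `energy` stands in for a learned neural potential.

```python
import numpy as np

def energy(x):
    """Toy double-well potential with minima near x = -1 and x = +1
    (a stand-in for a learned scalar potential E)."""
    return (x**2 - 1.0)**2

def grad_energy(x):
    """Analytic gradient of the double-well potential."""
    return 4.0 * x * (x**2 - 1.0)

def langevin_sample(n_samples=2000, n_steps=500, step=0.01, seed=0):
    """Sample approximately from p(x) ∝ exp(-E(x)) via unadjusted
    Langevin dynamics, starting from Gaussian noise."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_samples)  # start far from the data "manifold"
    for _ in range(n_steps):
        noise = rng.normal(size=n_samples)
        # gradient step on E plus injected noise drives the ensemble
        # toward the Boltzmann equilibrium exp(-E(x))
        x = x - step * grad_energy(x) + np.sqrt(2.0 * step) * noise
    return x

samples = langevin_sample()  # concentrates near the wells at ±1
```

Because the potential is time-independent, the same scalar field that generates samples can also act as a prior for inverse problems, e.g. by adding a data-fidelity term to `energy` before sampling.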
Problem

Research questions and friction points this paper is trying to address.

Flow- and score-matching models struggle to incorporate partial observations or additional priors
Flow matching and energy-based models are conceptually and technically disconnected
Existing EBMs lag behind in generation quality and require expensive sampling during training
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unifies flow matching with energy-based models in one framework
Parameterizes the dynamics with a single time-independent scalar potential (no time conditioning or auxiliary networks)
Retains simulation-free, transport-based training away from the data manifold