Overclocking Electrostatic Generative Models

📅 2025-09-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Electrostatic generative models (e.g., PFGM++) achieve strong image-synthesis performance but suffer from low sampling efficiency because they rely on computationally expensive ODE solvers. To address this, we propose the Inverse Poisson Flow Matching (IPFM) distillation framework, the first to formulate generative-model distillation as an inverse problem: a student network directly learns to reconstruct the teacher's electrostatic field. Unlike prior approaches, IPFM does not rely on the infinite-dimensional limit (D → ∞); in that limit it closely recovers Score Identity Distillation (SiD), generalizing diffusion-based distillation. Theoretical analysis shows faster convergence under finite D, ensuring both optimization stability and high sampling efficiency. Experiments demonstrate that IPFM matches or exceeds the teacher's generation quality using only a few function evaluations, while converging faster than distillation in the diffusion (D → ∞) limit.

📝 Abstract
Electrostatic generative models such as PFGM++ have recently emerged as a powerful framework, achieving state-of-the-art performance in image synthesis. PFGM++ operates in an extended data space with auxiliary dimensionality $D$, recovering the diffusion model framework as $D \to \infty$, while yielding superior empirical results for finite $D$. Like diffusion models, PFGM++ relies on expensive ODE simulations to generate samples, making it computationally costly. To address this, we propose Inverse Poisson Flow Matching (IPFM), a novel distillation framework that accelerates electrostatic generative models across all values of $D$. Our IPFM reformulates distillation as an inverse problem: learning a generator whose induced electrostatic field matches that of the teacher. We derive a tractable training objective for this problem and show that, as $D \to \infty$, our IPFM closely recovers Score Identity Distillation (SiD), a recent method for distilling diffusion models. Empirically, our IPFM produces distilled generators that achieve near-teacher or even superior sample quality using only a few function evaluations. Moreover, we observe that distillation converges faster for finite $D$ than in the $D \to \infty$ (diffusion) limit, which is consistent with prior findings that finite-$D$ PFGM++ models exhibit more favorable optimization and sampling properties.
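The abstract describes IPFM's core idea: train a generator whose induced electrostatic field matches the teacher's. A minimal, hypothetical PyTorch sketch of such a field-matching objective is below; the paper derives a specific tractable objective that differs in detail, and all networks and names here are illustrative stand-ins, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Toy stand-ins (hypothetical): a frozen network representing the
# teacher's electrostatic field, and a trainable estimate of the field
# induced by the student generator, both over the augmented
# (data + auxiliary-D) space.
teacher_field = nn.Linear(10, 10)
student_field = nn.Linear(10, 10)

def field_matching_loss(x_aug):
    """Illustrative IPFM-style objective: penalize the mismatch between
    the teacher's field and the student's induced field at sampled points."""
    with torch.no_grad():
        e_teacher = teacher_field(x_aug)  # teacher field, no gradient
    e_student = student_field(x_aug)      # trainable student estimate
    return (e_teacher - e_student).pow(2).mean()

x = torch.randn(4, 10)          # batch of points in the augmented space
loss = field_matching_loss(x)
loss.backward()                 # gradients flow only to the student
```

In the actual framework this objective distills the teacher into a few-step generator, so sampling no longer requires simulating the full Poisson-flow ODE.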
Problem

Research questions and friction points this paper is trying to address.

Accelerating electrostatic generative models' sample generation
Reducing computational cost of ODE simulations
Distilling teacher models into efficient generators
Innovation

Methods, ideas, or system contributions that make the work stand out.

Inverse Poisson Flow Matching accelerates electrostatic generative models
Learns generator matching teacher's electrostatic field
Achieves near-teacher quality with few evaluations