Implicit Dynamical Flow Fusion (IDFF) for Generative Modeling

๐Ÿ“… 2024-09-22
๐Ÿ›๏ธ arXiv.org
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Conditional Flow Matching (CFM) suffers from low sampling efficiency, requiring hundreds of function evaluations (NFEs). To address this, we propose an Implicit Dynamical Flow Fusion frameworkโ€”the first to integrate implicit ordinary differential equation (ODE) solvers with flow matching. Our method introduces a momentum-augmented learnable vector field, enabling stable large-step integration while preserving distribution fidelity. This breaks CFMโ€™s reliance on high NFEs, achieving up to 90% NFE reduction (10ร— speedup) without sacrificing quality: it matches CFM and diffusion models in FID and log-likelihood on CIFAR-10 and CelebA. Moreover, it significantly outperforms baselines in molecular dynamics and sea surface temperature time-series modeling. Our core contributions are threefold: (i) the first unification of implicit dynamics and flow matching; (ii) momentum-driven continuous vector field modeling; and (iii) a unified framework for efficient, high-fidelity generative sampling.

Technology Category

Application Category

๐Ÿ“ Abstract
Conditional Flow Matching (CFM) models can generate high-quality samples from a non-informative prior, but they can be slow, often needing hundreds of network evaluations (NFE). To address this, we propose Implicit Dynamical Flow Fusion (IDFF); IDFF learns a new vector field with an additional momentum term that enables taking longer steps during sample generation while maintaining the fidelity of the generated distribution. Consequently, IDFFs reduce the NFEs by a factor of ten (relative to CFMs) without sacrificing sample quality, enabling rapid sampling and efficient handling of image and time-series data generation tasks. We evaluate IDFF on standard benchmarks such as CIFAR-10 and CelebA for image generation, where we achieve likelihood and quality performance comparable to CFMs and diffusion-based models with fewer NFEs. IDFF also shows superior performance on time-series datasets modeling, including molecular simulation and sea surface temperature (SST) datasets, highlighting its versatility and effectiveness across different domains.href{https://github.com/MrRezaeiUofT/IDFF}{Github Repository}
Problem

Research questions and friction points this paper is trying to address.

Improves slow sample generation in Conditional Flow Matching models
Reduces network evaluations without sacrificing sample quality
Enhances performance in image and time-series data generation
Innovation

Methods, ideas, or system contributions that make the work stand out.

IDFF introduces momentum term for faster sampling
Reduces network evaluations by tenfold
Maintains sample quality across diverse datasets
๐Ÿ”Ž Similar Papers
No similar papers found.
Mohammad R. Rezaei
Mohammad R. Rezaei
University of Toronto
Generative Modelstime-seriesLLMsAgentsProbabilistic Modeling
Rahul G. Krishnan
Rahul G. Krishnan
University of Toronto
Machine LearningArtificial IntelligenceHealthcareProbabilistic modelsCausal Inference
M
Milos R. Popovic
Department of Biomedical Engineering, University of Toronto; KITE Research Institute, Toronto Rehabilitation Institute
M
M. Lankarany
Department of Biomedical Engineering, University of Toronto; KITE Research Institute, Toronto Rehabilitation Institute