Multitask Learning with Stochastic Interpolants

📅 2025-08-06
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
This work addresses the challenge of developing a general-purpose generative model that achieves zero-shot generalization across diverse tasks without task-specific training. Method: the authors propose a generalized stochastic interpolant framework that elevates the conventional scalar time variable to an operator-valued one (a vector, matrix, or linear operator), yielding differentiable and invertible probability flows that bridge distributions across spaces of different dimensions. A multi-task joint learning architecture is further introduced to support zero-shot cross-task transfer. Contribution/Results: this is the first systematic extension of stochastic interpolants to operator-valued time, unifying the theoretical foundations of diffusion models, flow matching, and related paradigms. Experiments demonstrate effective zero-shot generation across conditional synthesis, image inpainting, posterior sampling, and fine-tuning-free inference, exhibiting strong multiscale representation capability and precise distribution alignment.
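
To make the core idea concrete, here is the scalar-time interpolant and one possible operator-valued generalization, written in standard stochastic-interpolant notation; the schedule symbols A, B and the boundary conditions below are illustrative assumptions consistent with the abstract, not the paper's exact formulation.

```latex
% Scalar-time stochastic interpolant bridging x_0 ~ \rho_0 and x_1 ~ \rho_1:
\[
  x_t = \alpha(t)\, x_0 + \beta(t)\, x_1, \qquad t \in [0,1],
\]
% with boundary conditions \alpha(0) = \beta(1) = 1 and \alpha(1) = \beta(0) = 0,
% so that x_0 is recovered at t = 0 and x_1 at t = 1.
%
% Operator-valued generalization (illustrative): replace the scalar t with a
% matrix or linear operator T, letting different coordinates or scales sit at
% different "times" along the bridge:
\[
  x_T = A(T)\, x_0 + B(T)\, x_1,
\]
% where A and B are operator-valued schedules with the analogous boundary
% conditions A(0) = \mathrm{Id}, B(0) = 0 and A(\mathrm{Id}) = 0,
% B(\mathrm{Id}) = \mathrm{Id}.
```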

📝 Abstract
We propose a framework for learning maps between probability distributions that broadly generalizes the time dynamics of flow and diffusion models. To enable this, we generalize stochastic interpolants by replacing the scalar time variable with vectors, matrices, or linear operators, allowing us to bridge probability distributions across multiple dimensional spaces. This approach enables the construction of versatile generative models capable of fulfilling multiple tasks without task-specific training. Our operator-based interpolants not only provide a unifying theoretical perspective for existing generative models but also extend their capabilities. Through numerical experiments, we demonstrate the zero-shot efficacy of our method on conditional generation and inpainting, fine-tuning and posterior sampling, and multiscale modeling, suggesting its potential as a generic task-agnostic alternative to specialized models.
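
To see concretely how one operator-valued choice of time yields zero-shot inpainting, here is a minimal sketch assuming a diagonal (per-coordinate) time vector and the simple linear schedule alpha(t) = 1 - t, beta(t) = t; the function names and schedule are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def interpolant(x0, x1, t):
    """Linear interpolant with a per-coordinate time vector t.

    x0, x1 : arrays of shape (d,), samples from the two endpoint distributions
    t      : array of shape (d,) with entries in [0, 1]; a scalar t recovers
             the usual flow/diffusion-style bridge (illustrative schedule).
    """
    return (1.0 - t) * x0 + t * x1

rng = np.random.default_rng(0)
d = 8
x0 = rng.standard_normal(d)        # noise sample (base distribution)
x1 = rng.standard_normal(d) + 3.0  # data sample (target distribution)

# Inpainting as a choice of time operator: coordinates we observe are pinned
# at t_i = 1 (already "arrived" at the data); the rest start from t_i = 0.
observed = np.array([1, 1, 0, 0, 1, 0, 0, 1], dtype=float)

xt = interpolant(x0, x1, observed)
assert np.allclose(xt[observed == 1], x1[observed == 1])  # observed coords fixed

# During generation, only the unobserved coordinates' times move from 0 to 1,
# so the same trained model performs inpainting without task-specific training.
for s in np.linspace(0.0, 1.0, 5):
    t = np.where(observed == 1, 1.0, s)
    xt = interpolant(x0, x1, t)
```
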
Problem

Research questions and friction points this paper is trying to address.

Generalize flow and diffusion models for multitask learning
Bridge probability distributions across multiple dimensional spaces
Enable versatile generative models without task-specific training
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalized stochastic interpolants with multi-dimensional operators (see the sketch after this list)
Unified generative models for multiple tasks
Zero-shot efficacy in diverse applications
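
A minimal sketch of how such a model could be trained, assuming a flow-matching-style regression loss with per-coordinate times sampled at random; the model interface, schedule, and sampling choices are illustrative assumptions, not the paper's training procedure.

```python
import numpy as np

def velocity_target(x0, x1):
    # For the linear schedule x_t = (1 - t) * x0 + t * x1 (elementwise in t),
    # the conditional velocity d x_t / d t_i is simply (x1 - x0)_i.
    return x1 - x0

def loss(model, x0, x1, t):
    """Flow-matching-style regression loss with a per-coordinate time vector.

    model(x, t) -> predicted velocity, same shape as x. Sampling t per
    coordinate (rather than a single scalar) exposes the model to the whole
    family of bridges used at inference time (illustrative sketch).
    """
    xt = (1.0 - t) * x0 + t * x1
    return np.mean((model(xt, t) - velocity_target(x0, x1)) ** 2)

rng = np.random.default_rng(1)
d = 8
x0, x1 = rng.standard_normal(d), rng.standard_normal(d) + 3.0
t = rng.uniform(size=d)                       # random per-coordinate times
dummy_model = lambda x, t: np.zeros_like(x)   # stand-in for a trained network
print(loss(dummy_model, x0, x1, t))
```
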
Hugo Negrel
Capital Fund Management, 23 Rue de l’Université, 75007 Paris
Florentin Coeurdoux
Capital Fund Management, 23 Rue de l’Université, 75007 Paris
Michael S. Albergo
Harvard University
Eric Vanden-Eijnden
Courant Institute of Mathematical Sciences, New York University
Applied and computational mathematics