GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver

📅 2025-10-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Diffusion models achieve high-fidelity generation but suffer from substantial computational overhead due to multi-step ODE sampling. While existing gradient-based solvers reduce function evaluations, they rely on intricate training strategies and often fail to preserve fine-grained details. To address this, we propose the Generalized Adversarial Solver (GAS), which simplifies the parameterization of ODE solvers, incorporates a distillation loss, and introduces discriminator-guided adversarial training—without requiring auxiliary training techniques. GAS effectively suppresses artifacts and enhances detail recovery. Experiments demonstrate that, under comparable computational budgets, GAS achieves superior generation quality over state-of-the-art solver training methods using fewer sampling steps—particularly improving texture sharpness and structural fidelity.

📝 Abstract
While diffusion models achieve state-of-the-art generation quality, they still suffer from computationally expensive sampling. Recent works address this issue with gradient-based optimization methods that distill a few-step ODE diffusion solver from the full sampling process, reducing the number of function evaluations from dozens to just a few. However, these approaches often rely on intricate training techniques and do not explicitly focus on preserving fine-grained details. In this paper, we introduce the Generalized Solver: a simple parameterization of the ODE sampler that does not require additional training tricks and improves quality over existing approaches. We further combine the original distillation loss with adversarial training, which mitigates artifacts and enhances detail fidelity. We call the resulting method the Generalized Adversarial Solver and demonstrate its superior performance compared to existing solver training methods under similar resource constraints. Code is available at https://github.com/3145tttt/GAS.
Problem

Research questions and friction points this paper is trying to address.

Reducing computational cost of diffusion model sampling
Improving detail preservation in distilled ODE solvers
Suppressing sampling artifacts that appear at very low step counts
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalized Solver: a simple parameterization of the ODE sampler that needs no extra training tricks
Combines distillation loss with adversarial training for detail fidelity
Mitigates artifacts and enhances performance under resource constraints
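The training recipe described above (a few-step solver with learnable coefficients, trained with a distillation loss to the full sampler plus a discriminator-guided adversarial term) can be sketched roughly as follows. This is a minimal illustrative PyTorch sketch, not the authors' implementation: the names `GeneralizedSolver`, `Discriminator`, and `solver_loss`, the Euler-style update, and the specific loss forms are all assumptions made for exposition.

```python
import torch
import torch.nn as nn

class GeneralizedSolver(nn.Module):
    """Few-step Euler-like ODE sampler with one learnable coefficient per step.

    Hypothetical sketch: the paper's actual solver parameterization may differ.
    """
    def __init__(self, score_fn, timesteps):
        super().__init__()
        self.score_fn = score_fn  # pretrained diffusion model's vector field
        self.register_buffer("t", torch.tensor(timesteps))
        # simple parameterization: a learnable rescaling of each update
        self.coef = nn.Parameter(torch.ones(len(timesteps) - 1))

    def forward(self, x):
        for i in range(len(self.t) - 1):
            dt = self.t[i + 1] - self.t[i]
            x = x + self.coef[i] * dt * self.score_fn(x, self.t[i])
        return x

class Discriminator(nn.Module):
    """Tiny discriminator for the adversarial term (illustrative only)."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x):
        return self.net(x)

def solver_loss(solver, disc, x_init, teacher_samples, adv_weight=0.1):
    """Distillation MSE to the full sampler's outputs plus a
    non-saturating generator loss from the discriminator."""
    fake = solver(x_init)
    distill = torch.mean((fake - teacher_samples) ** 2)
    adv = nn.functional.softplus(-disc(fake)).mean()
    return distill + adv_weight * adv
```

A toy usage: with a linear score field `score = lambda x, t: -x` and timesteps `[1.0, 0.66, 0.33, 0.0]`, calling `solver_loss(...)` and backpropagating yields gradients on `solver.coef`, which is the quantity being trained; the pretrained `score_fn` itself stays frozen.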
Aleksandr Oganov
HSE University, Russia
Ilya Bykov
HSE University, Russia
Eva Neudachina
HSE University, Russia
Mishan Aliev
HSE University, Russia
Alexander Tolmachev
HSE University, Russia
Alexander Sidorov
HSE University, Russia
Aleksandr Zuev
HSE University, Russia
Andrey Okhotin
HSE University, Russia
Denis Rakitin
HSE University
Bayesian methods · Deep learning · Probabilistic modeling
Aibek Alanov
Higher School of Economics
Bayesian methods · Generative models