SPREAD: Sampling-based Pareto front Refinement via Efficient Adaptive Diffusion

📅 2025-09-25
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the low convergence efficiency and insufficient diversity of Pareto-optimal solutions in multi-objective optimization (MOO), this paper proposes SPREAD, a generative framework based on Denoising Diffusion Probabilistic Models (DDPMs). SPREAD learns a conditional diffusion process in the decision space; during reverse sampling, it incorporates adaptive multi-gradient descent to accelerate convergence and introduces a Gaussian radial basis function (RBF)-based repulsion term to enhance solution distribution uniformity. The framework supports both offline optimization and Bayesian surrogate-assisted scenarios, offering scalability and robustness. Empirical evaluation across multiple benchmark problems demonstrates that SPREAD consistently outperforms state-of-the-art methods in convergence speed, Pareto front coverage, and capability on large-scale or expensive black-box problems. To our knowledge, this is the first work to systematically integrate diffusion models into MOO, establishing a novel paradigm for efficient and high-quality Pareto set generation.

Technology Category

Application Category

📝 Abstract
Developing efficient multi-objective optimization methods to compute the Pareto set of optimal compromises between conflicting objectives remains a key challenge, especially for large-scale and expensive problems. To bridge this gap, we introduce SPREAD, a generative framework based on Denoising Diffusion Probabilistic Models (DDPMs). SPREAD first learns a conditional diffusion process over points sampled from the decision space and then, at each reverse diffusion step, refines candidates via a sampling scheme that uses an adaptive multiple gradient descent-inspired update for fast convergence alongside a Gaussian RBF-based repulsion term for diversity. Empirical results on multi-objective optimization benchmarks, including offline and Bayesian surrogate-based settings, show that SPREAD matches or exceeds leading baselines in efficiency, scalability, and Pareto front coverage.
Problem

Research questions and friction points this paper is trying to address.

Efficiently computing Pareto sets for large-scale multi-objective optimization
Refining optimal compromises between conflicting objectives using diffusion models
Improving Pareto front coverage and scalability in expensive optimization problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generative framework using Denoising Diffusion Probabilistic Models
Refines candidates with adaptive multiple gradient descent updates
Ensures diversity via Gaussian RBF-based repulsion term