Soft Quality-Diversity Optimization

📅 2025-11-30
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Traditional quality-diversity (QD) algorithms rely on explicit discretization of the behavior space, which leads to the curse of dimensionality and severe memory bottlenecks in high-dimensional and large-scale solution spaces. To address this, we propose "Soft QD", a novel framework that eliminates explicit discretization and introduces, for the first time, a formal definition of QD optimization with guaranteed monotonicity and asymptotic equivalence to the standard QD Score. Building on this foundation, we derive SQUAD, a fully differentiable algorithm that enforces approximate behavioral diversity through a continuous regularizer, enabling end-to-end gradient-based optimization. Empirical evaluation demonstrates that SQUAD matches state-of-the-art performance on canonical QD benchmarks while achieving substantial improvements in scalability and computational efficiency on high-dimensional tasks.
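One way to picture the "continuous regularizer" idea is an objective that rewards total fitness while penalizing pairwise behavioral similarity, so diversity pressure becomes a smooth term amenable to gradients. The sketch below is a minimal illustration under assumed choices (an RBF similarity kernel, a penalty weight `lam`); it is not the paper's actual SQUAD formulation.

```python
import numpy as np

def soft_qd_objective(fitness, behaviors, sigma=0.5, lam=1.0):
    """Illustrative soft-diversity objective (hypothetical form, not the
    paper's exact one): total fitness minus a differentiable penalty on
    pairwise behavioral similarity under an RBF kernel."""
    # Squared distances between all pairs of behavior descriptors.
    d2 = np.sum((behaviors[:, None, :] - behaviors[None, :, :]) ** 2, axis=-1)
    # RBF kernel: close behaviors -> similarity near 1, far -> near 0.
    similarity = np.exp(-d2 / (2 * sigma**2))
    # Exclude self-similarity on the diagonal.
    np.fill_diagonal(similarity, 0.0)
    # Each unordered pair counted once.
    diversity_penalty = similarity.sum() / 2
    return fitness.sum() - lam * diversity_penalty
```

Because every term is smooth in the behavior descriptors, such an objective can be maximized end to end with gradient ascent, with no archive grid to store or index.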


📝 Abstract
Quality-Diversity (QD) algorithms constitute a branch of optimization concerned with discovering a diverse, high-quality set of solutions to an optimization problem. Current QD methods commonly maintain diversity by dividing the behavior space into discrete regions, ensuring that solutions are distributed across different parts of the space. The QD problem is then solved by searching for the best solution in each region. This approach to QD optimization poses challenges in large solution spaces, where storing many solutions is impractical, and in high-dimensional behavior spaces, where discretization becomes ineffective due to the curse of dimensionality. We present an alternative framing of the QD problem, called Soft QD, that sidesteps the need for discretization. We validate this formulation by demonstrating its desirable properties, such as monotonicity, and by relating its limiting behavior to the widely used QD Score metric. Furthermore, we leverage it to derive a novel differentiable QD algorithm, Soft QD Using Approximated Diversity (SQUAD), and demonstrate empirically that it is competitive with current state-of-the-art methods on standard benchmarks while offering better scalability to higher-dimensional problems.
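The discretized approach the abstract describes, and the QD Score it yields, can be sketched concretely: bin each solution's behavior into a grid cell, keep only the best fitness per cell, and sum the elites. The uniform grid over `[low, high)` and the parameter names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def qd_score_grid(fitness, behaviors, bins=10, low=0.0, high=1.0):
    """QD Score under explicit discretization (MAP-Elites-style archive):
    bin each behavior, keep the best fitness per cell, sum the elites."""
    # Map each behavior vector to a grid-cell index along every dimension.
    idx = ((behaviors - low) / (high - low) * bins).astype(int)
    idx = np.clip(idx, 0, bins - 1)
    archive = {}
    for f, cell in zip(fitness, map(tuple, idx)):
        # One elite per cell: store only the best fitness seen there.
        if f > archive.get(cell, -np.inf):
            archive[cell] = f
    return sum(archive.values())
```

The archive has `bins ** d` cells for a d-dimensional behavior space, which is exactly the exponential memory growth that motivates dropping the grid.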
Problem

Research questions and friction points this paper is trying to address.

Optimizes diverse high-quality solutions without discretization
Addresses scalability in large high-dimensional solution spaces
Introduces a differentiable algorithm for improved performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Soft QD sidesteps discretization in behavior spaces
SQUAD algorithm is differentiable and scalable
Soft QD relates to QD Score metric properties
Saeed Hedayatian
Department of Computer Science, University of Southern California, Los Angeles, CA 90089
Stefanos Nikolaidis
Associate Professor of Computer Science, University of Southern California
robotics, artificial intelligence, machine learning