AI Summary
Traditional quality-diversity (QD) algorithms rely on explicit discretization of behavior spaces, leading to the curse of dimensionality and severe memory bottlenecks in high-dimensional and large-scale solution spaces. To address this, we propose "Soft QD", a novel framework that eliminates explicit discretization and introduces, for the first time, a formal definition of QD optimization with guaranteed monotonicity and asymptotic equivalence to the standard QD Score. Building upon this foundation, we derive SQUAD, a fully differentiable algorithm that enforces approximate behavioral diversity as a continuous regularizer, enabling end-to-end gradient-based optimization. Empirical evaluation demonstrates that SQUAD matches state-of-the-art performance on canonical QD benchmarks while achieving substantial improvements in scalability and computational efficiency on high-dimensional tasks.
Abstract
Quality-Diversity (QD) algorithms constitute a branch of optimization concerned with discovering a diverse and high-quality set of solutions to an optimization problem. Current QD methods commonly maintain diversity by dividing the behavior space into discrete regions, ensuring that solutions are distributed across different parts of the space; the QD problem is then solved by searching for the best solution in each region. This approach poses challenges in large solution spaces, where storing many solutions is impractical, and in high-dimensional behavior spaces, where discretization becomes ineffective due to the curse of dimensionality. We present an alternative framing of the QD problem, called "Soft QD", that sidesteps the need for discretization. We validate this formulation by demonstrating its desirable properties, such as monotonicity, and by relating its limiting behavior to the widely used QD Score metric. Furthermore, we leverage it to derive a novel differentiable QD algorithm, "Soft QD Using Approximated Diversity (SQUAD)", and demonstrate empirically that it is competitive with current state-of-the-art methods on standard benchmarks while offering better scalability to higher-dimensional problems.
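To make the discretization-based baseline concrete, below is a minimal sketch of the grid-archive scheme the abstract describes (one elite per behavior-space cell, MAP-Elites style) together with the QD Score metric, i.e. the sum of elite fitnesses over occupied cells. This is an illustration of the standard approach the paper argues against, not the Soft QD or SQUAD method; the function names and the grid resolution are our own choices.

```python
import numpy as np

def grid_cell(behavior, bounds, cells_per_dim):
    """Map a behavior descriptor to a discrete grid cell.

    `bounds` is a (dim, 2) array of (low, high) per dimension.
    Note the cell count grows as cells_per_dim ** dim, the memory
    blow-up in high-dimensional behavior spaces noted in the abstract.
    """
    behavior = np.asarray(behavior, dtype=float)
    low, high = bounds[:, 0], bounds[:, 1]
    frac = np.clip((behavior - low) / (high - low), 0.0, 1.0 - 1e-9)
    return tuple((frac * cells_per_dim).astype(int))

def insert(archive, behavior, fitness, bounds, cells_per_dim):
    """Keep only the best (elite) fitness per cell."""
    cell = grid_cell(behavior, bounds, cells_per_dim)
    if cell not in archive or fitness > archive[cell]:
        archive[cell] = fitness
    return archive

def qd_score(archive):
    """QD Score: sum of elite fitnesses over all occupied cells."""
    return sum(archive.values())

# Illustrative usage: 2-D behavior space in [0, 1]^2, 10 cells per axis.
bounds = np.array([[0.0, 1.0], [0.0, 1.0]])
archive = {}
insert(archive, [0.15, 0.90], fitness=2.0, bounds=bounds, cells_per_dim=10)
insert(archive, [0.17, 0.95], fitness=3.0, bounds=bounds, cells_per_dim=10)  # same cell, better elite
insert(archive, [0.80, 0.20], fitness=1.5, bounds=bounds, cells_per_dim=10)  # new cell
print(qd_score(archive))  # 3.0 + 1.5 = 4.5
```

Soft QD replaces the hard cell-membership test inside `insert` with a continuous diversity measure, which is what allows SQUAD to optimize the whole objective by gradient descent.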