Hardest Monotone Functions for Evolutionary Algorithms

📅 2023-11-13
🏛️ EvoStar Conferences
📈 Citations: 2
Influential: 0
🤖 AI Summary
This work addresses the class of dynamic monotone functions and a gap in understanding the worst-case behavior of evolutionary algorithms (EAs) in dynamic environments. Method: The authors explicitly construct an adversarial benchmark function, Switching Dynamic BinVal (SDBV), tailored to the (1+1)-EA. SDBV coincides with Adversarial Dynamic BinVal (ADBV) whenever fewer than n/2 zeros remain in the search point, and is deliberately engineered to decouple drift minimization from runtime maximization. Contribution/Results: Through rigorous theoretical analysis combining combinatorial arguments and drift theorems, the paper shows that SDBV is drift-minimizing among dynamic monotone functions for any mutation rate and establishes a tight bound of Θ(n^{3/2}) on the expected optimization time of the (1+1)-EA on SDBV. Simulations demonstrate matching runtimes for static and self-adjusting (1,λ)- and (1+λ)-EAs. SDBV is the first explicit parameterized pessimistic instance within the PO-EA framework, offering a novel paradigm for characterizing the computational complexity of dynamic function classes.
📝 Abstract
The study of hardest and easiest fitness landscapes is an active area of research. Recently, Kaufmann, Larcher, Lengler and Zou conjectured that for the self-adjusting $(1,\lambda)$-EA, Adversarial Dynamic BinVal (ADBV) is the hardest dynamic monotone function to optimize. We introduce the function Switching Dynamic BinVal (SDBV) which coincides with ADBV whenever the number of remaining zeros in the search point is strictly less than $n/2$, where $n$ denotes the dimension of the search space. We show, using a combinatorial argument, that for the $(1+1)$-EA with any mutation rate $p \in [0,1]$, SDBV is drift-minimizing among the class of dynamic monotone functions. Our construction provides the first explicit example of an instance of the partially-ordered evolutionary algorithm (PO-EA) model with parameterized pessimism introduced by Colin, Doerr and Férey, building on work of Jansen. We further show that the $(1+1)$-EA optimizes SDBV in $\Theta(n^{3/2})$ generations. Our simulations demonstrate matching runtimes for both static and self-adjusting $(1,\lambda)$- and $(1+\lambda)$-EA. We further show, using an example of fixed dimension, that drift-minimization does not equal maximal runtime.
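To make the setting concrete, here is a minimal sketch of the (1+1)-EA on a dynamic BinVal-style function: each generation, the BinVal weights $2^0,\dots,2^{n-1}$ are reassigned to positions before evaluating parent and offspring. Note that this sketch uses a *uniformly random* weight assignment per generation as a simplified stand-in; SDBV's weight assignment is adversarial and switches regime at $n/2$ remaining zeros, which is not reproduced here. The function name and structure are illustrative, not from the paper.

```python
import random

def dynamic_binval_ea(n, max_gens=100_000, rng=None):
    """(1+1)-EA on a dynamic BinVal-style function.

    Each generation t, a fresh permutation assigns the BinVal weights
    2^0 .. 2^(n-1) to the n positions, giving the fitness f_t. This
    random reweighting is a simplified stand-in for SDBV's adversarial
    construction. Returns the generation at which the all-ones optimum
    is reached, or None if max_gens is exceeded.
    """
    rng = rng or random.Random()
    x = [rng.randint(0, 1) for _ in range(n)]
    for gen in range(1, max_gens + 1):
        if all(x):                          # all-ones optimum reached
            return gen
        perm = rng.sample(range(n), n)      # fresh weight assignment for f_t
        f = lambda z: sum(z[i] << w for w, i in enumerate(perm))
        # standard bit mutation: flip each bit independently with rate 1/n
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        if f(y) >= f(x):                    # elitist acceptance under f_t
            x = y
    return None

if __name__ == "__main__":
    print(dynamic_binval_ea(16, rng=random.Random(0)))
```

Because parent and offspring are compared under the *same* per-generation fitness $f_t$, the dynamics depend only on which bit positions flip relative to the current weight order, which is what makes an adversarial choice of weights (as in ADBV/SDBV) possible.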
Problem

Research questions and friction points this paper is trying to address.

Analyzing runtime complexity for evolutionary algorithms on dynamic monotone functions
Introducing Switching Dynamic BinVal as a drift-minimizing adversarial function
Proving that Θ(n^{3/2}) generations are required for (1+1)-EA optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduced the Switching Dynamic BinVal (SDBV) function
Used a combinatorial argument to prove SDBV is drift-minimizing among dynamic monotone functions
Proved a tight Θ(n^{3/2}) runtime bound for the (1+1)-EA on SDBV
Marc Kaufmann
Department of Computer Science, ETH Zürich, Switzerland
Maxime Larcher
Department of Computer Science, ETH Zürich, Switzerland
Johannes Lengler
ETH Zürich
Oliver Sieberling
Department of Computer Science, ETH Zürich, Switzerland