AI Summary
This work addresses the limited realism and challenge of existing dynamic multi-objective optimization benchmarks, which hinder effective evaluation of algorithmic performance in complex, time-varying environments. To overcome this, the authors propose a systematic framework for constructing more realistic continuous dynamic test problems by integrating a generalized Pareto set variation model, an imbalanced variable contribution mechanism, dynamic rotation matrices, temporal perturbations, and a generalized temporal dependency mechanism. The framework combines generalized mathematical modeling, dynamic variable interactions, non-stationary environment simulation, and history-dependent embedding to significantly enhance benchmark fidelity and complexity. Experimental results demonstrate that the resulting test suite surpasses conventional benchmarks in terms of realism, complexity, and discriminative power, thereby establishing a new standard for research in dynamic multi-objective optimization.
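To make the "dynamic rotation matrices" component concrete, the sketch below shows one plausible way such a mechanism could work: a time-dependent Givens rotation applied to the decision vector before evaluation, so that otherwise separable variables become coupled and the coupling itself changes with time. All names here (`rotation_matrix`, `coupled_variables`, the `speed` parameter) are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def rotation_matrix(n, t, i=0, j=1, speed=0.1):
    """Hypothetical time-varying Givens rotation in the (i, j) plane.

    The angle grows with the time step t, so the interaction between
    variables x_i and x_j changes as the environment evolves.
    """
    theta = speed * t
    R = np.eye(n)
    R[i, i] = np.cos(theta)
    R[j, j] = np.cos(theta)
    R[i, j] = -np.sin(theta)
    R[j, i] = np.sin(theta)
    return R

def coupled_variables(x, t):
    """Rotate the decision vector before it enters a base objective,
    inducing time-varying non-separability between variables."""
    return rotation_matrix(len(x), t) @ x
```

Because the matrix is orthogonal, the rotation reshapes variable interactions without distorting distances in decision space, which is the usual reason benchmark designers prefer rotations for inducing non-separability.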
Abstract
Dynamic multi-objective optimization (DMOO) has recently attracted increasing interest from both academic researchers and engineering practitioners, as numerous real-world applications that evolve over time can be naturally formulated as dynamic multi-objective optimization problems (DMOPs). This growing interest necessitates advanced benchmarks for the rigorous evaluation of optimization algorithms under realistic conditions. This paper introduces a comprehensive and principled framework for constructing highly realistic and challenging DMOO benchmarks. The proposed framework features several novel components: a generalized formulation that allows the Pareto-optimal Set (PS) to vary on hypersurfaces, a mechanism for creating controlled imbalances in variable contributions to generate heterogeneous landscapes, and dynamic rotation matrices that induce time-varying variable interactions and non-separability. Furthermore, we incorporate a temporal perturbation mechanism to simulate irregular environmental changes and propose a generalized time-linkage mechanism that systematically embeds historical solution quality into future problem instances, thereby capturing critical real-world phenomena such as error accumulation and time-deception. Extensive experimental results validate the effectiveness of the proposed framework, demonstrating its superiority over conventional benchmarks in terms of realism, complexity, and its ability to discriminate among state-of-the-art algorithms. This work establishes a new standard for dynamic multi-objective optimization benchmarking, providing a powerful tool for the development and evaluation of next-generation algorithms capable of addressing the complexities of real-world dynamic systems.
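The time-linkage idea described above (embedding historical solution quality into future problems, so that early errors accumulate) can be sketched minimally as follows. This is an illustrative toy, not the paper's mechanism: `base_objective`, the decaying `history_penalty`, and the `decay` parameter are all assumed for the example.

```python
import numpy as np

def base_objective(x, t):
    """Hypothetical single base objective of a dynamic problem: the
    optimum drifts with the time step t."""
    return float(np.sum((x - np.sin(0.1 * t)) ** 2))

class TimeLinkedProblem:
    """Toy time-linkage: the landscape at step t is shifted by the
    quality of the solution accepted at step t-1, so poor early
    decisions accumulate into harder later environments."""

    def __init__(self, decay=0.5):
        self.decay = decay
        self.history_penalty = 0.0

    def evaluate(self, x, t):
        # Current fitness plus the penalty inherited from history.
        return base_objective(x, t) + self.history_penalty

    def advance(self, accepted_x, t):
        # The accepted solution's quality feeds into the next environment,
        # with older history geometrically discounted.
        self.history_penalty = (self.decay * self.history_penalty
                                + base_objective(accepted_x, t))
```

Under this kind of coupling an algorithm can be "time-deceived": a solution that looks good at step t may inflate the penalty carried into step t+1, so greedy tracking of the current front is no longer sufficient.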