🤖 AI Summary
To address the limitations of single metaheuristic algorithms—such as premature convergence and slow convergence—in high-dimensional complex function optimization, this paper proposes three hybrid PSO-GA paradigms: sequential, parallel, and consecutive. A novel consecutive hybrid PSO-GA algorithm is specifically designed, featuring an explicit information transfer mechanism that incorporates PSO's velocity vectors and personal best positions into GA's mutation and selection operators, thereby enhancing population diversity and search continuity. Experimental evaluation on high-dimensional benchmark functions (e.g., Ackley, Rastrigin) demonstrates that the proposed method significantly outperforms standard PSO and GA in both convergence speed and stability. Notably, its advantage becomes more pronounced in ultra-high-dimensional settings (≥100D), validating the effectiveness and robustness of cross-paradigm synergistic optimization.
📝 Abstract
The goal of this paper is twofold. First, it explores hybrid evolutionary-swarm metaheuristics that combine the features of PSO and GA in a sequential, parallel, and consecutive manner, and compares them with their standard forms: the Genetic Algorithm and Particle Swarm Optimization. The algorithms were tested on a set of benchmark functions, including Ackley, Griewank, Levy, Michalewicz, Rastrigin, Schwefel, and Shifted Rotated Weierstrass, across multiple dimensions. The experimental results demonstrate that the hybrid approaches achieve superior convergence and consistency, especially in higher-dimensional search spaces. The second goal of this paper is to introduce a novel consecutive hybrid PSO-GA evolutionary algorithm that ensures continuity between the PSO and GA steps through explicit information transfer mechanisms, specifically by modifying GA's variation operators to inherit velocity and personal best information.
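To make the consecutive information-transfer idea concrete, the following is a minimal sketch of one possible implementation: each iteration runs a standard PSO velocity/position update, then a GA step whose selection draws on the personal bests and whose mutation perturbs offspring along the inherited PSO velocities. All coefficients, operators, and the test function are illustrative assumptions, not the paper's exact design.

```python
import numpy as np


def sphere(x):
    """Simple convex test function (not one of the paper's benchmarks)."""
    return float(np.sum(x ** 2))


def consecutive_pso_ga(f, dim=5, pop=20, iters=100, seed=0):
    """Illustrative consecutive PSO-GA hybrid with explicit information
    transfer: the GA step reuses PSO velocities and personal bests."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (pop, dim))
    v = np.zeros((pop, dim))
    pbest = x.copy()
    pbest_val = np.array([f(xi) for xi in x])
    g = pbest[pbest_val.argmin()].copy()

    for _ in range(iters):
        # --- PSO step: inertia + cognitive + social velocity update ---
        r1, r2 = rng.random((pop, dim)), rng.random((pop, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = x + v

        # --- GA step with information transfer from PSO ---
        # Selection: each parent is the better of the particle's current
        # position and its personal best (personal-best inheritance).
        vals = np.array([f(xi) for xi in x])
        parents = np.where((pbest_val < vals)[:, None], pbest, x)

        # Crossover: uniform mix with a randomly paired parent.
        partner = parents[rng.permutation(pop)]
        mask = rng.random((pop, dim)) < 0.5
        children = np.where(mask, parents, partner)

        # Mutation: perturb along the inherited velocity so the GA
        # continues the swarm's search direction rather than resetting it.
        children = children + 0.1 * rng.random((pop, dim)) * v

        # Bookkeeping: update personal and global bests.
        x = children
        vals = np.array([f(xi) for xi in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[pbest_val.argmin()].copy()

    return g, float(pbest_val.min())


best_x, best_val = consecutive_pso_ga(sphere)
```

Because the personal bests are carried across both steps, the GA phase never discards progress made by the swarm, which is the continuity property the proposed algorithm is designed to guarantee.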