Pre-Evolved Model for Complex Multi-objective Optimization Problems

📅 2023-12-11
🏛️ arXiv.org
📈 Citations: 3
Influential: 1
📄 PDF
🤖 AI Summary
Existing multi-objective evolutionary algorithms (MOEAs) exhibit weak generalization and poor cross-problem transferability when tackling complex multi-objective optimization problems (MOPs) with large-scale decision variables, many objectives, and expensive evaluation functions. Method: We propose the pre-evolved model (PEM), the first pre-evolution paradigm for MOEAs, which pre-evolves a Transformer architecture on a large, heterogeneous corpus of existing MOPs to learn universal population-evolution priors. PEM introduces dimension embedding and objective encoding to jointly represent decision and objective spaces, enabling lightweight fine-evolution and online model updates on unseen problems. Contribution/Results: Across diverse challenging benchmarks, PEM achieves state-of-the-art performance in convergence, diversity, and computational efficiency. It substantially improves cross-problem generalization and practical deployability, offering a scalable, adaptive framework for real-world complex MOPs.
📝 Abstract
Multi-objective optimization problems (MOPs) necessitate the simultaneous optimization of multiple objectives. Numerous studies have demonstrated that evolutionary computation is a promising paradigm for solving complex MOPs, which involve optimization problems with large-scale decision variables, many objectives, and expensive evaluation functions. However, existing multi-objective evolutionary algorithms (MOEAs) encounter significant challenges in generating high-quality populations when solving diverse complex MOPs. Specifically, the distinct requirements and constraints of the population result in the inefficiency or even incompetence of MOEAs in addressing various complex MOPs. Therefore, this paper proposes the concept of pre-evolving for MOEAs to generate high-quality populations for diverse complex MOPs. Drawing inspiration from the classical transformer architecture, we devise dimension embedding and objective encoding techniques to configure the pre-evolved model (PEM). The PEM is pre-evolved on a substantial number of existing MOPs. Subsequently, when fine-evolving on new complex MOPs, the PEM transforms the population into the next generation to approximate the Pareto-optimal front. Furthermore, it utilizes evaluations on new solutions to iteratively update the PEM for subsequent generations, thereby efficiently solving various complex MOPs. Experimental results demonstrate that the PEM outperforms state-of-the-art MOEAs on a range of complex MOPs.
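The fine-evolution loop described in the abstract — the model transforms the current population into the next generation, and evaluations of the new solutions are fed back to update the model — can be sketched in miniature. This is a hypothetical illustration, not the authors' code: `ToyPEM`, its scalar `step` "prior", and the update rule are all assumed stand-ins for the paper's Transformer-based model.

```python
import random

# Toy bi-objective problem (a cheap stand-in for an expensive MOP):
# minimize f1(x) = sum(x_i^2) and f2(x) = sum((x_i - 1)^2).
def evaluate(x):
    f1 = sum(v * v for v in x)
    f2 = sum((v - 1.0) ** 2 for v in x)
    return (f1, f2)

def dominates(fa, fb):
    """True if fa Pareto-dominates fb (no worse in all, better in one)."""
    return all(a <= b for a, b in zip(fa, fb)) and any(a < b for a, b in zip(fa, fb))

class ToyPEM:
    """Hypothetical stand-in for the pre-evolved model: maps a parent
    population to an offspring population, then adapts online from the
    evaluations of the new solutions."""
    def __init__(self, step=0.3):
        self.step = step  # the learned "evolutionary prior", reduced to one scalar

    def next_generation(self, population):
        # Move each solution toward a random peer (toy generative prior).
        offspring = []
        for x in population:
            peer = random.choice(population)
            offspring.append([xi + self.step * (pi - xi) for xi, pi in zip(x, peer)])
        return offspring

    def update(self, improved_fraction):
        # Online adaptation: grow the step if most offspring improved, else shrink it.
        self.step *= 1.1 if improved_fraction > 0.5 else 0.9

random.seed(0)
pop = [[random.uniform(-2, 2) for _ in range(5)] for _ in range(20)]
model = ToyPEM()
init_score = sum(sum(evaluate(x)) for x in pop)

for gen in range(30):
    fitness = [evaluate(x) for x in pop]
    children = model.next_generation(pop)
    child_fitness = [evaluate(c) for c in children]
    improved = sum(dominates(cf, f) for cf, f in zip(child_fitness, fitness))
    model.update(improved / len(pop))
    # Elitist survival: keep the child only when it dominates its parent.
    pop = [c if dominates(cf, f) else p
           for p, c, f, cf in zip(pop, children, fitness, child_fitness)]

final_score = sum(sum(evaluate(x)) for x in pop)
```

Because a child only replaces its parent when it Pareto-dominates it, the aggregate objective score can never worsen across generations, which is the property the assertion below checks.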
Problem

Research questions and friction points this paper is trying to address.

Enhancing generalization across diverse multi-objective optimization problem classes
Scaling evolutionary computation to handle high-dimensional decision spaces efficiently
Transferring historical optimization knowledge through a unified population pre-training framework
Innovation

Methods, ideas, or system contributions that make the work stand out.

Population Pre-trained Model leverages historical optimization knowledge
Transformer embeds diverse decision spaces into a common latent space
Objective fusion enhances prediction accuracy for complex optimization
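The second bullet — embedding decision spaces of different dimensionalities into one common latent space — can be illustrated with a deliberately crude sketch. The function name `embed`, the padding scheme, and the appended dimension feature are all assumptions for illustration; the paper's dimension embedding is a learned Transformer component, not fixed padding.

```python
# Hypothetical sketch: solutions from problems with different numbers of
# decision variables are mapped into one fixed-size latent representation,
# so a single model can consume populations from heterogeneous MOPs.
LATENT_DIM = 8

def embed(solution, latent_dim=LATENT_DIM):
    """Pad (or truncate) a decision vector to latent_dim entries and append
    a normalized dimension feature so the original size is not lost."""
    d = len(solution)
    padded = (list(solution) + [0.0] * latent_dim)[:latent_dim]
    return padded + [d / latent_dim]

print(len(embed([0.2, 0.5])))   # 9: a 2-variable problem
print(len(embed([0.1] * 12)))   # 9: a 12-variable problem, same latent size
```

Whatever the input dimensionality, the output has a fixed length, which is what lets one pre-evolved model be shared across problems with different decision spaces.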
Haokai Hong
The Hong Kong Polytechnic University
AI for science, science for AI.
Min Jiang
Department of Artificial Intelligence, Key Laboratory of Digital Protection and Intelligent Processing of Intangible Cultural Heritage of Fujian and Taiwan, Ministry of Culture and Tourism, School of Informatics, Xiamen University, Fujian, China, 361005