NAWOA-XGBoost: A Novel Model for Early Prediction of Academic Potential in Computer Science Students

📅 2025-12-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the premature and slow convergence of the Whale Optimization Algorithm (WOA) in XGBoost hyperparameter optimization, this paper proposes a Nonlinear Adaptive Whale Optimization Algorithm (NAWOA). NAWOA integrates five key enhancements: Good Nodes Set-based initialization, a Leader-Followers Foraging mechanism, a dynamic prey-encircling strategy, a triangular cooperative hunting structure, and a nonlinear convergence factor—collectively strengthening global exploration and convergence stability. We embed NAWOA into XGBoost hyperparameter tuning to construct the NAWOA-XGBoost model, applied to early academic potential prediction for students—a multiclass, class-imbalanced educational task. Evaluated on a real-world dataset of 495 students, the model achieves accuracy = 0.8148, macro-F1 = 0.8101, AUC = 0.8932, and G-mean = 0.8172, consistently outperforming standard XGBoost and WOA-XGBoost. The results validate both the effectiveness and the practical applicability of the proposed algorithmic improvements.

📝 Abstract
The Whale Optimization Algorithm (WOA) suffers from limited global search ability, slow convergence, and a tendency to fall into local optima, restricting its effectiveness in hyperparameter optimization for machine learning models. To address these issues, this study proposes a Nonlinear Adaptive Whale Optimization Algorithm (NAWOA), which integrates strategies such as Good Nodes Set initialization, Leader-Followers Foraging, Dynamic Encircling Prey, Triangular Hunting, and a nonlinear convergence factor to enhance exploration, exploitation, and convergence stability. Experiments on 23 benchmark functions demonstrate NAWOA's superior optimization capability and robustness. Based on this optimizer, an NAWOA-XGBoost model was developed to predict academic potential using data from 495 Computer Science undergraduates at Macao Polytechnic University (2009-2019). Results show that NAWOA-XGBoost outperforms traditional XGBoost and WOA-XGBoost across key metrics, including Accuracy (0.8148), Macro F1 (0.8101), AUC (0.8932), and G-Mean (0.8172), demonstrating strong adaptability on multi-class imbalanced datasets.
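To make the overall scheme concrete, the sketch below shows how a WOA-style optimizer with a nonlinear convergence factor can drive hyperparameter search. This is a minimal illustration, not the paper's NAWOA: the quadratic decay of the convergence factor `a` and the toy quadratic stand-in for XGBoost's cross-validation loss are both assumptions for demonstration; the paper's Good Nodes Set initialization and cooperative-hunting strategies are omitted.

```python
import math
import random

def woa_minimize(objective, bounds, n_whales=20, n_iter=200, seed=0):
    """Minimal WOA-style sketch with a nonlinear convergence factor.

    `bounds` is a list of (lo, hi) pairs, one per dimension. The quadratic
    decay of `a` below is an illustrative choice, not the exact NAWOA factor.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_whales)]
    best = min(pop, key=objective)[:]
    best_f = objective(best)

    for t in range(n_iter):
        # Nonlinear convergence factor: decays from 2 to 0 quadratically (assumed form).
        a = 2.0 * (1.0 - (t / n_iter) ** 2)
        for i, x in enumerate(pop):
            A = 2.0 * a * rng.random() - a
            C = 2.0 * rng.random()
            if rng.random() < 0.5:
                # |A| < 1: encircle the best whale (exploitation);
                # otherwise move toward a random whale (exploration).
                ref = best if abs(A) < 1 else rng.choice(pop)
                x = [ref[d] - A * abs(C * ref[d] - x[d]) for d in range(dim)]
            else:
                # Logarithmic spiral update around the current best.
                l = rng.uniform(-1.0, 1.0)
                x = [abs(best[d] - x[d]) * math.exp(l) * math.cos(2 * math.pi * l)
                     + best[d] for d in range(dim)]
            # Clip to the search bounds and track the best solution found.
            x = [min(max(x[d], bounds[d][0]), bounds[d][1]) for d in range(dim)]
            pop[i] = x
            f = objective(x)
            if f < best_f:
                best, best_f = x[:], f
    return best, best_f

# Toy stand-in for a cross-validation loss over two XGBoost hyperparameters
# (e.g. learning_rate in (0, 1], max_depth relaxed to a continuous value);
# the optimum at (0.1, 6.0) is arbitrary and for illustration only.
loss = lambda h: (h[0] - 0.1) ** 2 + (h[1] - 6.0) ** 2
best, best_f = woa_minimize(loss, [(0.01, 1.0), (1.0, 12.0)])
print(best, best_f)
```

In an actual NAWOA-XGBoost pipeline, `objective` would train an XGBoost classifier with the candidate hyperparameters and return a cross-validated error, with integer parameters such as `max_depth` rounded before training.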
Problem

Research questions and friction points this paper is trying to address.

Enhances the Whale Optimization Algorithm's global search and convergence
Optimizes XGBoost hyperparameters for academic potential prediction
Addresses multi-class imbalanced datasets in student performance forecasting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Enhanced WOA with nonlinear adaptive strategies for optimization
Integrated Good Nodes Set and dynamic foraging for stability
Applied NAWOA to optimize XGBoost for academic prediction
Junhao Wei
Faculty of Applied Sciences, Macao Polytechnic University, Macao, China
Yanzhao Gu
Faculty of Applied Sciences, Macao Polytechnic University, Macao, China
Ran Zhang
Faculty of Applied Sciences, Macao Polytechnic University, Macao, China
Mingjing Huang
Faculty of Applied Sciences, Macao Polytechnic University, Macao, China
Jinhong Song
Faculty of Applied Sciences, Macao Polytechnic University, Macao, China
Yanxiao Li
National Energy Technology Laboratory
Wenxuan Zhu
MS/PhD, KAUST
Yapeng Wang
Faculty of Applied Sciences, Macao Polytechnic University, Macao, China
Zikun Li
Carnegie Mellon University
Zhiwen Wang
PhD, Sichuan University
Xu Yang
Faculty of Applied Sciences, Macao Polytechnic University, Macao, China
Ngai Cheong
Faculty of Applied Sciences, Macao Polytechnic University, Macao, China