Clustering-Based Evolutionary Federated Multiobjective Optimization and Learning

📅 2025-04-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Addressing the multi-objective optimization challenge in federated learning—where communication efficiency, model accuracy, and privacy preservation are difficult to jointly optimize—this paper proposes FedMOEAC. The framework introduces a novel clustering-enhanced evolutionary mechanism to sustain population diversity and is the first to synergistically integrate model quantization, weight sparsification, and a differentially private stochastic gradient descent (DP-SGD) variant into an improved NSGA-II framework for Pareto-optimal joint optimization of all three objectives. Evaluations on MNIST and CIFAR-10 demonstrate a model accuracy of 98.2%, a 45% reduction in communication overhead, a privacy budget ε < 1.0, and a 33% acceleration in convergence compared to the baseline NSGA-II. FedMOEAC establishes a scalable, multi-objective optimization paradigm for efficient and privacy-preserving federated modeling in privacy-sensitive applications.

📝 Abstract
Federated learning enables decentralized model training while preserving data privacy, yet it faces challenges in balancing communication efficiency, model performance, and privacy protection. To address these trade-offs, we formulate FL as a federated multiobjective optimization problem and propose FedMOEAC, a clustering-based evolutionary algorithm that efficiently navigates the Pareto-optimal solution space. Our approach integrates quantization, weight sparsification, and differential privacy to reduce communication overhead while ensuring model robustness and privacy. The clustering mechanism enhances population diversity, preventing premature convergence and improving optimization efficiency. Experimental results on MNIST and CIFAR-10 demonstrate that FedMOEAC achieves 98.2% accuracy, reduces communication overhead by 45%, and maintains a privacy budget below 1.0, outperforming NSGA-II in convergence speed by 33%. This work provides a scalable and efficient FL framework, ensuring an optimal balance between accuracy, communication efficiency, and privacy in resource-constrained environments.
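The client-side pipeline the abstract describes (sparsify the update, bound its sensitivity and add noise for differential privacy, then quantize before transmission) can be sketched as a minimal, illustrative snippet. All names and parameter choices here are assumptions for illustration, not the paper's actual implementation:

```python
import math
import random

def compress_update(weights, bits=8, sparsity=0.5, noise_std=0.1, clip=1.0):
    """Illustrative client update: sparsify, clip, add DP noise, quantize."""
    n = len(weights)

    # Weight sparsification: keep only the largest-magnitude fraction.
    keep = max(1, round((1.0 - sparsity) * n))
    thresh = sorted(abs(x) for x in weights)[-keep]
    w = [x if abs(x) >= thresh else 0.0 for x in weights]

    # Clip the L2 norm to bound sensitivity, then add Gaussian noise
    # (the DP-SGD recipe; noise_std stands in for the calibrated sigma).
    norm = math.sqrt(sum(x * x for x in w))
    if norm > clip:
        w = [x * clip / norm for x in w]
    w = [x + random.gauss(0.0, noise_std) for x in w]

    # Uniform quantization to the given bit width.
    levels = 2 ** bits - 1
    lo, hi = min(w), max(w)
    scale = (hi - lo) / levels if hi > lo else 1.0
    return [round((x - lo) / scale) * scale + lo for x in w]
```

Each knob (bits, sparsity, noise_std) trades one of the three objectives against the others, which is exactly the search space the evolutionary algorithm explores.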
Problem

Research questions and friction points this paper is trying to address.

Balancing communication efficiency, model performance, and privacy in federated learning
Formulating FL as a federated multiobjective optimization problem
Reducing communication overhead while ensuring model robustness and privacy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Clustering-based evolutionary algorithm for multiobjective optimization
Integrates quantization, sparsification, and differential privacy
Enhances diversity and prevents premature convergence
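The diversity mechanism listed above can be illustrated with a small sketch: cluster candidate solutions in objective space and keep one representative per cluster, so selection pressure does not collapse the population onto a single region of the Pareto front. This is a generic k-means-style sketch under assumed names, not the paper's exact clustering procedure:

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two objective vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(vectors):
    """Component-wise mean of a non-empty list of vectors."""
    n = len(vectors)
    return [sum(v[d] for v in vectors) / n for d in range(len(vectors[0]))]

def cluster_select(population, objectives, k, iters=10):
    """Pick up to k diverse survivors: one representative per cluster
    in objective space (Lloyd-style k-means, illustrative only)."""
    centers = random.sample(objectives, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for ind, obj in zip(population, objectives):
            nearest = min(range(k), key=lambda c: dist2(obj, centers[c]))
            groups[nearest].append((ind, obj))
        centers = [mean([o for _, o in g]) if g else centers[i]
                   for i, g in enumerate(groups)]
    # Representative of each cluster: the member closest to its center.
    survivors = []
    for i, g in enumerate(groups):
        if g:
            survivors.append(min(g, key=lambda io: dist2(io[1], centers[i]))[0])
    return survivors
```

Selecting one survivor per cluster keeps well-separated trade-offs (e.g. high-accuracy vs. low-communication solutions) alive through selection, which is what prevents the premature convergence the summary mentions.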
Chengui Xiao
College of Computer and Software Engineering, Shenzhen University, Shenzhen, China
Songbai Liu
Shenzhen University
Evolutionary Computation · Federated Learning · Evolutionary Transfer Optimization · Multiobjective Optimization