🤖 AI Summary
This work addresses the challenges of resource constraints and data heterogeneity in federated learning (FL) within 6G space-air-ground integrated networks by proposing a hierarchical split FL framework. The framework jointly optimizes device association, selection of model splitting layers, and resource allocation to minimize a weighted sum of training loss and latency. By deriving an upper bound on the loss function under this framework, the complex joint optimization problem is decomposed into tractable subproblems. An iterative algorithm is developed, incorporating an efficient search strategy for model splitting points to co-optimize device association and resource allocation. Experimental results demonstrate that the proposed method significantly improves training efficiency while maintaining model accuracy, effectively balancing communication, computation, and learning performance in space-air-ground integrated scenarios.
📝 Abstract
6G facilitates the deployment of Federated Learning (FL) in the Space-Air-Ground Integrated Network (SAGIN), yet FL confronts challenges such as resource constraints and unbalanced data distribution. To address these issues, this paper proposes a Hierarchical Split Federated Learning (HSFL) framework and derives an upper bound on its loss function. To minimize the weighted sum of training loss and latency, we formulate a joint optimization problem that integrates device association, model split-layer selection, and resource allocation. We decompose the original problem into several subproblems and propose an iterative optimization algorithm for device association and resource allocation based on brute-force split-point search. Simulation results demonstrate that the proposed algorithm can effectively balance training efficiency and model accuracy for FL in SAGIN.
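The brute-force split-point search at the core of the algorithm can be illustrated with a minimal sketch. All cost models and numbers below are illustrative assumptions, not the paper's actual formulation: the loss-bound proxy, per-layer FLOP counts, and link parameters are invented for demonstration, and the real algorithm additionally co-optimizes device association and resource allocation at each candidate split.

```python
# Illustrative sketch (assumed cost models, not the paper's formulation):
# exhaustively evaluate every model split point and pick the one minimizing
# a weighted sum of a loss-bound proxy and per-round training latency.

def latency(split, device_flops=1e9, server_flops=1e10, link_rate=1e7):
    """Per-round latency: the device computes layers [0, split), the server
    computes the rest, and the activation at the cut is sent uplink."""
    layer_flops = [2e8, 4e8, 8e8, 8e8, 4e8]        # compute cost per layer (assumed)
    cut_bits = [1e7, 6e6, 4e6, 2e6, 2e6, 1e6]      # bits sent at each cut; cut 0 = raw input
    t_dev = sum(layer_flops[:split]) / device_flops
    t_srv = sum(layer_flops[split:]) / server_flops
    t_com = cut_bits[split] / link_rate
    return t_dev + t_srv + t_com

def loss_bound(split, n_layers=5):
    """Toy proxy for the derived loss upper bound: in this illustration,
    splitting later (more local computation) tightens the bound."""
    return 1.0 + 0.1 * (n_layers - split)

def best_split(weight=0.5, n_layers=5):
    """Brute-force search: evaluate every split point, keep the minimizer."""
    costs = {s: weight * loss_bound(s) + (1 - weight) * latency(s)
             for s in range(n_layers + 1)}
    return min(costs, key=costs.get), costs

if __name__ == "__main__":
    s, costs = best_split()
    print(f"best split point: {s}, objective: {costs[s]:.4f}")
```

With these assumed parameters the search lands on an interior split: cutting very early is penalized by the uplink cost of large early activations, while cutting very late is penalized by slow on-device compute, which is the trade-off the joint optimization is balancing.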