🤖 AI Summary
To address feature-representation divergence and the loss of global topological structure in federated learning under non-IID data, this paper proposes FedTopo. Methodologically, it introduces (1) a topology-guided block-screening mechanism that uses persistent homology to automatically identify the most topologically discriminative network block, (2) a compact topological embedding that quantifies each client's representation topology, and (3) a topological alignment loss that dynamically aligns clients' topological embeddings with the global model during local training, thereby ensuring cross-device representation consistency. Evaluated on Fashion-MNIST, CIFAR-10, and CIFAR-100 under four distinct non-IID settings, FedTopo consistently achieves higher accuracy and faster convergence than mainstream baselines, including FedAvg and FedProx. Its key contribution lies in being the first framework to systematically integrate topological-consistency modeling into federated representation alignment, establishing a principled approach for preserving structural semantics across decentralized, heterogeneous clients.
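The block-screening idea can be illustrated with a toy sketch. Here I assume, hypothetically, that a block's persistence signature is its sorted 0-dimensional death times (the single-linkage merge scales, i.e., the MST edge weights of the feature cloud) and that "topological separability" is the gap between mean between-class and mean within-class signature distances; the paper's actual construction may use higher-dimensional persistent homology and a different score.

```python
import numpy as np

def h0_deaths(points):
    """0-dim persistence death times of a point cloud: the edge weights
    of its minimum spanning tree (computed with Prim's algorithm)."""
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    best = dist[0].copy()          # cheapest connection from the tree to each node
    deaths = []
    for _ in range(n - 1):
        best[in_tree] = np.inf
        j = int(np.argmin(best))   # next component merge
        deaths.append(best[j])
        in_tree[j] = True
        best = np.minimum(best, dist[j])
    return np.sort(np.array(deaths))

def separability(features_by_class):
    """Hypothetical separability score: split each class's features in half,
    take each half's persistence signature, and return the gap between mean
    between-class and mean within-class signature distances (equal sizes)."""
    sigs = {c: (h0_deaths(X[: len(X) // 2]), h0_deaths(X[len(X) // 2:]))
            for c, X in features_by_class.items()}
    within = np.mean([np.linalg.norm(a - b) for a, b in sigs.values()])
    classes = list(sigs)
    between = np.mean([
        np.linalg.norm(a - b)
        for i in range(len(classes)) for j in range(i + 1, len(classes))
        for a in sigs[classes[i]] for b in sigs[classes[j]]
    ])
    return between - within

def select_block(block_features):
    """TGBS-style choice: keep the block whose features score highest."""
    return max(block_features, key=lambda b: separability(block_features[b]))
```

For example, a block whose per-class feature clouds merge at clearly different scales scores higher than one whose classes are topologically indistinguishable, so `select_block` would pick it for the downstream embedding.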
📝 Abstract
Current federated-learning models deteriorate under heterogeneous (non-I.I.D.) client data: their feature representations diverge, and pixel- or patch-level objectives fail to capture the global topology that is essential for high-dimensional visual tasks. We propose FedTopo, a framework that leverages topological information through Topology-Guided Block Screening (TGBS), Topological Embedding (TE), and a Topological Alignment Loss (TAL) to keep cross-client representations coherently aligned. First, TGBS automatically selects the most topology-informative block, i.e., the one with maximal topological separability, whose persistence-based signatures best distinguish within-class from between-class pairs, ensuring that subsequent analysis focuses on topology-rich features. Next, the selected block yields a compact Topological Embedding, which quantifies the topological information of each client's representations. Finally, the Topological Alignment Loss guides clients to maintain topological consistency with the global model during optimization, reducing representation drift across rounds. Experiments on Fashion-MNIST, CIFAR-10, and CIFAR-100 under four non-I.I.D. partitions show that FedTopo accelerates convergence and improves accuracy over strong baselines.
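The alignment step can be sketched as a regularized local objective. In this minimal version the topological embedding is treated as a plain vector and TAL as a λ-weighted squared-L2 distance to the global model's embedding; this specific functional form is an assumption for illustration, not necessarily the paper's definition.

```python
def topological_alignment_loss(te_local, te_global, lam=0.1):
    """TAL sketch (hypothetical form): penalize drift of the client's
    topological embedding away from the global model's embedding."""
    if len(te_local) != len(te_global):
        raise ValueError("embeddings must have the same dimension")
    return lam * sum((a - b) ** 2 for a, b in zip(te_local, te_global))

def local_objective(task_loss, te_local, te_global, lam=0.1):
    """Client-side training loss: ordinary task loss (e.g., cross-entropy)
    plus the topological alignment penalty."""
    return task_loss + topological_alignment_loss(te_local, te_global, lam)
```

With λ = 0 this reduces to plain local training as in FedAvg; larger λ trades task fit for topological consistency with the global model, which is how the penalty would curb round-to-round representation drift.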