CG-FKAN: Compressed-Grid Federated Kolmogorov-Arnold Networks for Communication Constrained Environment

📅 2025-11-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Federated learning (FL) is widely deployed in privacy-sensitive applications, yet its inherent black-box nature limits model interpretability. While Kolmogorov–Arnold networks (KANs) enhance interpretability via learnable spline-based representations, their adaptive grid expansion incurs prohibitive communication overhead in bandwidth-constrained FL settings. To address this, we propose, for the first time, a communication-efficient federated KAN framework featuring grid compression and sparse coefficient selection—preserving spline interpretability while drastically reducing parameter transmission. We design the first FL training protocol explicitly incorporating communication budget constraints and theoretically derive an upper bound on the approximation error induced by grid compression. Extensive experiments demonstrate that, under identical communication budgets, our method reduces RMSE by 13.6% on average compared to fixed-grid KANs, achieving a superior trade-off among model accuracy, interpretability, and communication efficiency.
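To make the idea of grid compression concrete, here is a minimal sketch of budgeted top-k sparsification of a KAN layer's spline coefficients. This is an illustrative assumption of how "transmitting only essential coefficients under a communication budget" could be implemented; the function names (`sparsify_coefficients`, `reconstruct`), the 4-byte index assumption, and the top-k selection rule are not taken from the paper.

```python
import torch

def sparsify_coefficients(coeffs: torch.Tensor, budget_bytes: int,
                          bytes_per_value: int = 4):
    """Keep only the largest-magnitude spline coefficients that fit the budget.

    coeffs: dense spline coefficients of one KAN layer after grid extension.
    Returns (indices, values) to transmit instead of the full tensor.
    """
    # Each transmitted entry costs one index (assumed 4 bytes) plus one value.
    entry_bytes = 4 + bytes_per_value
    k = max(1, budget_bytes // entry_bytes)
    k = min(k, coeffs.numel())

    flat = coeffs.flatten()
    _, indices = torch.topk(flat.abs(), k)   # select the k largest in magnitude
    return indices, flat[indices]

def reconstruct(indices: torch.Tensor, values: torch.Tensor, shape) -> torch.Tensor:
    """Server side: rebuild a dense coefficient tensor, zeros elsewhere."""
    dense = torch.zeros(shape).flatten()
    dense[indices] = values
    return dense.reshape(shape)
```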

📝 Abstract
Federated learning (FL), widely used in privacy-critical applications, suffers from limited interpretability, whereas Kolmogorov-Arnold Networks (KAN) address this limitation via learnable spline functions. However, existing FL studies applying KAN overlook the communication overhead introduced by grid extension, which is essential for modeling complex functions. In this letter, we propose CG-FKAN, which compresses extended grids by sparsifying and transmitting only essential coefficients under a communication budget. Experiments show that CG-FKAN achieves up to 13.6% lower RMSE than fixed-grid KAN in communication-constrained settings. In addition, we derive a theoretical upper bound on its approximation error.
Problem

Research questions and friction points this paper is trying to address.

How to reduce the communication overhead that grid extension introduces in federated KANs
How to compress extended grids under a strict communication budget
How to maintain accuracy while transmitting only a small subset of spline coefficients
Innovation

Methods, ideas, or system contributions that make the work stand out.

Compresses extended grids via sparsification of spline coefficients
Transmits only the most important coefficients within a communication budget (see the sketch after this list)
Reduces communication overhead of federated KAN training
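A hypothetical communication round built on the helper functions sketched above (`sparsify_coefficients`, `reconstruct`): each client uploads only the (index, value) pairs that fit the budget, and the server averages the reconstructed dense tensors. The FedAvg-style mean, the budget value, and the toy shapes are assumptions for illustration, not the paper's aggregation rule.

```python
import torch

def federated_round(client_coeffs, budget_bytes, shape):
    """Aggregate sparsified KAN coefficient updates from several clients.

    client_coeffs: list of dense coefficient tensors, one per client.
    Only (index, value) pairs within the budget are "transmitted".
    """
    reconstructed = []
    for coeffs in client_coeffs:
        idx, vals = sparsify_coefficients(coeffs, budget_bytes)  # client -> server
        reconstructed.append(reconstruct(idx, vals, shape))
    # Simple FedAvg-style mean over the sparse reconstructions.
    return torch.stack(reconstructed).mean(dim=0)

# Usage with two toy clients and a 256-byte budget per layer.
shape = (8, 16)  # e.g. edges x spline basis functions (illustrative)
clients = [torch.randn(shape), torch.randn(shape)]
global_coeffs = federated_round(clients, budget_bytes=256, shape=shape)
```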