Who to Trust? Aggregating Client Knowledge in Logit-Based Federated Learning

📅 2025-09-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenges of knowledge aggregation across heterogeneous clients, high communication overhead, and poor robustness under non-IID data in logit-based federated learning, this paper proposes a lightweight logit-sharing and adaptive aggregation framework. The method eliminates model parameter transmission by generating comparable logits on a shared proxy dataset. It further introduces three aggregation strategies—simple averaging, uncertainty-weighted fusion, and a learnable meta-aggregation network—to dynamically model client heterogeneity. Extensive experiments on MNIST and CIFAR-10 demonstrate that the proposed approach achieves accuracy comparable to centralized training under non-IID settings, reduces communication cost by an order of magnitude, and significantly outperforms baseline methods. These results validate the framework’s effectiveness, robustness, and generalization capability in practical federated learning scenarios.

📝 Abstract
Federated learning (FL) usually shares model weights or gradients, which is costly for large models. Logit-based FL reduces this cost by sharing only logits computed on a public proxy dataset. However, aggregating information from heterogeneous clients remains challenging. This paper studies this problem by introducing and comparing three logit aggregation methods: simple averaging, uncertainty-weighted averaging, and a learned meta-aggregator. Evaluated on MNIST and CIFAR-10, these methods reduce communication overhead, improve robustness under non-IID data, and achieve accuracy competitive with centralized training.
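The first two aggregation rules from the abstract can be sketched roughly as follows. This is a minimal illustration, assuming each client uploads a (samples × classes) logit matrix computed on the shared proxy set; the function names and the inverse-entropy weighting scheme are illustrative assumptions, not taken from the paper, and the learned meta-aggregator (a small trainable network over client logits) is omitted.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def average_logits(client_logits):
    """Simple averaging: uniform mean over the client axis.

    client_logits: array of shape (clients, samples, classes).
    """
    return client_logits.mean(axis=0)

def uncertainty_weighted_logits(client_logits, eps=1e-6):
    """Uncertainty-weighted fusion (one plausible instantiation):
    weight each client's logits per sample by inverse predictive
    entropy, so confident (low-entropy) clients contribute more."""
    probs = softmax(client_logits)                         # (clients, samples, classes)
    entropy = -(probs * np.log(probs + 1e-12)).sum(-1)     # (clients, samples)
    weights = 1.0 / (entropy + eps)
    weights = weights / weights.sum(axis=0, keepdims=True) # normalize over clients
    return (weights[..., None] * client_logits).sum(axis=0)
```

With this weighting, a client that is confidently right on a proxy sample pulls the fused logits toward its prediction, while a client that is uncertain on that sample is largely ignored, which is one way to "model client heterogeneity" per sample rather than globally.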
Problem

Research questions and friction points this paper is trying to address.

Aggregating knowledge from heterogeneous clients in federated learning
Reducing communication costs by sharing logits instead of weights
Improving robustness and accuracy under non-IID data distributions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Logit-based FL reduces communication cost
Three aggregation methods compared for robustness
Achieves competitive accuracy with centralized training
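The communication-cost claim above can be made concrete with a back-of-the-envelope comparison. This is a sketch under assumed sizes (a 10k-sample proxy set with 10 classes versus a small CNN with ~1.2M parameters, float32 throughout); the actual figures in the paper may differ.

```python
def logit_payload_bytes(num_proxy_samples, num_classes, bytes_per_float=4):
    # Logit-based FL: each client uploads one logit vector per proxy sample.
    return num_proxy_samples * num_classes * bytes_per_float

def weight_payload_bytes(num_parameters, bytes_per_float=4):
    # Weight-sharing FL (e.g. FedAvg): each client uploads the full parameter vector.
    return num_parameters * bytes_per_float

# Assumed illustrative sizes, not figures from the paper:
logits = logit_payload_bytes(10_000, 10)    # 400 KB per round
weights = weight_payload_bytes(1_200_000)   # 4.8 MB per round
```

Under these assumptions the logit payload is about 12x smaller per round, consistent with the order-of-magnitude reduction reported, and the gap widens as the model grows while the proxy set stays fixed.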
Viktor Kovalchuk
Mohamed bin Zayed University of Artificial Intelligence, United Arab Emirates
Nikita Kotelevskii
Mohamed bin Zayed University of Artificial Intelligence (MBZUAI)
Uncertainty Quantification · Variational Inference · MCMC · Federated Learning
Maxim Panov
Assistant Professor at Mohamed bin Zayed University of Artificial Intelligence (MBZUAI)
Machine Learning · Statistics
Samuel Horváth
Mohamed bin Zayed University of Artificial Intelligence, United Arab Emirates
Martin Takáč
Mohamed bin Zayed University of Artificial Intelligence, United Arab Emirates