FedBEns: One-Shot Federated Learning based on Bayesian Ensemble

📅 2025-03-19
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
To address the limited generalization of global models in one-shot federated learning (FL), where only a single round of communication with the server is allowed, this paper proposes a Bayesian ensemble method that exploits the multimodality of clients' local losses. Each client constructs a mixture-of-posteriors distribution via Laplace approximations around multiple local modes, and the server infers the global model by aggregating these local posterior mixtures. The work presents itself as the first to introduce multimodal posterior modeling into one-shot FL, overcoming the limitations of conventional unimodal approximations. Extensive experiments on multiple benchmark datasets show that the proposed method outperforms state-of-the-art one-shot FL algorithms, with reported average test-accuracy improvements of 2.3–5.1 percentage points. The results support the claim that multimodal posterior modeling is critical to the robustness and generalization of the global model.

๐Ÿ“ Abstract
One-Shot Federated Learning (FL) is a recent paradigm that enables multiple clients to cooperatively learn a global model in a single round of communication with a central server. In this paper, we analyze the One-Shot FL problem through the lens of Bayesian inference and propose FedBEns, an algorithm that leverages the inherent multimodality of local loss functions to find better global models. Our algorithm leverages a mixture of Laplace approximations for the clients' local posteriors, which the server then aggregates to infer the global model. We conduct extensive experiments on various datasets, demonstrating that the proposed method outperforms competing baselines that typically rely on unimodal approximations of the local losses.
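The aggregation idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes diagonal Laplace approximations (mean = trained weights, precision = diagonal Fisher plus a prior term), represents each client's posterior as a list of such Gaussian modes, and fuses one mode per client by the standard product-of-Gaussians rule; the mixture weights and normalizers needed to rank combinations are omitted for brevity.

```python
# Hedged sketch of mixture-of-Laplace aggregation for one-shot FL.
# Assumptions (not from the paper's code): diagonal precisions, and a
# unit-precision Gaussian prior folded into each mode's precision.
import itertools
import numpy as np

def laplace_mode(trained_weights, diag_fisher, prior_prec=1.0):
    """One Laplace-approximated posterior mode: (mean, diagonal precision)."""
    return trained_weights, diag_fisher + prior_prec

def fuse_gaussians(modes):
    """Multiply diagonal Gaussians: precisions add, and the fused mean is
    the precision-weighted average of the modes' means."""
    means = np.stack([m for m, _ in modes])
    precs = np.stack([p for _, p in modes])
    prec_g = precs.sum(axis=0)
    mean_g = (precs * means).sum(axis=0) / prec_g
    return mean_g, prec_g

def aggregate(client_mixtures):
    """The product of the clients' mixtures expands into a mixture over all
    combinations of one mode per client; fuse each combination. Choosing
    among the fused candidates would require the mixture weights, which
    this sketch leaves out."""
    return [fuse_gaussians(combo)
            for combo in itertools.product(*client_mixtures)]
```

For example, with two clients holding two 1-D modes each, `aggregate` returns the four fused candidate global models arising from the expanded mixture product.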
Problem

Research questions and friction points this paper is trying to address.

Learning an accurate global model in a single round of communication (One-Shot FL)
Using Bayesian inference to improve global-model accuracy
Aggregating clients' local posteriors via mixtures of Laplace approximations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Frames One-Shot FL as Bayesian inference over the global posterior
Captures the multimodality of local losses with a mixture of Laplace approximations per client
Server aggregates the clients' local posterior mixtures to infer the global model
🔎 Similar Papers
No similar papers found.