Cost-Free Personalization via Information-Geometric Projection in Bayesian Federated Learning

📅 2025-09-12
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Bayesian federated learning (BFL) faces the dual challenge of achieving personalized modeling under data heterogeneity and strict privacy constraints, while simultaneously preserving both global generalization and local specialization. Method: We propose an information-geometric projection framework that projects the global posterior distribution onto user-specific neighborhoods along geodesics of the statistical manifold, enabling tunable trade-offs between generalization and specialization. This projection is mathematically equivalent to computing a Riemannian centroid on the manifold and yields a closed-form personalized solution—derived for the first time—without additional computational overhead. The method integrates variational inference with an enhanced IVON optimizer, ensuring compatibility with diverse BFL aggregation schemes. Results: Empirical evaluation on heterogeneous benchmarks demonstrates substantial improvements in local model performance at minimal computational cost, while retaining strong global generalization capability.

Technology Category

Application Category

📝 Abstract
Bayesian Federated Learning (BFL) combines uncertainty modeling with decentralized training, enabling the development of personalized and reliable models under data heterogeneity and privacy constraints. Existing approaches typically rely on Markov Chain Monte Carlo (MCMC) sampling or variational inference, often incorporating personalization mechanisms to better adapt to local data distributions. In this work, we propose an information-geometric projection framework for personalization in parametric BFL. By projecting the global model onto a neighborhood of the user's local model, our method enables a tunable trade-off between global generalization and local specialization. Under mild assumptions, we show that this projection step is equivalent to computing a barycenter on the statistical manifold, allowing us to derive closed-form solutions and achieve cost-free personalization. We apply the proposed approach to a variational learning setup using the Improved Variational Online Newton (IVON) optimizer and extend its application to general aggregation schemes in BFL. Empirical evaluations under heterogeneous data distributions confirm that our method effectively balances global and local performance with minimal computational overhead.
Problem

Research questions and friction points this paper is trying to address.

Personalizing Bayesian Federated Learning models
Balancing global generalization with local specialization
Achieving cost-free personalization via information geometry
Innovation

Methods, ideas, or system contributions that make the work stand out.

Information-geometric projection framework
Tunable trade-off generalization specialization
Closed-form solutions cost-free personalization
🔎 Similar Papers
No similar papers found.