Decentralized Federated Learning of Probabilistic Generative Classifiers

📅 2025-07-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the collaborative training of probabilistic generative classifiers in decentralized federated learning, where nodes build a globally consistent generative classifier without sharing raw data, exchanging only local sufficient statistics rather than model parameters. The authors propose a serverless, distributed message-passing framework featuring a sufficient-statistics-based aggregation mechanism and generative-model-specific parameter update rules, enabling convergence under arbitrary network topologies and non-IID data distributions. They provide theoretical guarantees of global convergence for the proposed algorithm. Empirical evaluations across diverse network structures, data heterogeneity levels, and system scales demonstrate that the method achieves performance on par with centralized training and significantly outperforms existing decentralized federated learning approaches.

📝 Abstract
Federated learning is a paradigm of increasing relevance in real-world applications, aimed at building a global model across a network of heterogeneous users without requiring the sharing of private data. We focus on model learning over decentralized architectures, where users collaborate directly to update the global model without relying on a central server. In this context, the current paper proposes a novel approach to collaboratively learn probabilistic generative classifiers with a parametric form. The framework is composed of a communication network over a set of local nodes, each of which has its own local data, and a local updating rule. The proposal involves sharing local statistics with neighboring nodes; each node aggregates its neighbors' information and iteratively learns its own local classifier, which progressively converges to a global model. Extensive experiments demonstrate that the algorithm consistently converges to a globally competitive model across a wide range of network topologies, network sizes, local dataset sizes, and extreme non-i.i.d. data distributions.
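The neighbor-aggregation scheme the abstract describes can be pictured as a consensus average over local sufficient statistics. The sketch below is illustrative rather than the paper's algorithm: it assumes a Gaussian naive Bayes model and Metropolis-Hastings mixing weights (both my choices, not stated in the paper), and shows how nodes that repeatedly average statistics vectors with their neighbors converge, on any connected topology, to the same network-wide statistics, from which every node can fit an identical classifier.

```python
import numpy as np

def gaussian_nb_stats(X, y, n_classes):
    """Flatten per-class sufficient statistics of a Gaussian naive Bayes
    model: class counts, per-feature sums, and per-feature sums of squares."""
    counts = np.array([(y == c).sum() for c in range(n_classes)], float)
    sums = np.stack([X[y == c].sum(axis=0) for c in range(n_classes)])
    sq = np.stack([(X[y == c] ** 2).sum(axis=0) for c in range(n_classes)])
    return np.concatenate([counts, sums.ravel(), sq.ravel()])

def metropolis_weights(adjacency):
    """Doubly stochastic mixing matrix (Metropolis-Hastings weights), so that
    repeated mixing converges to the uniform average on a connected graph."""
    A = np.asarray(adjacency, float)
    deg = A.sum(axis=1)
    n = len(A)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                W[i, j] = 1.0 / (1.0 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()  # self-weight absorbs the remainder
    return W

def consensus_average(node_stats, adjacency, rounds=200):
    """Synchronous message passing: each round, every node replaces its
    statistics vector with a weighted average of its own and its neighbors'.
    All rows converge to the network-wide mean statistics."""
    S = np.array(node_stats, float)  # one statistics row per node
    W = metropolis_weights(adjacency)
    for _ in range(rounds):
        S = W @ S                    # one gossip/averaging round
    return S
```

Because only the flattened statistics travel between neighbors, no raw data leaves a node; multiplying the converged rows by the number of nodes recovers the global sums, so each node can derive identical per-class means and variances.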
Problem

Research questions and friction points this paper is trying to address.

Decentralized federated learning for probabilistic generative classifiers
Collaborative model learning without central server dependency
Handling heterogeneous non-i.i.d. data across distributed nodes
Innovation

Methods, ideas, or system contributions that make the work stand out.

Decentralized federated learning without central server
Collaborative probabilistic generative classifiers learning
Local statistics sharing and neighbor aggregation
Aritz Pérez
Basque Center for Applied Mathematics (BCAM)
Artificial Intelligence · Machine Learning · Probabilistic Graphical Models
Carlos Echegoyen
Spatial Statistics Group and INAMAT2, Public University of Navarre, 31006 Pamplona, Spain
Guzmán Santafé
Spatial Statistics Group and INAMAT2, Public University of Navarre, 31006 Pamplona, Spain