Decentralized Personalization for Federated Medical Image Segmentation via Gossip Contrastive Mutual Learning

📅 2025-03-05
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address two critical challenges in federated learning (FL) for medical image segmentation—single-point failure of the central server and poor personalization due to cross-site data heterogeneity—this paper proposes the first decentralized personalized FL framework integrating Gossip-based peer-to-peer communication with Deep Contrastive Mutual Learning (DCML). Departing from global model reliance, our approach enables robust and efficient local model optimization via lightweight pairwise exchanges and unsupervised contrastive collaboration. Evaluated on three public medical image segmentation benchmarks, it consistently outperforms state-of-the-art centralized and decentralized FL baselines: Dice scores improve by 2.1–4.7 percentage points, while communication overhead decreases by over 40%. The core contribution is the establishment of the first coordination-free, decentralized learning paradigm for medical segmentation that simultaneously ensures robustness and model personalization.

Technology Category

Application Category

📝 Abstract
Federated Learning (FL) presents a promising avenue for collaborative model training among medical centers, facilitating knowledge exchange without compromising data privacy. However, vanilla FL is prone to server failures and rarely achieves optimal performance on all participating sites due to heterogeneous data distributions among them. To overcome these challenges, we propose Gossip Contrastive Mutual Learning (GCML), a unified framework to optimize personalized models in a decentralized environment, where Gossip Protocol is employed for flexible and robust peer-to-peer communication. To make efficient and reliable knowledge exchange in each communication without the global knowledge across all the sites, we introduce deep contrast mutual learning (DCML), a simple yet effective scheme to encourage knowledge transfer between the incoming and local models through collaborative training on local data. By integrating DCML with other efforts to optimize site-specific models by leveraging useful information from peers, we evaluated the performance and efficiency of the proposed method on three publicly available datasets with different segmentation tasks. Our extensive experimental results show that the proposed GCML framework outperformed both centralized and decentralized FL methods with significantly reduced communication overhead, indicating its potential for real-world deployment.
Problem

Research questions and friction points this paper is trying to address.

Overcomes server failures in Federated Learning
Addresses heterogeneous data distribution in medical centers
Enhances personalized model performance in decentralized environments
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gossip Protocol enables decentralized peer-to-peer communication
Deep Contrast Mutual Learning facilitates local knowledge transfer
GCML optimizes personalized models with reduced communication overhead
🔎 Similar Papers
No similar papers found.