Measuring Participant Contributions in Decentralized Federated Learning

📅 2025-05-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Addressing the lack of trustworthy, adaptive mechanisms for quantifying participant contributions in decentralized federated learning (DFL), this paper proposes the first DFL-specific contribution evaluation framework. Grounded in Shapley value theory, it introduces DFL-Shapley, the first contribution metric explicitly designed for DFL, and develops DFL-MR, an efficient, scalable approximation that estimates overall contributions by accumulating round-level Shapley values while modeling dynamic model-exchange topologies. Experiments show that DFL-Shapley serves as an effective ground-truth benchmark and that DFL-MR tracks it with high fidelity across diverse DFL settings (mean absolute error < 4.2%), significantly outperforming existing methods adapted from centralized federated learning.

📝 Abstract
Federated learning (FL) enables multiple clients to collaboratively train models without sharing their data. Measuring participant contributions in FL is crucial for incentivizing clients and ensuring transparency. While various methods have been proposed for contribution measurement, they are designed exclusively for centralized federated learning (CFL), where a central server collects and aggregates client models, along with evaluating their contributions. Meanwhile, decentralized federated learning (DFL), in which clients exchange models directly without a central server, has gained significant attention for mitigating communication bottlenecks and eliminating a single point of failure. However, applying existing contribution measurement methods to DFL is challenging due to the presence of multiple global models and the absence of a central server. In this study, we present novel methodologies for measuring participant contributions in DFL. We first propose DFL-Shapley, an extension of the Shapley value tailored for DFL, adapting this widely used CFL metric to decentralized settings. Given the impracticality of computing the ideal DFL-Shapley in real-world systems, we introduce DFL-MR, a computable approximation that estimates overall contributions by accumulating round-wise Shapley values. We evaluate DFL-Shapley and DFL-MR across various FL scenarios and compare them with existing CFL metrics. The experimental results confirm DFL-Shapley as a valid ground-truth metric and demonstrate DFL-MR's proximity to DFL-Shapley across various settings, highlighting their effectiveness as contribution metrics in DFL.
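The abstract builds on the classical Shapley value, which scores each participant by its average marginal contribution across all coalitions. The sketch below is the textbook formula only, not the paper's DFL-Shapley; the utility function `v` and the client names are illustrative assumptions (here, a toy additive utility proportional to each client's data share).

```python
from itertools import combinations
from math import factorial

def shapley_values(players, v):
    """Exact Shapley values: each player's average marginal contribution
    v(S ∪ {i}) - v(S), weighted over all coalitions S not containing i."""
    n = len(players)
    phi = {p: 0.0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for k in range(n):
            for S in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                coalition = frozenset(S)
                phi[p] += weight * (v(coalition | {p}) - v(coalition))
    return phi

# Toy utility (assumption): coalition "quality" is the fraction of all data it holds.
data = {"A": 100, "B": 50, "C": 50}
v = lambda S: sum(data[c] for c in S) / sum(data.values())

print(shapley_values(["A", "B", "C"], v))
```

With an additive utility like this one, each client's Shapley value reduces to its own data share; the exponential cost of enumerating coalitions is exactly why the paper needs a computable approximation (DFL-MR) rather than the exact metric.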
Problem

Research questions and friction points this paper is trying to address.

Measuring participant contributions in decentralized federated learning
Extending Shapley value for decentralized settings (DFL-Shapley)
Proposing computable approximation DFL-MR for real-world systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends Shapley value for decentralized FL
Introduces DFL-MR for practical contribution estimation
Validates metrics across diverse FL scenarios
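The abstract describes DFL-MR as estimating overall contributions by accumulating round-wise Shapley values. A minimal schematic of just that accumulation step follows; the per-round Shapley estimation and the paper's handling of model-exchange topologies are omitted, and the function name and inputs are hypothetical.

```python
def accumulate_round_shapley(round_shapleys):
    """Sum per-round Shapley estimates into overall contribution scores.

    round_shapleys: list of {client: shapley_value} dicts, one per training
    round (produced by some per-round estimator, not shown here).
    """
    total = {}
    for phi_t in round_shapleys:
        for client, value in phi_t.items():
            total[client] = total.get(client, 0.0) + value
    return total

# Example: two rounds of (hypothetical) per-round Shapley estimates.
rounds = [{"A": 0.6, "B": 0.4}, {"A": 0.2, "B": 0.8}]
print(accumulate_round_shapley(rounds))
```

The design point is that summing cheap round-level estimates sidesteps recomputing coalition values over entire training histories, which is what makes the approximation tractable in a running system.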