Tackling Feature and Sample Heterogeneity in Decentralized Multi-Task Learning: A Sheaf-Theoretic Approach

📅 2025-02-03
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
To address the challenges of complex task interdependencies, strong data heterogeneity, and privacy sensitivity in decentralized federated multi-task learning (FMTL), this paper introduces cellular sheaf theory (the first such application in FMTL) to establish a sheaf-based paradigm for modeling task relationships via layered structures. The authors propose sheaf Laplacian regularization, which jointly characterizes cross-client task correlations and feature/sample heterogeneity, providing a unified theoretical framework that subsumes diverse FL/FMTL architectures. They prove sublinear convergence of the proposed algorithm. Empirical evaluations demonstrate significantly reduced communication overhead and superior performance over state-of-the-art decentralized FMTL methods across multiple heterogeneous benchmarks. The core innovation lies in the deep integration of sheaf theory with distributed optimization, enabling efficient, privacy-preserving collaborative modeling without centralized coordination.

๐Ÿ“ Abstract
Federated multi-task learning (FMTL) aims to simultaneously learn multiple related tasks across clients without sharing sensitive raw data. However, in the decentralized setting, existing FMTL frameworks are limited in their ability to capture complex task relationships and handle feature and sample heterogeneity across clients. To address these challenges, we introduce a novel sheaf-theoretic-based approach for FMTL. By representing client relationships using cellular sheaves, our framework can flexibly model interactions between heterogeneous client models. We formulate the sheaf-based FMTL optimization problem using sheaf Laplacian regularization and propose the Sheaf-FMTL algorithm to solve it. We show that the proposed framework provides a unified view encompassing many existing federated learning (FL) and FMTL approaches. Furthermore, we prove that our proposed algorithm, Sheaf-FMTL, achieves a sublinear convergence rate in line with state-of-the-art decentralized FMTL algorithms. Extensive experiments demonstrate that Sheaf-FMTL exhibits communication savings by sending significantly fewer bits compared to decentralized FMTL baselines.
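The sheaf Laplacian regularization mentioned in the abstract can be illustrated with a small sketch. In a cellular sheaf over the client graph, each edge carries restriction maps that project the two endpoint clients' (possibly differently sized) parameter vectors into a shared edge space, and the regularizer penalizes their disagreement there. The function and variable names below are illustrative, not the paper's notation, and the restriction maps here are assumed to be plain matrices:

```python
import numpy as np

def sheaf_laplacian_reg(thetas, edges):
    """Quadratic form of a cellular sheaf Laplacian over a client graph.

    thetas: dict client_id -> parameter vector; dimensions may differ
            per client, reflecting feature/sample heterogeneity.
    edges:  dict (i, j) -> (P_ij, P_ji), the restriction maps sending
            client i's and client j's parameters into the shared edge stalk.
    Returns sum over edges of ||P_ij @ theta_i - P_ji @ theta_j||^2.
    """
    total = 0.0
    for (i, j), (P_ij, P_ji) in edges.items():
        diff = P_ij @ thetas[i] - P_ji @ thetas[j]  # disagreement in the edge stalk
        total += float(diff @ diff)
    return total
```

The regularizer is zero exactly when the clients agree after restriction (a "global section" of the sheaf), so minimizing a local loss plus this term encourages heterogeneous models to align only along the shared directions encoded by the restriction maps, rather than forcing identical parameters as in standard FL.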
Problem

Research questions and friction points this paper is trying to address.

Federated Multi-Task Learning
Data Heterogeneity
Privacy Protection
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sheaf-FMTL
Decentralized Federated Multi-task Learning
Communication Efficiency