Federated Conditional Conformal Prediction via Generative Models

📅 2025-10-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
In federated learning, the non-i.i.d. nature of client data undermines traditional conformal prediction (CP), which guarantees only marginal coverage and fails to capture input-conditional uncertainty. To address this, we propose the first federated CP framework that ensures conditional coverage. Our method employs generative models (specifically, normalizing flows and diffusion models) to model local conditional distributions, enabling fine-grained uncertainty quantification over heterogeneous data. A novel federated aggregation mechanism then collaboratively calibrates distributed conformity scores, balancing local adaptivity with global consistency. Experiments on multiple real-world datasets demonstrate that the approach significantly outperforms existing federated CP methods: it maintains rigorous statistical validity (guaranteed conditional coverage under mild assumptions) while yielding prediction sets that are more input-sensitive and practically useful.

📝 Abstract
Conformal Prediction (CP) provides distribution-free uncertainty quantification by constructing prediction sets that guarantee coverage of the true labels. This reliability makes CP valuable for high-stakes federated learning scenarios such as multi-center healthcare. However, standard CP assumes i.i.d. data, which is violated in federated settings where client distributions differ substantially. Existing federated CP methods address this by maintaining marginal coverage on each client, but such guarantees often fail to reflect input-conditional uncertainty. In this work, we propose Federated Conditional Conformal Prediction (Fed-CCP) via generative models, which aims for conditional coverage that adapts to local data heterogeneity. Fed-CCP leverages generative models, such as normalizing flows or diffusion models, to approximate conditional data distributions without requiring the sharing of raw data. This enables each client to locally calibrate conformal scores that reflect its unique uncertainty, while preserving global consistency through federated aggregation. Experiments on real datasets demonstrate that Fed-CCP achieves more adaptive prediction sets.
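To make the baseline concrete: the marginal coverage that standard CP provides, and that Fed-CCP goes beyond, is typically obtained via split conformal prediction. The sketch below illustrates that generic procedure for regression with absolute-residual scores; all function and variable names are invented for illustration and are not from the paper.

```python
# Minimal sketch of split conformal prediction (the marginal-coverage
# baseline the paper builds on). Illustrative only; not Fed-CCP.
import numpy as np

def conformal_quantile(scores, alpha):
    """Finite-sample-corrected (1 - alpha) quantile of calibration scores."""
    n = len(scores)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, level, method="higher")

def prediction_interval(point_pred, q):
    """Prediction set for a regression point estimate with score |y - f(x)|."""
    return point_pred - q, point_pred + q

# Toy calibration data: true relation y = 2x plus Gaussian noise,
# with the "model" f(x) = 2x assumed known for simplicity.
rng = np.random.default_rng(0)
x_cal = rng.uniform(0.0, 1.0, 500)
y_cal = 2.0 * x_cal + rng.normal(0.0, 0.1, 500)
scores = np.abs(y_cal - 2.0 * x_cal)       # conformity scores on held-out data

q = conformal_quantile(scores, alpha=0.1)  # target: 90% marginal coverage
lo, hi = prediction_interval(2.0 * 0.5, q) # interval at test point x = 0.5
```

Note that the interval half-width `q` is one global constant: every input gets the same width regardless of local uncertainty, which is exactly the input-insensitivity the paper's conditional-coverage approach addresses.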
Problem

Research questions and friction points this paper is trying to address.

Addresses non-i.i.d. data in federated learning for uncertainty quantification
Ensures conditional coverage across clients with heterogeneous data distributions
Uses generative models to calibrate local uncertainty without sharing raw data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Federated Conditional Conformal Prediction via generative models
Approximates conditional distributions without sharing raw data
Locally calibrates conformal scores with federated aggregation
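The paper's actual aggregation rule is not detailed on this page, but the privacy pattern the bullets describe (local calibration, only summary statistics exchanged) can be sketched as follows. This is a hypothetical illustration in which each client shares a single local conformal quantile and the server combines them weighted by calibration-set size; it is not the paper's mechanism.

```python
# Hypothetical federated calibration sketch: clients share only a scalar
# quantile computed from local conformity scores; raw data never leaves
# a client. Aggregation rule (weighted average) is an assumption, not
# the paper's method.
import numpy as np

def local_quantile(scores, alpha):
    """Client-side finite-sample-corrected (1 - alpha) quantile."""
    n = len(scores)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, level, method="higher")

def federated_aggregate(local_qs, calib_sizes):
    """Server-side combination, weighted by calibration-set size."""
    w = np.asarray(calib_sizes, dtype=float)
    return float(np.dot(local_qs, w / w.sum()))

# Heterogeneous clients: different noise scales yield different
# local quantiles, mimicking non-i.i.d. client distributions.
rng = np.random.default_rng(1)
client_scores = [np.abs(rng.normal(0.0, s, 200)) for s in (0.1, 0.3, 0.5)]
qs = [local_quantile(s, alpha=0.1) for s in client_scores]
q_global = federated_aggregate(qs, [len(s) for s in client_scores])
```

The point of the sketch is the communication structure: only `qs` (one float per client) crosses the network, while the heterogeneity across clients remains visible in how far the local quantiles spread around the aggregate.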
Rui Xu
Information Hub, AI Thrust, Hong Kong University of Science and Technology (Guangzhou)
Sihong Xie
Associate Professor at AI Thrust, Information Hub, HKUST-GZ
data mining, machine learning