Making Conformal Predictors Robust in Healthcare Settings: a Case Study on EEG Classification

📅 2026-02-22
🤖 AI Summary
This work addresses the drop in coverage that standard conformal prediction methods commonly suffer under patient distribution shift in clinical prediction settings. The authors propose a personalized calibration strategy for EEG-based seizure classification that handles both distribution shift and label uncertainty. Built on the conformal prediction framework and integrated into the open-source PyHealth medical AI platform, the method improves empirical coverage by over 20 percentage points with almost no increase in prediction set size, substantially strengthening the robustness and practical utility of conformal prediction in real-world clinical applications.

📝 Abstract
Quantifying uncertainty in clinical predictions is critical for high-stakes diagnosis tasks. Conformal prediction offers a principled approach by providing prediction sets with theoretical coverage guarantees. However, in practice, patient distribution shifts violate the i.i.d. assumptions underlying standard conformal methods, leading to poor coverage in healthcare settings. In this work, we evaluate several conformal prediction approaches on EEG seizure classification, a task with known distribution shift challenges and label uncertainty. We demonstrate that personalized calibration strategies can improve coverage by over 20 percentage points while maintaining comparable prediction set sizes. Our implementation is available via PyHealth, an open-source healthcare AI framework: https://github.com/sunlabuiuc/PyHealth.
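The split-conformal recipe the abstract builds on can be sketched as follows. This is an illustrative sketch only, not the paper's PyHealth implementation: the per-patient calibration grouping, the LAC-style nonconformity score (one minus the softmax probability of the true label), and all names and data below are assumptions for illustration.

```python
import numpy as np

def conformal_quantile(cal_scores, alpha):
    # Finite-sample-corrected (1 - alpha) quantile of the calibration scores.
    n = len(cal_scores)
    q = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(cal_scores, q, method="higher")

def prediction_set(probs, qhat):
    # Keep every class whose nonconformity score 1 - p_k is at most qhat.
    return [k for k, p in enumerate(probs) if 1.0 - p <= qhat]

rng = np.random.default_rng(0)
# Toy stand-in for per-patient calibration data: softmax outputs and labels.
patients = {
    "patient_a": (rng.dirichlet(np.ones(4), size=50), rng.integers(0, 4, size=50)),
    "patient_b": (rng.dirichlet(np.ones(4) * 5, size=50), rng.integers(0, 4, size=50)),
}

# Personalized calibration: one threshold per patient, instead of a single
# global threshold pooled across all patients.
alpha = 0.1
qhats = {}
for pid, (probs, labels) in patients.items():
    scores = 1.0 - probs[np.arange(len(labels)), labels]  # LAC nonconformity
    qhats[pid] = conformal_quantile(scores, alpha)

# At test time, build the prediction set with the matching patient's threshold.
test_probs = rng.dirichlet(np.ones(4))
print(prediction_set(test_probs, qhats["patient_a"]))
```

Under exchangeability within each patient's recordings, each per-patient threshold yields the usual marginal coverage guarantee for that patient, which is the intuition behind calibrating per patient when the population as a whole is shifted.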
Problem

Research questions and friction points this paper is trying to address.

conformal prediction
distribution shift
healthcare
uncertainty quantification
EEG classification
Innovation

Methods, ideas, or system contributions that make the work stand out.

conformal prediction
distribution shift
personalized calibration
EEG classification
uncertainty quantification
Arjun Chatterjee
University of Illinois Urbana-Champaign, Urbana, IL 61801, USA; PyHealth
Sayeed Sajjad Razin
PyHealth; Bangladesh University of Engineering and Technology
John Wu
University of Illinois Urbana-Champaign, Urbana, IL 61801, USA; PyHealth
Siddhartha Laghuvarapu
University of Illinois Urbana-Champaign, Urbana, IL 61801, USA; PyHealth
Jathurshan Pradeepkumar
PhD Student at University of Illinois Urbana-Champaign
AI for Healthcare; Deep Learning; EEG Analysis; Biosignal Processing
Jimeng Sun
Professor at University of Illinois Urbana-Champaign
AI for healthcare; Machine learning for healthcare; Deep learning for healthcare