🤖 AI Summary
This work addresses the sharp drop in coverage commonly observed when standard conformal prediction methods face patient distribution shifts in clinical settings. The authors propose a personalized calibration strategy for EEG-based seizure classification that handles both distribution shift and label uncertainty. Built on the conformal prediction framework and integrated into the open-source PyHealth medical AI platform, the method improves empirical coverage by over 20 percentage points with almost no increase in prediction set size, strengthening the robustness and practical utility of conformal prediction in real-world clinical applications.
📝 Abstract
Quantifying uncertainty in clinical predictions is critical for high-stakes diagnosis tasks. Conformal prediction offers a principled approach by providing prediction sets with theoretical coverage guarantees. However, in practice, patient distribution shifts violate the i.i.d. assumptions underlying standard conformal methods, leading to poor coverage in healthcare settings. In this work, we evaluate several conformal prediction approaches on EEG seizure classification, a task with known distribution shift challenges and label uncertainty. We demonstrate that personalized calibration strategies can improve coverage by over 20 percentage points while maintaining comparable prediction set sizes. Our implementation is available via PyHealth, an open-source healthcare AI framework: https://github.com/sunlabuiuc/PyHealth.
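To make the mechanism concrete, here is a minimal sketch of split conformal prediction for classification, the standard baseline the abstract builds on. This is an illustrative toy example with synthetic softmax scores, not the paper's implementation or the PyHealth API: it computes nonconformity scores on a held-out calibration set, takes a finite-sample-corrected quantile, and forms prediction sets on test inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy softmax outputs for a 3-class problem, split into calibration and test.
# In the paper's setting these would come from a seizure classifier on EEG.
n_cal, n_test, n_classes = 500, 200, 3
cal_probs = rng.dirichlet(np.ones(n_classes), size=n_cal)
cal_labels = rng.integers(0, n_classes, size=n_cal)
test_probs = rng.dirichlet(np.ones(n_classes), size=n_test)

alpha = 0.1  # target miscoverage: aim for >= 90% coverage

# Nonconformity score: 1 minus the probability assigned to the true label.
cal_scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Conformal quantile with the (n+1)/n finite-sample correction.
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
qhat = np.quantile(cal_scores, q_level, method="higher")

# Prediction set: every class whose score is below the calibrated threshold.
pred_sets = test_probs >= 1.0 - qhat  # boolean array, shape (n_test, n_classes)
avg_set_size = pred_sets.sum(axis=1).mean()
print(f"threshold qhat = {qhat:.3f}, mean set size = {avg_set_size:.2f}")
```

The coverage guarantee of this procedure assumes calibration and test data are exchangeable; the paper's point is that patient-level distribution shift breaks exactly this assumption, motivating per-patient (personalized) calibration sets instead of one global calibration split.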