Split Conformal Prediction in the Function Space with Neural Operators

📅 2025-09-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
Neural operators lack finite-sample coverage guarantees in infinite-dimensional function spaces. Method: This work extends split conformal prediction to function-valued outputs by (i) constructing a calibratable finite-dimensional proxy problem via discretization mappings, and (ii) introducing a resolution-adaptive, regression-based correction that carries coverage from the discrete approximation to the continuous limit. Contributions/Results: The conformal radius is decomposed into discretization error, calibration error, and model misspecification, enabling a rigorous convergence analysis. A novel diagnostic metric is proposed to quantify predictive degradation in autoregressive settings. Experiments show the method outperforms baselines, including Gaussian processes and Bayesian neural networks, in calibration stability, coverage accuracy, and super-resolution tasks, with reduced coverage variability and greater robustness to resolution changes.

📝 Abstract
Uncertainty quantification for neural operators remains an open problem in the infinite-dimensional setting due to the lack of finite-sample coverage guarantees over functional outputs. While conformal prediction offers finite-sample guarantees in finite-dimensional spaces, it does not directly extend to function-valued outputs. Existing approaches (Gaussian processes, Bayesian neural networks, and quantile-based operators) require strong distributional assumptions or yield conservative coverage. This work extends split conformal prediction to function spaces in two steps. We first establish finite-sample coverage guarantees in a finite-dimensional space using a discretization map on the output function space. These guarantees are then lifted to the function space by considering the asymptotic convergence as the discretization is refined. To characterize the effect of resolution, we decompose the conformal radius into discretization, calibration, and misspecification components. This decomposition motivates a regression-based correction to transfer calibration across resolutions. Additionally, we propose two diagnostic metrics (conformal ensemble score and internal agreement) to quantify forecast degradation in autoregressive settings. Empirical results show that our method maintains calibrated coverage with less variation under resolution shifts and achieves better coverage in super-resolution tasks.
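The finite-dimensional step of the abstract can be illustrated with a minimal sketch: discretize the output functions on a grid, compute a sup-norm nonconformity score on a calibration split, and take the conformal quantile as the band radius. This is an assumption-laden toy (synthetic sine data, a fixed "operator" prediction, sup-norm score, no regression-based resolution correction), not the paper's exact procedure.

```python
import numpy as np

def conformal_radius(scores, alpha):
    """Split-conformal quantile: the ceil((n+1)(1-alpha))-th smallest
    calibration score, which gives finite-sample marginal coverage."""
    n = len(scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))
    return np.sort(scores)[min(k, n) - 1]

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 64)                 # discretization of the output domain
# Hypothetical calibration set: noisy true functions vs. one operator prediction
y_true = np.sin(2 * np.pi * grid) + 0.1 * rng.standard_normal((200, 64))
y_pred = np.sin(2 * np.pi * grid)                # stand-in for a neural-operator output

scores = np.max(np.abs(y_true - y_pred), axis=1)  # sup-norm nonconformity score
radius = conformal_radius(scores, alpha=0.1)

# The band y_pred(x) ± radius then covers at least (1 - alpha) of functions
coverage = np.mean(scores <= radius)
```

Refining `grid` changes the discretization component of the radius; the paper's regression-based correction is what transfers the calibrated radius across such resolution changes.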
Problem

Research questions and friction points this paper is trying to address.

Extending split conformal prediction to function spaces
Providing finite-sample coverage guarantees for neural operators
Addressing uncertainty quantification in infinite-dimensional settings
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends split conformal prediction to function spaces
Decomposes conformal radius into three components
Proposes regression-based correction across resolutions