🤖 AI Summary
Data-driven polynomial chaos expansion (PCE) surrogate models lack reliable predictive uncertainty quantification. Method: This paper proposes a tightly integrated interval estimation framework combining jackknife conformal prediction with PCE. Leveraging the linear structure of PCE regression, we derive closed-form analytical expressions for leave-one-out residuals and predictions—eliminating the need for repeated model retraining or an independent calibration dataset. Contribution/Results: The method achieves both computational efficiency and data economy. Extensive validation on benchmark problems confirms the statistical validity and coverage accuracy of the resulting prediction intervals. Furthermore, it quantitatively characterizes how training sample size influences interval width and reliability. Crucially, this work establishes the first retraining-free, theoretically guaranteed, plug-and-play uncertainty quantification framework for PCE—providing rigorous, distribution-free confidence intervals without sacrificing predictive fidelity.
📝 Abstract
This work introduces a method to equip data-driven polynomial chaos expansion surrogate models with intervals that quantify the predictive uncertainty of the surrogate. To that end, we integrate jackknife-based conformal prediction into regression-based polynomial chaos expansions. The jackknife algorithm uses leave-one-out residuals to generate predictive intervals around the predictions of the polynomial chaos surrogate. The jackknife+ extension additionally requires leave-one-out model predictions. The key to efficient implementation is to leverage the linearity of the polynomial chaos regression model, so that leave-one-out residuals and, if necessary, leave-one-out model predictions can be computed with analytical, closed-form expressions, thus eliminating the need for repeated model re-training. In addition to the efficient computation of the predictive intervals, a significant advantage of this approach is its data efficiency, as it requires no hold-out dataset for prediction interval calibration, thus allowing the entire dataset to be used for model training. The conformalized polynomial chaos expansion method is validated on several benchmark models, where the impact of training data volume on the predictive intervals is additionally investigated.