🤖 AI Summary
This work addresses the challenge of constructing compact, reliable prediction sets for multivariate regression in finite-sample settings. Existing conformal prediction methods either rely on strong geometric assumptions or sacrifice computational efficiency and volume optimality. We propose a volume-oriented conformal prediction framework built on a volume-sensitive loss function, yielding adaptive nonconformity scores compatible with arbitrary norm-ball structures, including single- and mixed-norm configurations. To our knowledge, this is the first approach enabling end-to-end joint optimization of both the predictive model and the shape of the uncertainty set. The method guarantees rigorous finite-sample coverage while significantly reducing prediction set volume, and it is more computationally efficient than state-of-the-art flexible alternatives. Extensive experiments on multiple real-world datasets demonstrate its effectiveness and robustness.
📝 Abstract
Conformal prediction provides a principled framework for constructing predictive sets with finite-sample validity. While much of the literature focuses on univariate response variables, existing multivariate methods either impose rigid geometric assumptions or rely on flexible but computationally expensive approaches that do not explicitly optimize prediction set volume. We propose an optimization-driven framework based on a novel loss function that directly learns minimum-volume covering sets while ensuring valid coverage. This formulation naturally induces a new nonconformity score for conformal prediction, one that adapts to the residual distribution and the covariates. Our approach optimizes over prediction sets defined by arbitrary norm balls, including single- and multi-norm formulations. By jointly optimizing both the predictive model and its predictive uncertainty, we obtain prediction sets that are tight, informative, and computationally efficient, as demonstrated in our experiments on real-world datasets.
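To make the norm-ball construction concrete, here is a minimal sketch of the standard split-conformal recipe the abstract builds on: nonconformity scores are residual norms, and a finite-sample-valid quantile of the calibration scores gives the ball radius. This is an illustration under simplifying assumptions (a fixed ℓ₂ ball, an oracle mean predictor, synthetic Gaussian residuals), not the paper's volume-optimized score or jointly learned set shape.

```python
# Minimal split-conformal sketch for multivariate regression with a
# norm-ball prediction set. Illustrative only: the paper's adaptive,
# volume-optimized score is replaced by a plain l2 residual norm.
import numpy as np


def conformal_radius(residuals, alpha, ord=2):
    """Return the ball radius giving finite-sample 1 - alpha coverage.

    residuals: (n, d) array of calibration residuals y - f(x).
    Uses the conformal quantile level ceil((n + 1) * (1 - alpha)) / n.
    """
    scores = np.linalg.norm(residuals, ord=ord, axis=1)
    n = len(scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))
    return np.sort(scores)[min(k, n) - 1]


# Toy example: 2-D responses; the (hypothetical) model is exact in mean,
# so residuals are pure Gaussian noise.
rng = np.random.default_rng(0)
n_cal, n_test, d = 500, 200, 2
res_cal = rng.normal(scale=1.0, size=(n_cal, d))
r = conformal_radius(res_cal, alpha=0.1)

# Prediction set for a new x is the l2 ball of radius r around f(x);
# a test point is covered when its residual norm is at most r.
res_test = rng.normal(scale=1.0, size=(n_test, d))
covered = np.linalg.norm(res_test, axis=1) <= r
print(f"radius = {r:.3f}, empirical coverage = {covered.mean():.3f}")
```

Under exchangeability of calibration and test residuals, this construction covers with probability at least 1 - alpha regardless of the residual distribution; the empirical coverage above should land near the nominal 90%.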