🤖 AI Summary
Traditional polynomial chaos expansion (PCE) suffers from the "curse of dimensionality": the number of basis functions grows exponentially with the input dimension, severely limiting scalability in high-dimensional uncertainty quantification. To address this, the authors propose DeepPCE, a deep learning–enhanced PCE framework that integrates ideas from probabilistic circuits into the PCE formalism. DeepPCE employs differentiable, hierarchical compositions of orthogonal polynomial bases, enabling deep modeling without explicitly constructing high-dimensional basis functions. Crucially, it preserves the statistical interpretability of PCE and supports closed-form analytical computation of key uncertainty metrics, including means, variances, covariances, and Sobol sensitivity indices, without Monte Carlo sampling. Experiments show that DeepPCE matches multi-layer perceptrons (MLPs) in predictive accuracy while scaling substantially better to high-dimensional inputs, offering an efficient and interpretable approach to uncertainty quantification in physics-based simulations.
📝 Abstract
Polynomial chaos expansion (PCE) is a classical and widely used surrogate modeling technique in physical simulation and uncertainty quantification. By taking a linear combination of a set of basis polynomials, orthonormal with respect to the distribution of the uncertain input parameters, PCE enables tractable inference of key statistical quantities, such as (conditional) means, variances, covariances, and Sobol sensitivity indices, which are essential for understanding the modeled system and identifying influential parameters and their interactions. Because the number of basis functions grows exponentially with the number of parameters, PCE does not scale well to high-dimensional problems. We address this challenge by combining PCE with ideas from probabilistic circuits, resulting in the deep polynomial chaos expansion (DeepPCE), a deep generalization of PCE that scales effectively to high-dimensional input spaces. DeepPCE achieves predictive performance comparable to that of multi-layer perceptrons (MLPs), while retaining PCE's ability to compute exact statistical inferences via simple forward passes.
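To make the abstract's central property concrete, here is a minimal one-dimensional sketch of classical PCE (not the DeepPCE architecture itself): a function of a standard normal input is expanded in probabilists' Hermite polynomials, and the mean and variance are then read off in closed form from the expansion coefficients, with no Monte Carlo estimation. The test function `f(x) = x³ + x` and all parameter choices are illustrative assumptions, not taken from the paper.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval

# Illustrative 1-D example: expand f(x) = x^3 + x in probabilists'
# Hermite polynomials He_n, which are orthogonal w.r.t. N(0, 1).
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)   # samples of the uncertain input
y = x**3 + x                    # "model" evaluations at those samples

degree = 4
# Design matrix: column n holds He_n(x); coefficients via least squares.
Phi = np.column_stack(
    [hermeval(x, np.eye(degree + 1)[n]) for n in range(degree + 1)]
)
coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Closed-form statistics from the coefficients alone:
#   mean     = c_0
#   variance = sum_{n>=1} c_n^2 * n!   (since E[He_n^2] = n! under N(0,1))
norms = np.array([math.factorial(n) for n in range(degree + 1)])
mean = coeffs[0]
variance = float(np.sum(coeffs[1:] ** 2 * norms[1:]))

print(mean, variance)  # variance of x^3 + x under N(0,1) is exactly 22
```

Since `x³ + x = He₃(x) + 4·He₁(x)`, the recovered coefficients are `c₁ = 4` and `c₃ = 1`, giving variance `16·1! + 1·3! = 22`, matching the exact value `E[(x³ + x)²] = 15 + 6 + 1 = 22`. The same bookkeeping generalizes to multivariate tensor-product bases, which is exactly where the exponential blow-up the paper addresses appears.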