🤖 AI Summary
To address the computational intractability of Gaussian process operators (GPOs) in high-dimensional parameter spaces and on large-scale datasets, this paper proposes a scalable GPO framework for probabilistic operator learning of parametric partial differential equations (PDEs). Methodologically, it integrates three techniques: (1) nearest-neighbor local kernel approximation in the spatial domain coupled with sparse kernel design in parameter space; (2) structured Kronecker factorization to reduce the complexity of covariance matrix inversion; and (3) operator-aware kernel structures with expressive mean functions derived from neural operator architectures. The framework achieves both computational scalability and high predictive fidelity. Experiments on canonical nonlinear PDEs, including Navier–Stokes, wave advection, Darcy flow, and Burgers' equations, demonstrate high accuracy and robustness across discretization scales. The approach significantly improves the feasibility and precision of uncertainty quantification for large-scale physical systems.
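The summary's first ingredient, nearest-neighbor local kernel approximation, is not spelled out here; the paper's exact formulation may differ. As a general illustration of the idea, the sketch below predicts at a query point using only its `k` nearest training points, so each query costs O(k³) rather than the O(n³) of a full GP solve. The function name, RBF kernel, and hyperparameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def local_gp_predict(X_train, y_train, x_star, k=16, lengthscale=1.0, noise=1e-4):
    """Illustrative nearest-neighbor GP: condition only on the k nearest
    training points, so each query solves a k-by-k system instead of n-by-n."""
    # Select the k training points closest to the query location.
    d2 = np.sum((X_train - x_star) ** 2, axis=1)
    idx = np.argsort(d2)[:k]
    Xk, yk = X_train[idx], y_train[idx]

    def rbf(A, B):
        # Squared-exponential kernel (an assumed choice for this sketch).
        D2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
        return np.exp(-0.5 * D2 / lengthscale**2)

    K = rbf(Xk, Xk) + noise * np.eye(k)       # local covariance, jittered
    k_star = rbf(Xk, x_star[None, :]).ravel() # cross-covariance to query
    mean = k_star @ np.linalg.solve(K, yk)
    var = 1.0 - k_star @ np.linalg.solve(K, k_star) + noise
    return mean, var
```

The accuracy trade-off the abstract mentions shows up here directly: points outside the neighborhood contribute nothing to the prediction, which is what the operator-aware kernels and neural-operator mean functions are meant to compensate for.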
📝 Abstract
Operator learning offers a powerful paradigm for solving parametric partial differential equations (PDEs), but scaling probabilistic neural operators such as the recently proposed Gaussian Process Operators (GPOs) to high-dimensional, data-intensive regimes remains a significant challenge. In this work, we introduce a novel, scalable GPO, which capitalizes on sparsity, locality, and structural information through judicious kernel design. Addressing the fundamental limitation of cubic computational complexity, our method leverages nearest-neighbor-based local kernel approximations in the spatial domain, sparse kernel approximation in the parameter space, and structured Kronecker factorizations to enable tractable inference on large-scale datasets and high-dimensional inputs. While local approximations often introduce accuracy trade-offs due to limited kernel interactions, we overcome this by embedding operator-aware kernel structures and employing expressive, task-informed mean functions derived from neural operator architectures. Through extensive evaluations on a broad class of nonlinear PDEs (including Navier-Stokes, wave advection, Darcy flow, and Burgers' equations), we demonstrate that our framework consistently achieves high accuracy across varying discretization scales. These results underscore the potential of our approach to bridge the gap between scalability and fidelity in GPOs, offering a compelling foundation for uncertainty-aware modeling in complex physical systems.
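The structured Kronecker factorization mentioned in the abstract is a standard device for taming cubic-cost covariance inversion, though the paper's specific construction is not given here. As a hedged sketch: if the covariance factorizes as K₁ ⊗ K₂ over a grid, then (K₁ ⊗ K₂ + σ²I)⁻¹y can be computed from the eigendecompositions of the small factors in O(n₁³ + n₂³) rather than O((n₁n₂)³). The function below is an illustrative NumPy implementation of that general identity, not the authors' code.

```python
import numpy as np

def kron_gp_solve(K1, K2, sigma2, y):
    """Solve (K1 kron K2 + sigma2*I) @ alpha = y using only the
    eigendecompositions of the two small Kronecker factors."""
    n1, n2 = K1.shape[0], K2.shape[0]
    lam1, V1 = np.linalg.eigh(K1)  # K1 = V1 diag(lam1) V1^T
    lam2, V2 = np.linalg.eigh(K2)  # K2 = V2 diag(lam2) V2^T
    Y = y.reshape(n1, n2)
    # Rotate into the joint eigenbasis: (V1^T kron V2^T) y,
    # using the row-major identity kron(A, B) @ vec(Y) = vec(A @ Y @ B.T).
    Z = V1.T @ Y @ V2
    # The rotated system is diagonal with entries lam1_i * lam2_j + sigma2.
    Z = Z / (np.outer(lam1, lam2) + sigma2)
    # Rotate back: (V1 kron V2) z.
    return (V1 @ Z @ V2.T).ravel()
```

The same reshape trick extends to more than two factors, which is what makes Kronecker structure attractive for tensor-product discretizations of the kind that arise in parametric PDE grids.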