🤖 AI Summary
This work addresses the prohibitive computational cost of classical probabilistic Richardson extrapolation in high-dimensional settings, where the curse of dimensionality causes the number of required simulations to grow super-exponentially in $d$, the number of tolerance parameters of the numerical method. To overcome this limitation, the authors propose a sparse probabilistic Richardson extrapolation framework that reformulates numerical computation as an extrapolation problem with respect to tolerance parameters. By incorporating a sparsity assumption, the method reduces the effective dimensionality of the extrapolation task and integrates multi-fidelity simulation with optimal experimental design. The resulting approach substantially decreases the number of simulations needed in high-dimensional settings while preserving accuracy, and is simpler than its predecessor, theoretically well-founded, and empirically effective.
📝 Abstract
Almost every numerical task can be cast as extrapolation with respect to the fidelity or tolerance parameters of a consistent numerical method. This perspective enables probabilistic uncertainty quantification and optimal experimental design functionality to be deployed, and also unlocks the potential for the convergence of numerical methods to be accelerated. Recent work established Probabilistic Richardson Extrapolation as a proof-of-concept, demonstrating how parallel multi-fidelity simulation can be used to accelerate simulation from a whole-heart model. However, the number of simulations was required to increase super-exponentially in $d$, the number of tolerance parameters appearing in the numerical method. This paper develops a refined notion of 'extrapolation dimension', drastically reducing this simulation requirement when multiple tolerance parameters feature in the numerical method. Sparsity-exploiting methodology is developed that is simultaneously simpler and more powerful compared to earlier work, and this is accompanied by sharp theoretical guarantees and substantial empirical support.
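To make the extrapolation perspective concrete, the following is a minimal sketch of *classical* (non-probabilistic) Richardson extrapolation with a single tolerance parameter $h$: a crude numerical estimate $y(h)$ is computed at several step sizes, a polynomial model in $h$ is fit through them, and the model is evaluated at $h = 0$ to accelerate convergence. This is an illustrative example only, not the paper's sparse probabilistic method; the function names and step sizes are chosen for the demonstration.

```python
import numpy as np

def forward_difference(f, x, h):
    """First-order accurate derivative estimate: error is O(h)."""
    return (f(x + h) - f(x)) / h

def richardson(f, x, hs):
    """Extrapolate forward-difference estimates to h -> 0.

    Fits y(h) = y(0) + c1*h + c2*h^2 + ... exactly through the
    estimates at the step sizes `hs`, then returns the fitted
    value at h = 0 (the constant coefficient).
    """
    ys = [forward_difference(f, x, h) for h in hs]
    coeffs = np.polyfit(hs, ys, deg=len(hs) - 1)
    return coeffs[-1]  # polyfit orders coefficients highest-degree first

# d/dx sin(x) at x = 1 is cos(1) ~= 0.540302
hs = [0.4, 0.2, 0.1]
crude = forward_difference(np.sin, 1.0, 0.1)   # error ~ 4e-2
accelerated = richardson(np.sin, 1.0, hs)      # error ~ 3e-4
```

With $d$ tolerance parameters the analogous fit requires simulations on a grid of parameter combinations, which is the super-exponential cost the paper's sparsity assumption is designed to avoid.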