Preconditioned Additive Gaussian Processes with Fourier Acceleration

πŸ“… 2025-04-01
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Gaussian processes (GPs) suffer from cubic computational complexity in large-scale, high-dimensional settings due to dense covariance matrix operations. To address this, we propose a matrix-free additive GP framework leveraging the non-equispaced fast Fourier transform (NFFT). Our method exploits additive kernel structures to model low-order feature interactions and employs the NFFT to perform kernel matrix–vector products in near-linear time. We further design a hyperparameter-aware adaptive preconditioner that substantially accelerates conjugate gradient solvers and hyperparameter optimization. Evaluated on multiple real-world datasets, the approach performs matrix–vector products in O(N log N) time, trains 3–5× faster than standard GPs, and matches or exceeds their predictive accuracy. This work establishes an efficient, scalable paradigm for large-scale uncertainty quantification.

πŸ“ Abstract
Gaussian processes (GPs) are crucial in machine learning for quantifying uncertainty in predictions. However, their associated covariance matrices, defined by kernel functions, are typically dense and large-scale, posing significant computational challenges. This paper introduces a matrix-free method that utilizes the Non-equispaced Fast Fourier Transform (NFFT) to achieve nearly linear complexity in the multiplication of kernel matrices and their derivatives with vectors for a predetermined accuracy level. To address high-dimensional problems, we propose an additive kernel approach. Each sub-kernel in this approach captures lower-order feature interactions, allowing for the efficient application of the NFFT method and potentially increasing accuracy across various real-world datasets. Additionally, we implement a preconditioning strategy that accelerates hyperparameter tuning, further improving the efficiency and effectiveness of GPs.
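The core idea in the abstract can be sketched in a few lines: an additive kernel is a sum of low-dimensional sub-kernels, so a kernel matrix–vector product decomposes into per-sub-kernel products that never require storing the full N×N matrix, and the resulting linear system is solved iteratively. The sketch below is illustrative, not the paper's implementation: it uses 1D RBF sub-kernels (one per feature) as a hypothetical additive structure, and a dense per-feature product stands in for the paper's NFFT-based fast summation; the paper's adaptive preconditioner is omitted.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

# Toy data; sizes and hyperparameters are illustrative, not from the paper.
rng = np.random.default_rng(0)
N, D = 200, 6
X = rng.standard_normal((N, D))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)

lengthscale, noise = 1.0, 0.1

def subkernel_matvec(xd, v):
    # (K_d v) for a 1D RBF sub-kernel on feature column xd.
    # The paper replaces this dense product with an NFFT-based fast
    # summation; a small dense stand-in keeps the sketch short.
    diff = (xd[:, None] - xd[None, :]) / lengthscale
    return np.exp(-0.5 * diff**2) @ v

def matvec(v):
    # (sum_d K_d + noise * I) v, without ever forming the full kernel matrix.
    out = noise * v
    for d in range(D):
        out = out + subkernel_matvec(X[:, d], v)
    return out

A = LinearOperator((N, N), matvec=matvec)
alpha, info = cg(A, y)  # the paper additionally applies an adaptive preconditioner
assert info == 0        # 0 means CG converged
```

Because CG only ever calls `matvec`, swapping the dense stand-in for an NFFT-accelerated product changes the cost per iteration from O(N²) to near-linear without touching the solver.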
Problem

Research questions and friction points this paper is trying to address.

Reducing computational complexity in Gaussian process kernel matrix operations
Enhancing efficiency for high-dimensional problems via additive kernels
Accelerating hyperparameter tuning with preconditioning strategies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Matrix-free method with NFFT for near-linear complexity
Additive kernel approach for high-dimensional problems
Preconditioning strategy accelerates hyperparameter tuning
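The hyperparameter-tuning bullet refers to optimizing the GP log marginal likelihood, whose gradient contains a trace term tr(K⁻¹ ∂K/∂θ). In matrix-free pipelines this term is commonly estimated with Hutchinson's stochastic trace estimator, with each K⁻¹ application done by (preconditioned) CG. The sketch below illustrates that standard technique, not the paper's exact estimator; the kernel matrix and its derivative are arbitrary SPD stand-ins, and `np.linalg.solve` stands in for the CG solves.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100

# Stand-ins for the kernel matrix K(theta) and its derivative dK/dtheta;
# any symmetric positive definite K works for the illustration.
A = rng.standard_normal((N, N))
K = A @ A.T / N + np.eye(N)
dK = 0.5 * np.eye(N)

# Hutchinson estimator: tr(K^-1 dK) ~= (1/m) sum_i z_i^T K^-1 (dK z_i)
# with Rademacher probes z_i. In the matrix-free setting each solve
# K^-1 (dK z_i) would be a preconditioned CG solve.
m = 64
est = 0.0
for _ in range(m):
    z = rng.choice([-1.0, 1.0], size=N)
    est += z @ np.linalg.solve(K, dK @ z)
est /= m

exact = np.trace(np.linalg.solve(K, dK))  # reference value for this toy size
```

Since every probe reduces to matrix–vector products and linear solves, the same NFFT matvec and preconditioner that accelerate training also accelerate these gradient estimates.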
πŸ”Ž Similar Papers
No similar papers found.