Fast kernel methods: Sobolev, physics-informed, and additive models

📅 2025-09-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Kernel methods suffer from poor scalability on large datasets due to their cubic $O(n^3)$ computational complexity. This paper introduces a scalable kernel regression framework with $O(n \log n)$ complexity, built on the non-uniform fast Fourier transform (NUFFT). By coupling Fourier-domain kernel representations with GPU-accelerated parallel computation, the method applies NUFFT to kernel regression exactly and memory-efficiently. The framework unifies support for Sobolev kernels, physics-informed kernels, and additive kernels while preserving statistically optimal minimax convergence rates, and achieves minute-scale training on datasets with tens of billions of samples. Experiments demonstrate a 100–1000× speedup over conventional kernel methods without sacrificing accuracy, substantially easing the scalability bottleneck that has long hindered practical deployment of kernel methods at scale.

📝 Abstract
Kernel methods are powerful tools in statistical learning, but their cubic complexity in the sample size n limits their use on large-scale datasets. In this work, we introduce a scalable framework for kernel regression with O(n log n) complexity, fully leveraging GPU acceleration. The approach is based on a Fourier representation of kernels combined with non-uniform fast Fourier transforms (NUFFT), enabling exact, fast, and memory-efficient computations. We instantiate our framework in three settings: Sobolev kernel regression, physics-informed regression, and additive models. Where minimax convergence rates are known, the proposed estimators are shown to achieve them, consistent with classical kernel theory. Empirical results demonstrate that our methods can process up to tens of billions of samples within minutes, providing both statistical accuracy and computational scalability. These contributions establish a flexible approach, paving the way for the routine application of kernel methods in large-scale learning tasks.
Problem

Research questions and friction points this paper is trying to address.

Scalable kernel regression for large datasets
Achieving minimax convergence rates efficiently
Enabling GPU-accelerated exact kernel computations
Innovation

Methods, ideas, or system contributions that make the work stand out.

GPU-accelerated scalable kernel regression framework
Fourier representation with NUFFT for efficient computation
Exact, fast, and memory-efficient computation on large datasets
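To make the core idea concrete, here is a minimal sketch of Fourier-domain kernel ridge regression on a *uniform* periodic grid, where a stationary kernel matrix is circulant and the usual O(n³) linear solve collapses to FFTs with O(n log n) cost. The paper's contribution is handling non-uniform samples via NUFFT on GPUs; the uniform-grid simplification, the Sobolev-type spectrum `1/(1+ω²)^s`, and all function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sobolev_spectrum(n, s=1.0):
    # Assumed eigenvalues of a periodic Sobolev-type kernel: 1/(1+w^2)^s,
    # indexed by the integer frequencies of an n-point DFT.
    w = np.fft.fftfreq(n) * n
    return 1.0 / (1.0 + w**2) ** s

def fft_kernel_ridge(y, lam=1e-3, s=1.0):
    # Solve (K + lam*I) alpha = y and return the fit K @ alpha.
    # K is circulant on a uniform periodic grid, so it is diagonalized
    # by the DFT and the solve is elementwise in the Fourier domain.
    n = len(y)
    k_hat = sobolev_spectrum(n, s)
    y_hat = np.fft.fft(y)
    alpha_hat = y_hat / (k_hat + lam)   # diagonal solve, O(n)
    f_hat = k_hat * alpha_hat           # apply K, O(n)
    return np.fft.ifft(f_hat).real     # total cost dominated by FFTs

# Usage: denoise a noisy sine sampled on a uniform grid.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 512, endpoint=False)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(512)
f = fft_kernel_ridge(y, lam=1e-2, s=2.0)
```

The resulting filter gain `k_hat / (k_hat + lam)` passes low frequencies nearly unchanged and shrinks high-frequency noise, which is exactly the spectral-filtering view of kernel ridge regression; replacing the plain FFTs with NUFFT calls is what lifts this scheme to non-uniform sample locations.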