Scalable Random Wavelet Features: Efficient Non-Stationary Kernel Approximation with Convergence Guarantees

๐Ÿ“… 2026-02-01
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Existing scalable kernel methods struggle to model non-stationary processes, whose statistical properties vary with input location. This work proposes the Random Wavelet Features (RWF) framework, which extends random feature methods to the non-stationary setting for the first time. By sampling from wavelet families to construct explicit feature maps, RWF leverages the localization and multi-resolution properties of wavelets to enable efficient, scalable approximation of non-stationary kernels. A theoretical analysis guarantees that the resulting kernel approximations are positive definite, unbiased, and uniformly convergent. Empirical evaluations show that RWF outperforms conventional stationary random feature approaches on multiple synthetic and real-world datasets, achieving a better accuracy-efficiency trade-off than more complex models such as deep Gaussian processes.

๐Ÿ“ Abstract
Modeling non-stationary processes, where statistical properties vary across the input domain, is a critical challenge in machine learning; yet most scalable methods rely on a simplifying assumption of stationarity. This forces a difficult trade-off: use expressive but computationally demanding models like Deep Gaussian Processes, or scalable but limited methods like Random Fourier Features (RFF). We close this gap by introducing Random Wavelet Features (RWF), a framework that constructs scalable, non-stationary kernel approximations by sampling from wavelet families. By harnessing the inherent localization and multi-resolution structure of wavelets, RWF generates an explicit feature map that captures complex, input-dependent patterns. Our framework provides a principled way to generalize RFF to the non-stationary setting and comes with a comprehensive theoretical analysis, including positive definiteness, unbiasedness, and uniform convergence guarantees. We demonstrate empirically on a range of challenging synthetic and real-world datasets that RWF outperforms stationary random features and offers a compelling accuracy-efficiency trade-off against more complex models, unlocking scalable and expressive kernel methods for a broad class of real-world non-stationary problems.
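The abstract describes the general recipe: sample parameters from a wavelet family, build an explicit feature map, and approximate the kernel as an inner product of feature vectors. The paper's actual sampling distributions and mother wavelet are not reproduced here; the minimal sketch below illustrates the idea for 1-D inputs, assuming a Ricker ("Mexican hat") mother wavelet and arbitrary log-normal/uniform priors over scales and translations (all of these choices are hypothetical, not taken from the paper).

```python
import numpy as np

def ricker(u):
    # Ricker ("Mexican hat") mother wavelet, evaluated elementwise.
    return (1.0 - u**2) * np.exp(-u**2 / 2.0)

def random_wavelet_features(X, D=500, rng=None):
    """Map 1-D inputs X of shape (n,) to an explicit D-dimensional
    feature space by sampling random scales and translations for the
    mother wavelet. The kernel is then approximated as the inner
    product of feature maps: K(x, x') ~ phi(x) . phi(x')."""
    rng = np.random.default_rng(rng)
    scales = rng.lognormal(mean=0.0, sigma=1.0, size=D)  # hypothetical scale prior
    shifts = rng.uniform(X.min(), X.max(), size=D)       # hypothetical translation prior
    # Each column is one randomly scaled/translated wavelet evaluated on X.
    Z = ricker((X[:, None] - shifts[None, :]) / scales[None, :])
    return Z / np.sqrt(D)

# Approximate kernel matrix as a Gram matrix of the explicit features.
X = np.linspace(-3.0, 3.0, 200)
Z = random_wavelet_features(X, D=2000, rng=0)
K = Z @ Z.T  # (200, 200), symmetric and positive semi-definite by construction
```

Because the sampled translations tie each feature to an absolute location, the resulting kernel depends on where x and x' sit in the input domain, not only on their difference x - x', which is what makes the approximation non-stationary; a Gram matrix of explicit features is also positive semi-definite by construction, mirroring the positive-definiteness guarantee stated above.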
Problem

Research questions and friction points this paper is trying to address.

non-stationary processes
scalable kernel methods
random features
Gaussian Processes
machine learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Random Wavelet Features
non-stationary kernel approximation
scalable kernel methods
wavelet localization
uniform convergence guarantees
๐Ÿ”Ž Similar Papers
No similar papers found.