Feature maps for the Laplacian kernel and its generalizations

📅 2025-02-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
Conventional Random Fourier Features (RFF) fail for the non-separable Laplacian kernel and its Matérn and exponential power generalizations, because these kernels have heavy-tailed, non-separable spectral distributions. Method: The paper proposes an implementable random feature map, designing heavy-tailed random weight matrices with a weakly coupled structure tailored to these three kernel classes; spectral analysis is combined with structured sampling to obtain efficient, low-bias feature approximations without explicitly forming high-dimensional kernel matrices. Contribution/Results: The method preserves theoretical approximation accuracy while substantially reducing computational cost. Experiments on multiple real-world datasets show predictive performance matching exact kernel methods on classification and regression tasks, while accelerating training by one to two orders of magnitude, addressing a longstanding bottleneck for non-separable, heavy-tailed kernel functions.
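For background on why the spectral distribution is heavy-tailed: the isotropic Laplacian kernel k(x, y) = exp(-γ‖x − y‖₂) has a multivariate Cauchy spectral density (a multivariate t with one degree of freedom), a standard fact from Bochner's theorem. Below is a minimal sketch of the classical uncoupled Monte Carlo feature map built from such a sample; this is the textbook RFF baseline, not the paper's weakly coupled construction, and the function and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplacian_rff(X, n_features, gamma=1.0, rng=rng):
    """Random Fourier features for k(x, y) = exp(-gamma * ||x - y||_2).

    The spectral density of this non-separable kernel is a multivariate
    Cauchy (multivariate t with 1 degree of freedom), sampled here as a
    Gaussian vector divided by an independent |standard normal| scalar.
    """
    n, d = X.shape
    z = rng.standard_normal((n_features, d))
    u = np.abs(rng.standard_normal((n_features, 1)))
    W = gamma * z / u                        # heavy-tailed Cauchy weights
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W.T + b)

# Sanity check: feature inner products approximate the exact kernel.
X = rng.standard_normal((50, 5))
Phi = laplacian_rff(X, n_features=20000)
K_approx = Phi @ Phi.T
r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
K_exact = np.exp(-r)
err = np.max(np.abs(K_approx - K_exact))
```

Because the cosine features are bounded, the Monte Carlo error decays at the usual O(D^{-1/2}) rate even though the weights themselves have no finite mean.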

📝 Abstract
Recent applications of kernel methods in machine learning have seen a renewed interest in the Laplacian kernel, due to its stability with respect to the bandwidth hyperparameter in comparison to the Gaussian kernel, as well as its expressivity being equivalent to that of the neural tangent kernel of deep fully connected networks. However, unlike the Gaussian kernel, the Laplacian kernel is not separable. This poses challenges for techniques to approximate it, especially via the random Fourier features (RFF) methodology and its variants. In this work, we provide random features for the Laplacian kernel and its two generalizations: the Matérn kernel and the exponential power kernel. We provide efficiently implementable schemes to sample weight matrices so that random features approximate these kernels. These weight matrices have a weakly coupled heavy-tailed randomness. Via numerical experiments on real datasets we demonstrate the efficacy of these random feature maps.
Problem

Research questions and friction points this paper is trying to address.

Develop random features for Laplacian kernel
Address non-separability in kernel approximation
Implement efficient sampling for weight matrices
Innovation

Methods, ideas, or system contributions that make the work stand out.

Laplacian kernel feature maps
Random Fourier features methodology
Efficient weight matrix sampling
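The Monte Carlo recipe also extends to the Matérn family: the spectral density of a Matérn kernel with smoothness ν is a rescaled multivariate Student-t with 2ν degrees of freedom, a standard fact from the Gaussian-process literature. A hedged sketch for ν = 3/2 follows (again the classical uncoupled sampler, not the paper's weakly coupled scheme; names are illustrative).

```python
import numpy as np

rng = np.random.default_rng(1)

def matern32_rff(X, n_features, ell=1.0, rng=rng):
    """Random Fourier features for the Matern kernel with nu = 3/2:
    k(r) = (1 + sqrt(3) r / ell) * exp(-sqrt(3) r / ell).

    Its spectral density is a multivariate t with 2*nu = 3 degrees of
    freedom, scaled by 1/ell; sampled as Gaussian * sqrt(3 / chi2_3).
    """
    n, d = X.shape
    z = rng.standard_normal((n_features, d))
    u = rng.chisquare(3, size=(n_features, 1))
    W = z * np.sqrt(3.0 / u) / ell           # heavy-tailed t_3 weights
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W.T + b)

X = rng.standard_normal((40, 4))
Phi = matern32_rff(X, n_features=20000)
K_approx = Phi @ Phi.T
r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
K_exact = (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)
err = np.max(np.abs(K_approx - K_exact))
```

Taking ν = 1/2 (a t with 1 degree of freedom, i.e. Cauchy weights) recovers the Laplacian kernel as a special case, consistent with the paper's framing of the Matérn family as a generalization.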
Sudhendu Ahir
Center for Machine Intelligence and Data Science, IIT Bombay, India
Parthe Pandit
Thakur Family Chair Assistant Professor @ IIT Bombay
Machine learning · Statistics · Optimization · Signal processing