Iteratively reweighted kernel machines efficiently learn sparse functions

๐Ÿ“… 2025-05-13
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work addresses the limited representational capacity of kernel methods for learning high-dimensional sparse functions. We propose a derivative-driven iterative reweighted kernel learning framework: it computes partial derivatives of the kernel predictor with respect to each coordinate to identify influential variables, then dynamically reweights samples and retrains the modelโ€”enabling adaptive discovery of low-dimensional structure and hierarchical polynomials. To our knowledge, this is the first integration of derivative-based sensitivity analysis and iterative reweighting into classical kernel methods, achieving hierarchical structure learning under bounded jump complexity. Theoretically, we prove that the method accurately detects active coordinates with low sample complexity. Empirically, it significantly improves generalization over standard kernel methods on hierarchical polynomial modeling tasks, while preserving strong interpretability and computational efficiency.

Technology Category

Application Category

๐Ÿ“ Abstract
The impressive practical performance of neural networks is often attributed to their ability to learn low-dimensional data representations and hierarchical structure directly from data. In this work, we argue that these two phenomena are not unique to neural networks, and can be elicited from classical kernel methods. Namely, we show that the derivative of the kernel predictor can detect the influential coordinates with low sample complexity. Moreover, by iteratively using the derivatives to reweight the data and retrain kernel machines, one is able to efficiently learn hierarchical polynomials with finite leap complexity. Numerical experiments illustrate the developed theory.
Problem

Research questions and friction points this paper is trying to address.

Detect influential coordinates using kernel predictor derivatives
Learn hierarchical polynomials via iterative reweighting
Achieve low sample complexity with kernel methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Iteratively reweighted kernel machines for sparsity
Kernel derivatives detect influential coordinates efficiently
Hierarchical polynomials learned via iterative reweighting
๐Ÿ”Ž Similar Papers