Kernel manifolds: nonlinear-augmentation dimensionality reduction using reproducing kernel Hilbert spaces

📅 2025-08-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the limited reconstruction accuracy of purely linear dimensionality reduction, this paper proposes a nonlinear-augmentation manifold learning framework grounded in reproducing kernel Hilbert spaces (RKHS). The method extends generic linear dimensionality reduction with a learnable nonlinear correction term in the reconstruction map. The correction is modeled uniformly via kernel methods, so arbitrary nonlinear structure can be imposed (polynomial and radial basis function corrections arise as special cases), and it is learned from the low-dimensional embedding under a least-squares criterion. Theoretically, the reconstruction error is guaranteed to decrease monotonically as the latent dimension increases. Experiments on several benchmark problems show the approach outperforming proper orthogonal decomposition (POD) and recent quadratic manifold (QM) methods in approximation accuracy while keeping training cost relatively low.
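
Concretely, the summary corresponds to a reconstruction map of roughly the following form (a minimal sketch inferred from the abstract; the encoder z = Vᵀx, the ridge penalty λ, and the symbols below are assumptions rather than the paper's exact notation):

```latex
% Kernel-augmented reconstruction: linear decoder plus an RKHS correction g.
% x \in R^N is a full-order state, V \in R^{N \times r} an orthonormal basis,
% z = V^\top x \in R^r the latent encoding.
\[
  x \;\approx\; V z + g(z), \qquad z = V^\top x,
\]
% The correction g is learned from a user-defined RKHS \mathcal{H}_k by
% (regularized) least squares over training snapshots x_1, \dots, x_m:
\[
  \min_{g \in \mathcal{H}_k^{N}} \; \sum_{j=1}^{m}
    \bigl\| x_j - V V^\top x_j - g(V^\top x_j) \bigr\|_2^2
    \;+\; \lambda \, \| g \|_{\mathcal{H}_k}^2 .
\]
```

With a polynomial kernel this recovers feature-map and quadratic-manifold corrections as special cases; with an RBF kernel it yields radial basis function corrections, matching the special cases named in the abstract.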

📝 Abstract
This paper generalizes recent advances on quadratic manifold (QM) dimensionality reduction by developing kernel methods-based nonlinear-augmentation dimensionality reduction. QMs, and more generally feature map-based nonlinear corrections, augment linear dimensionality reduction with a nonlinear correction term in the reconstruction map to overcome approximation accuracy limitations of purely linear approaches. While feature map-based approaches typically learn a least-squares optimal polynomial correction term, we generalize this approach by learning an optimal nonlinear correction from a user-defined reproducing kernel Hilbert space. Our approach allows one to impose arbitrary nonlinear structure on the correction term, including polynomial structure, and includes feature map and radial basis function-based corrections as special cases. Furthermore, our method has relatively low training cost and has monotonically decreasing error as the latent space dimension increases. We compare our approach to proper orthogonal decomposition and several recent QM approaches on data from several example problems.
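
For intuition, here is a minimal NumPy sketch of one way such a kernel-augmented reconstruction could be realized: POD for the linear part and kernel ridge regression with an RBF kernel for the correction. The function names, the ridge parameter lam, and the kernel width gamma are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of kernel-augmented dimensionality reduction:
# POD gives the linear part; kernel ridge regression (RBF kernel)
# fits a nonlinear correction to the linear reconstruction residual.
import numpy as np

def fit_kernel_manifold(X, r, gamma=1.0, lam=1e-8):
    """X: (N, m) snapshot matrix with m samples; r: latent dimension."""
    # Linear part: POD/PCA basis from the leading left singular vectors.
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    V = U[:, :r]                      # (N, r) orthonormal basis
    Z = V.T @ X                       # (r, m) latent encodings
    R = X - V @ Z                     # (N, m) residuals the correction must fit
    # Kernel ridge regression from latent codes to residuals; by the
    # representer theorem the correction is a combination of kernel sections.
    sq = np.sum(Z**2, axis=0)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * Z.T @ Z))  # RBF Gram matrix
    W = np.linalg.solve(K + lam * np.eye(K.shape[0]), R.T)          # (m, N) weights
    return V, Z, W, gamma

def reconstruct(x, V, Z, W, gamma):
    """Encode x linearly, then add the learned kernel correction."""
    z = V.T @ x
    d2 = np.sum((Z - z[:, None])**2, axis=0)
    k = np.exp(-gamma * d2)           # kernel evaluations against training codes
    return V @ z + W.T @ k            # linear reconstruction + nonlinear correction

# Example usage (hypothetical):
#   V, Z, W, gamma = fit_kernel_manifold(X, r=10)
#   x_hat = reconstruct(X[:, 0], V, Z, W, gamma)
```

Swapping the RBF Gram matrix for a polynomial kernel would impose polynomial structure on the correction, mirroring the feature-map and QM special cases the abstract mentions.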
Problem

Research questions and friction points this paper is trying to address.

Generalizing quadratic manifold dimensionality reduction with kernel methods
Learning optimal nonlinear corrections from reproducing kernel Hilbert spaces
Overcoming accuracy limitations of purely linear dimensionality reduction
Innovation

Methods, ideas, or system contributions that make the work stand out.

Kernel-method-based nonlinear-augmentation dimensionality reduction
Least-squares optimal nonlinear correction from a user-defined reproducing kernel Hilbert space
Relatively low training cost with monotonically decreasing error in the latent dimension
Alejandro N. Diaz
S. Scott Collis Fellow in Data Science, Sandia National Laboratories
model reduction · scientific machine learning · computational science · applied mathematics
Jacob T. Needels
Sandia National Laboratories
Irina K. Tezaur
Sandia National Laboratories
Patrick J. Blonigan
Sandia National Laboratories