Learning in Feature Spaces via Coupled Covariances: Asymmetric Kernel SVD and Nyström method

📅 2024-06-13
🏛️ International Conference on Machine Learning
📈 Citations: 1
Influential: 0
🤖 AI Summary
Existing asymmetric kernel singular value decomposition (KSVD) methods rely on finite-dimensional approximations, limiting their ability to handle infinite-dimensional feature maps, and their variational objectives may be unbounded. Method: We propose the Coupled Covariance Eigenproblem (CCE) framework, the first rigorous variational formulation of KSVD in infinite-dimensional Hilbert spaces, unifying asymmetric KSVD with covariance operator theory and accommodating arbitrary non-Mercer, asymmetric kernels. We further derive an asymmetric Nyström method based on coupled adjoint eigenfunctions, overcoming the classical restriction to symmetric kernel approximation or linear SVD modeling. Contribution/Results: Experiments demonstrate that the method significantly outperforms symmetrization-based and linear-SVD baselines across multiple tasks, achieving faster training convergence and improved generalization. This work provides the first empirical validation of the practical utility of asymmetric kernel learning.
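To make the core computation concrete: per the summary, the CCE solution is ultimately obtained from the SVD of the induced asymmetric kernel matrix. The following is a minimal sketch of that pipeline, not the authors' code; the kernel `asym_kernel`, its directed logistic term, and all names and sizes are illustrative assumptions.

```python
import numpy as np

def asym_kernel(X, Z, gamma=0.5):
    """Illustrative non-Mercer, asymmetric kernel (an assumption, not the
    paper's choice): a directed logistic factor breaks the symmetry, so
    k(x, z) != k(z, x) in general."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    directed = X.sum(axis=1)[:, None] - Z.sum(axis=1)[None, :]
    return np.exp(-gamma * sq) / (1.0 + np.exp(-directed))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))    # "row" view of the data
Z = rng.normal(size=(150, 5))    # "column" view (may differ from X)

G = asym_kernel(X, Z)            # induced asymmetric kernel matrix (200 x 150)
U, s, Vt = np.linalg.svd(G, full_matrices=False)

r = 10                           # number of coupled components to keep
scores_rows = U[:, :r] * s[:r]   # embedding coupled to the row samples
scores_cols = Vt[:r].T * s[:r]   # embedding coupled to the column samples
```

Because G is rectangular and non-symmetric, an eigendecomposition (as in KPCA) does not apply; the SVD yields the two coupled sets of directions at once.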

📝 Abstract
In contrast with Mercer kernel-based approaches, as used e.g. in Kernel Principal Component Analysis (KPCA), it was previously shown that Singular Value Decomposition (SVD) inherently relates to asymmetric kernels, and Asymmetric Kernel Singular Value Decomposition (KSVD) has been proposed. However, the existing formulation of KSVD cannot work with infinite-dimensional feature mappings, its variational objective can be unbounded, and it needs further numerical evaluation and exploration towards machine learning. In this work, i) we introduce a new asymmetric learning paradigm based on the coupled covariance eigenproblem (CCE) through covariance operators, allowing infinite-dimensional feature maps. The solution to CCE is ultimately obtained from the SVD of the induced asymmetric kernel matrix, providing links to KSVD. ii) Starting from the integral equations corresponding to a pair of coupled adjoint eigenfunctions, we formalize the asymmetric Nyström method through a finite-sample approximation to speed up training. iii) We provide the first empirical evaluations verifying the practical utility and benefits of KSVD, comparing with methods resorting to symmetrization or linear SVD across multiple tasks.
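For orientation, the "pair of coupled adjoint eigenfunctions" mentioned in ii) refers to the standard coupled (Schmidt-type) equations for a compact operator with an asymmetric kernel. The sketch below uses generic notation with uniform 1/n, 1/m quadrature weights; these normalizations are an assumption here and may differ from the paper's.

```latex
% Coupled integral equations for a singular triplet (\sigma, u, v) of the
% asymmetric kernel \kappa:
\sigma\, u(x) = \int \kappa(x, z)\, v(z)\, \mathrm{d}\rho(z), \qquad
\sigma\, v(z) = \int \kappa(x, z)\, u(x)\, \mathrm{d}\mu(x)

% Finite-sample approximation on samples \{x_i\}_{i=1}^{n} and \{z_j\}_{j=1}^{m}:
\sigma\, \tilde u(x) \approx \frac{1}{m}\sum_{j=1}^{m} \kappa(x, z_j)\, \tilde v(z_j), \qquad
\sigma\, \tilde v(z) \approx \frac{1}{n}\sum_{i=1}^{n} \kappa(x_i, z)\, \tilde u(x_i)

% Evaluated at the samples themselves, this reduces to the SVD of the induced
% kernel matrix G with G_{ij} = \kappa(x_i, z_j):
% G\,\tilde V = \tilde U \Sigma, \qquad G^{\top} \tilde U = \tilde V \Sigma.
```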
Problem

Research questions and friction points this paper is trying to address.

Extend KSVD to infinite-dimensional feature mappings.
Formalize asymmetric Nyström method for faster training.
Empirically validate KSVD against symmetrization and linear SVD.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Asymmetric Kernel SVD for infinite-dimensional feature maps
Coupled covariance eigenproblem for learning paradigm
Asymmetric Nyström method for faster training (see the sketch below)
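As referenced in the last item, here is a minimal sketch of an asymmetric Nyström approximation: take the SVD of a small landmark sub-block and extend the singular vectors to all samples via the coupled equations above. This is a generic reconstruction under stated assumptions (uniform landmark sampling, simple 1/sigma scaling), not the paper's exact algorithm.

```python
import numpy as np

def asymmetric_nystrom(G, p, rng=None):
    """Approximate the top singular pairs of a large asymmetric kernel
    matrix G (n x m) from a p x p landmark sub-block.

    Assumptions (illustrative, not the paper's exact scheme):
    uniform landmark sampling and plain out-of-sample extension.
    """
    rng = np.random.default_rng(rng)
    n, m = G.shape
    rows = rng.choice(n, size=p, replace=False)   # landmark "x" samples
    cols = rng.choice(m, size=p, replace=False)   # landmark "z" samples

    # SVD of the small landmark block: O(p^3) instead of O(n m min(n, m)).
    Us, s, Vst = np.linalg.svd(G[np.ix_(rows, cols)])
    keep = s > 1e-10                              # drop numerically null modes

    # Coupled out-of-sample extension of the adjoint eigenvectors:
    # u(x) ~ (1/sigma) sum_j k(x, z_j) v(z_j), and symmetrically for v.
    U_full = G[:, cols] @ Vst[keep].T / s[keep]   # (n, r) left factors
    V_full = G[rows, :].T @ Us[:, keep] / s[keep] # (m, r) right factors
    return U_full, s[keep], V_full
```

For an n x m kernel matrix, this replaces the full SVD with an O(p^3) factorization plus two matrix products, which is where the training speedup in Nyström-type methods comes from.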