Spectral Truncation Kernels: Noncommutativity in C*-algebraic Kernel Machines

📅 2024-05-28
🏛️ arXiv.org
📈 Citations: 1
✨ Influential: 0
🤖 AI Summary
This paper addresses two persistent challenges in vector-valued reproducing kernel Hilbert space (RKHS) learning: the difficulty of kernel selection and the high computational cost. Methodologically, it introduces a novel class of spectral-truncation-based $C^*$-algebra-valued kernels, the first to explicitly incorporate the multiplicative noncommutativity of the output space into kernel design, thereby overcoming the modeling limitations of conventional separable or commutative kernels. Positive definiteness is ensured by the spectral truncation construction, and a depth-wise extension framework is established. Theoretically, the proposed kernel class achieves a favorable trade-off between representation capacity and computational complexity. Empirically, it demonstrates improved generalization on multi-output regression and functional learning tasks, while substantially reducing the computational overhead of large-scale vector-valued kernel operations.

๐Ÿ“ Abstract
$C^*$-algebra-valued kernels could pave the way for the next generation of kernel machines. To further our fundamental understanding of learning with $C^*$-algebraic kernels, we propose a new class of positive definite kernels based on the spectral truncation. We focus on kernels whose inputs and outputs are vectors or functions and generalize typical kernels by introducing the noncommutativity of the products appearing in the kernels. The noncommutativity induces interactions along the data function domain. We show that it is a governing factor leading to performance enhancement: we can balance the representation power and the model complexity. We also propose a deep learning perspective to increase the representation capacity of spectral truncation kernels. The flexibility of the proposed class of kernels allows us to go beyond previous commutative kernels, addressing two of the foremost issues regarding learning in vector-valued RKHSs, namely the choice of the kernel and the computational cost.
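The noncommutativity the abstract refers to can be illustrated with a minimal NumPy sketch, assuming (as in the noncommutative-geometry literature on spectral truncation) that truncation maps a function on the circle to the Toeplitz matrix of its Fourier coefficients; the paper's exact construction may differ. Pointwise products of functions commute, but their truncations generally do not, which is the source of the interactions along the data function domain.

```python
import numpy as np

def fourier_coeff(f, k, m=512):
    # k-th Fourier coefficient of f on [0, 2*pi), via the DFT quadrature rule
    t = np.linspace(0.0, 2 * np.pi, m, endpoint=False)
    return np.mean(f(t) * np.exp(-1j * k * t))

def spectral_truncation(f, n):
    # Toeplitz compression of the multiplication operator by f onto the
    # span of e^{i*0*t}, ..., e^{i*(n-1)*t}: T(f)[j, k] = fhat(j - k)
    c = {k: fourier_coeff(f, k) for k in range(-(n - 1), n)}
    return np.array([[c[j - k] for k in range(n)] for j in range(n)])

n = 4
Tf = spectral_truncation(np.cos, n)
Tg = spectral_truncation(np.sin, n)
# cos(t) * sin(t) = sin(t) * cos(t), but the truncations do not commute:
comm = Tf @ Tg - Tg @ Tf
print(np.linalg.norm(comm))  # nonzero: noncommutativity induced by truncation
```

For these two tridiagonal truncations the commutator is supported on the corner entries, a boundary effect of cutting off the Fourier modes; it vanishes as $n \to \infty$, so the truncation level acts as the knob balancing representation power against model complexity.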
Problem

Research questions and friction points this paper is trying to address.

Develops $C^*$-algebra-valued kernels as a foundation for next-generation kernel machines.
Introduces noncommutativity into kernel products to induce interactions along the data function domain.
Addresses kernel choice and computational cost, two foremost issues in vector-valued RKHS learning.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Spectral truncation yields a new class of positive definite kernels.
Noncommutativity of the truncated products governs the balance between representation power and model complexity.
A deep learning perspective (depth-wise extension) further increases the representation capacity.
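To see how truncations can be assembled into an operator-valued positive definite kernel, the sketch below uses the generic feature-map form $K(f,g) = \Phi(f)\Phi(g)^*$ with $\Phi$ a hypothetical Toeplitz truncation, a form that is positive definite for any feature map. This is an illustrative stand-in, not the paper's kernel definition.

```python
import numpy as np

def trunc(f, n, m=512):
    # Toeplitz spectral truncation of f on the circle (illustrative assumption)
    t = np.linspace(0.0, 2 * np.pi, m, endpoint=False)
    fhat = lambda k: np.mean(f(t) * np.exp(-1j * k * t))
    return np.array([[fhat(j - k) for k in range(n)] for j in range(n)])

def K(f, g, n=4):
    # Matrix-valued kernel of the form Phi(f) Phi(g)^*, which is positive
    # definite for any feature map Phi; here Phi is the truncation above.
    return trunc(f, n) @ trunc(g, n).conj().T

funcs = [np.cos, np.sin, lambda t: np.cos(2 * t)]
n = 4
# Block Gram matrix G[i, j] = K(f_i, f_j); positive definiteness of the
# kernel is equivalent to G being positive semidefinite as one big matrix.
G = np.block([[K(f, g, n) for g in funcs] for f in funcs])
eig = np.linalg.eigvalsh((G + G.conj().T) / 2)
print(eig.min() >= -1e-10)  # True: the block Gram matrix is PSD
```

Because $G$ factors as $A A^*$ with $A$ the stacked feature matrices, its eigenvalues are nonnegative up to floating-point error, so the check passes regardless of which input functions are chosen.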