🤖 AI Summary
This paper addresses two challenges in learning with vector-valued reproducing kernel Hilbert spaces (RKHSs): the difficulty of kernel selection and the high computational cost. Methodologically, it introduces a novel class of spectral-truncation-based $C^*$-algebra-valued kernels, the first to explicitly incorporate the multiplicative noncommutativity of the output space into kernel design, thereby overcoming the modeling limitations of conventional separable or commutative kernels. Positive definiteness is ensured via spectral truncation, and a depth-wise extension framework is established. Theoretically, the proposed kernel class achieves a favorable trade-off between representation capacity and computational complexity. Empirically, it improves generalization on multi-output regression and functional learning tasks while substantially reducing the computational overhead of large-scale vector-valued kernel operations.
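For concreteness, the standard spectral truncation from noncommutative geometry (which this line of work builds on; the paper's kernel family is more general) compresses a function $x$ on the torus to the $n \times n$ Toeplitz matrix of its Fourier coefficients,

$$ R_n(x)_{jk} = \hat{x}(j-k), \qquad 0 \le j, k \le n-1, $$

so that $R_n(x)R_n(y) \neq R_n(y)R_n(x)$ in general, even though the underlying functions commute pointwise. A kernel built from such products, e.g. the simple choice $k_n(x,y) = R_n(x)^* R_n(y)$, is positive definite in the operator-valued sense, since $\sum_{i,j} c_i^*\, k_n(x_i, x_j)\, c_j = \big\lVert \sum_i R_n(x_i)\, c_i \big\rVert^2 \ge 0$.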
📄 Abstract
$C^*$-algebra-valued kernels could pave the way for the next generation of kernel machines. To further our fundamental understanding of learning with $C^*$-algebraic kernels, we propose a new class of positive definite kernels based on spectral truncation. We focus on kernels whose inputs and outputs are vectors or functions, and we generalize typical kernels by introducing noncommutativity into the products appearing in the kernels. The noncommutativity induces interactions along the domain of the data functions. We show that it is a governing factor behind the performance gains: it allows us to balance representation power against model complexity. We also propose a deep learning perspective to increase the representation capacity of spectral truncation kernels. The flexibility of the proposed class of kernels lets us go beyond previous commutative kernels, addressing two of the foremost issues in learning with vector-valued RKHSs: the choice of the kernel and the computational cost.
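As a minimal NumPy sketch (not the authors' implementation), assuming the Toeplitz form of $R_n$ given above and the illustrative product kernel $k_n(x,y) = R_n(x)^* R_n(y)$, spectral truncation and the noncommutativity it induces can be computed in a few lines:

```python
# Illustrative sketch only: spectral truncation of functions on the torus
# and the noncommutativity it induces. The paper's kernel class is richer;
# here we use the simple positive definite kernel k_n(x, y) = R_n(x)^* R_n(y).
import numpy as np


def fourier_coeffs(x_samples: np.ndarray, order: int) -> np.ndarray:
    """Approximate Fourier coefficients hat{x}(-order..order) from equispaced samples."""
    m = len(x_samples)
    coeffs = np.fft.fft(x_samples) / m
    # np.fft.fft orders frequencies 0, 1, ..., then the negative ones; wrap indices.
    return np.array([coeffs[k % m] for k in range(-order, order + 1)])


def spectral_truncation(x_samples: np.ndarray, n: int) -> np.ndarray:
    """R_n(x): the n x n Toeplitz matrix with entries hat{x}(j - k)."""
    c = fourier_coeffs(x_samples, n - 1)  # hat{x}(-(n-1)), ..., hat{x}(n-1)
    j, k = np.indices((n, n))
    return c[(j - k) + (n - 1)]


def kernel(x_samples: np.ndarray, y_samples: np.ndarray, n: int) -> np.ndarray:
    """Matrix-valued kernel k_n(x, y) = R_n(x)^* R_n(y)."""
    Rx = spectral_truncation(x_samples, n)
    Ry = spectral_truncation(y_samples, n)
    return Rx.conj().T @ Ry


t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
x, y = np.sin(t), np.cos(2 * t) + 0.5 * np.sin(t)
K = kernel(x, y, n=8)  # an 8 x 8 C*-algebra-valued (here: matrix-valued) kernel

# Pointwise products of functions commute, but their truncations do not:
Rx, Ry = spectral_truncation(x, 8), spectral_truncation(y, 8)
print(np.linalg.norm(Rx @ Ry - Ry @ Rx))  # > 0: noncommutative interaction
```

The truncation parameter $n$ plays the balancing role described above: small $n$ gives cheap, low-complexity models, while larger $n$ retains more Fourier modes and hence more representation power.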