Subspace Kernel Learning on Tensor Sequences

📅 2026-03-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of efficiently modeling complex multidimensional interactions in high-order tensor data by proposing an Uncertainty-driven Kernel Tensor Learning (UKTL) framework. UKTL constructs similarity metrics through comparisons of mode-wise tensor subspaces and introduces an uncertainty-aware subspace weighting mechanism to adaptively suppress unreliable mode components. To ensure scalability and enable end-to-end training, the method integrates Nyström kernel approximation with dynamic pivot tensors. Extensive experiments demonstrate that UKTL achieves state-of-the-art performance on the NTU-60, NTU-120, and Kinetics-Skeleton action recognition benchmarks, significantly enhancing model robustness, generalization capability, and mode-level interpretability.

📝 Abstract
Learning from structured multi-way data, represented as higher-order tensors, requires capturing complex interactions across tensor modes while remaining computationally efficient. We introduce Uncertainty-driven Kernel Tensor Learning (UKTL), a novel kernel framework for $M$-mode tensors that compares mode-wise subspaces derived from tensor unfoldings, enabling an expressive and robust similarity measure. To handle large-scale tensor data, we propose a scalable Nyström kernel linearization with dynamically learned pivot tensors obtained via soft $k$-means clustering. A key innovation of UKTL is its uncertainty-aware subspace weighting, which adaptively down-weights unreliable mode components based on estimated confidence, improving robustness and interpretability in comparisons between input and pivot tensors. Our framework is fully end-to-end trainable and naturally incorporates both multi-way and multi-mode interactions through structured kernel compositions. Extensive evaluations on action recognition benchmarks (NTU-60, NTU-120, Kinetics-Skeleton) show that UKTL achieves state-of-the-art performance, superior generalization, and meaningful mode-wise insights. This work establishes a principled, scalable, and interpretable kernel learning paradigm for structured multi-way and multi-modal tensor sequences.
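The abstract's core idea of comparing mode-wise subspaces derived from tensor unfoldings can be sketched in a few lines. This is a minimal illustration, not the paper's method: it assumes the subspaces come from truncated SVDs of the mode unfoldings and uses a simple projection (chordal) similarity averaged over modes; UKTL's actual kernel, uncertainty weighting, and learned components are not reproduced here.

```python
# Hypothetical sketch: mode-wise subspace similarity for M-mode tensors.
# Assumes truncated SVDs of mode unfoldings and an average chordal
# similarity; the exact UKTL kernel form is not specified in this summary.
import numpy as np

def mode_unfold(tensor, mode):
    """Mode-m matricization: rows indexed by mode m, columns by the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def mode_subspace(tensor, mode, rank):
    """Orthonormal basis of the leading left singular subspace of the
    mode-m unfolding, shape (I_m, rank)."""
    U, _, _ = np.linalg.svd(mode_unfold(tensor, mode), full_matrices=False)
    return U[:, :rank]

def subspace_similarity(X, Y, rank=2):
    """Average projection similarity across modes:
    sim_m = ||U_x^T U_y||_F^2 / rank, which lies in [0, 1]."""
    sims = []
    for mode in range(X.ndim):
        Ux = mode_subspace(X, mode, rank)
        Uy = mode_subspace(Y, mode, rank)
        sims.append(np.linalg.norm(Ux.T @ Uy, "fro") ** 2 / rank)
    return float(np.mean(sims))

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 6, 7))
Y = rng.standard_normal((5, 6, 7))
print(subspace_similarity(X, X))   # identical tensors -> similarity ~1.0
print(subspace_similarity(X, Y))   # unrelated tensors -> lower value
```

An uncertainty-aware variant, as described in the abstract, would replace the uniform mean over modes with learned confidence weights that down-weight unreliable mode components.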
Problem

Research questions and friction points this paper is trying to address.

tensor sequences
subspace learning
kernel methods
multi-way data
structured data
Innovation

Methods, ideas, or system contributions that make the work stand out.

subspace kernel learning
tensor sequences
uncertainty-aware weighting
Nyström approximation
end-to-end trainable kernel
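The Nyström approximation and soft k-means pivots listed above can be sketched as follows. This is an illustrative standalone version only: it uses a plain Gaussian kernel on vectorized inputs and fits the pivots once, whereas in UKTL the pivot tensors are learned end-to-end together with the kernel; all function names here are hypothetical.

```python
# Hypothetical sketch: Nyström feature map with pivots from soft k-means.
# A Gaussian kernel on vectors stands in for the paper's tensor kernel.
import numpy as np

def gaussian_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def soft_kmeans_pivots(X, n_pivots=8, beta=5.0, iters=20, seed=0):
    """Soft k-means: responsibilities are a softmax over negative
    squared distances; centroids are responsibility-weighted means."""
    rng = np.random.default_rng(seed)
    Z = X[rng.choice(len(X), n_pivots, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        w = np.exp(-beta * (d2 - d2.min(1, keepdims=True)))
        w /= w.sum(1, keepdims=True)          # soft assignments
        Z = (w.T @ X) / w.sum(0)[:, None]     # weighted centroid update
    return Z

def nystrom_features(X, Z, gamma=0.5, eps=1e-8):
    """phi(x) = k(x, Z) @ K_zz^{-1/2}, so phi(x) @ phi(y)^T ~ k(x, y)."""
    Kzz = gaussian_kernel(Z, Z, gamma)
    vals, vecs = np.linalg.eigh(Kzz + eps * np.eye(len(Z)))
    inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return gaussian_kernel(X, Z, gamma) @ inv_sqrt

rng = np.random.default_rng(1)
X = rng.standard_normal((32, 6))              # e.g. vectorized tensor features
Z = soft_kmeans_pivots(X, n_pivots=8)
Phi = nystrom_features(X, Z)
K_approx = Phi @ Phi.T                        # low-rank kernel approximation
```

Because both the soft assignments and the feature map are differentiable in the pivots, this construction is compatible with the end-to-end training the paper describes.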