Laplace Approximation For Tensor Train Kernel Machines In System Identification

📅 2025-12-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Gaussian process regression (GPR) faces scalability bottlenecks in system identification due to the cubic cost of inverting large kernel matrices. Method: This paper proposes Bayesian Tensor Kernel Machines (BTKM), which integrate tensor-train (TT) decomposition into the GPR framework. It presents the first systematic study of which TT-core to treat in a Bayesian manner and applies a Laplace approximation for posterior inference over a single key core, whose identity is invariant to TT-rank and input dimension, while optimizing the remaining cores and estimating precision hyperparameters via variational inference, removing the need for cross-validation. Results: On inverse dynamics modeling tasks, BTKM trains up to 65× faster, sharply reduces hyperparameter tuning cost, and retains generalization performance comparable to full-kernel GPR.
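The key observation behind the Laplace step can be illustrated with a minimal sketch (not the paper's implementation): with all other TT-cores frozen, the model output is linear in the selected core, so the negative log-posterior is quadratic and the Laplace approximation N(w_map, H⁻¹) is exact for a Gaussian likelihood. The design matrix `A` and the precisions `alpha`, `beta` below are illustrative assumptions.

```python
import numpy as np

# Sketch: with the other TT-cores fixed, y ≈ A w + noise, where w is the
# flattened selected core and A collects the contractions of the frozen cores.
rng = np.random.default_rng(1)
N, P = 50, 5                        # data points, flattened core size
A = rng.standard_normal((N, P))     # stand-in design matrix (assumed)
w_true = rng.standard_normal(P)
beta, alpha = 25.0, 1.0             # noise and prior precisions (assumed known)
y = A @ w_true + rng.normal(scale=beta ** -0.5, size=N)

# Laplace approximation: Gaussian centered at the MAP estimate with
# covariance equal to the inverse Hessian of the negative log-posterior.
H = beta * A.T @ A + alpha * np.eye(P)       # Hessian of -log p(w | y)
w_map = beta * np.linalg.solve(H, A.T @ y)   # posterior mean (MAP)
cov = np.linalg.inv(H)                       # Laplace posterior covariance

# Predictive variance at a new (hypothetical) input row a
a = rng.standard_normal(P)
pred_var = a @ cov @ a + 1.0 / beta
assert pred_var > 1.0 / beta                 # model uncertainty adds to noise
```

Because the problem is linear-Gaussian in the chosen core, this Gaussian is the true conditional posterior; the approximation quality in the full method instead hinges on which core is selected, which the paper studies systematically.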

📝 Abstract
To address the scalability limitations of Gaussian process (GP) regression, several approximation techniques have been proposed. One such method is based on tensor networks, which utilizes an exponential number of basis functions without incurring exponential computational cost. However, extending this model to a fully probabilistic formulation introduces several design challenges. In particular, for tensor train (TT) models, it is unclear which TT-core should be treated in a Bayesian manner. We introduce a Bayesian tensor train kernel machine that applies Laplace approximation to estimate the posterior distribution over a selected TT-core and employs variational inference (VI) for precision hyperparameters. Experiments show that core selection is largely independent of TT-ranks and feature structure, and that VI replaces cross-validation while offering up to 65x faster training. The method's effectiveness is demonstrated on an inverse dynamics problem.
Problem

Research questions and friction points this paper is trying to address.

Scalability limitations of Gaussian process regression in system identification
Design challenges in fully probabilistic tensor train kernel machines
Unclear which tensor train core should receive a Bayesian treatment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Laplace approximation for posterior estimation in tensor train
Variational inference replaces cross-validation for hyperparameters
Bayesian tensor train kernel machine with selected TT-core
Albert Saiapin
Delft Center for Systems and Control, TU Delft, Netherlands
Kim Batselier
Delft University of Technology
Green AI · System identification · Tensors · Nonlinear systems · Machine learning