Dimension-free Regret for Learning Asymmetric Linear Dynamical Systems

📅 2025-02-10
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This paper addresses the challenge of achieving dimension-independent sublinear regret when learning asymmetric, marginally stable linear dynamical systems online, a setting where prior regret bounds either required a symmetric transition matrix or scaled polynomially with the hidden state dimension. The authors propose a novel basis construction that combines spectral filtering with Chebyshev polynomials on the complex plane, eliminating reliance on symmetry of the transition matrix and on statistical or generative assumptions. Theoretically, under the mild condition that the imaginary parts of the system's eigenvalues are at most $1/\mathrm{polylog}(T)$, the algorithm attains a regret bound of $\tilde{O}(T^{9/10})$ against the best linear dynamical predictor in hindsight, removing the dimension dependence of earlier bounds for this asymmetric, marginally stable regime.

๐Ÿ“ Abstract
Previously, methods for learning marginally stable linear dynamical systems either required the transition matrix to be symmetric or incurred regret bounds that scale polynomially with the system's hidden dimension. In this work, we introduce a novel method that overcomes this trade-off, achieving dimension-free regret despite the presence of asymmetric matrices and marginal stability. Our method combines spectral filtering with linear predictors and employs Chebyshev polynomials in the complex plane to construct a novel spectral filtering basis. This construction guarantees sublinear regret in an online learning framework, without relying on any statistical or generative assumptions. Specifically, we prove that as long as the transition matrix has eigenvalues with complex component bounded by $1/\mathrm{poly}\log T$, then our method achieves regret $\tilde{O}(T^{9/10})$ when compared to the best linear dynamical predictor in hindsight.
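For context, classical spectral filtering (for symmetric systems) builds its basis from the top eigenvectors of a fixed Hankel matrix; this paper's contribution is to extend that idea to asymmetric matrices via Chebyshev polynomials. Below is a minimal sketch of the classical construction only, using the standard Hankel matrix with entries $Z_{ij} = 2/((i+j)^3 - (i+j))$; the function name `spectral_filtering_basis` is ours, and this is not the paper's new basis.

```python
import numpy as np

def spectral_filtering_basis(T, k):
    """Top-k eigenvectors of the Hankel matrix Z_T used in classical
    spectral filtering; Z_ij = 2 / ((i+j)^3 - (i+j)) with 1-based
    indices. This is the symmetric-case construction the paper builds on."""
    idx = np.arange(1, T + 1)
    s = idx[:, None] + idx[None, :]          # s = i + j
    Z = 2.0 / (s**3 - s)
    # Z is symmetric positive semidefinite and its eigenvalues decay
    # exponentially, so a small number of filters k suffices.
    eigvals, eigvecs = np.linalg.eigh(Z)
    order = np.argsort(eigvals)[::-1]        # sort descending
    return eigvals[order[:k]], eigvecs[:, order[:k]]
```

In the classical online predictor, past inputs are projected onto these k filters, so the parameter count depends on k and the horizon rather than on the hidden state dimension.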
Problem

Research questions and friction points this paper is trying to address.

Overcoming dimension-dependent regret when learning asymmetric linear systems
Achieving dimension-free regret with asymmetric transition matrices
Ensuring sublinear regret via spectral filtering with Chebyshev polynomials
Innovation

Methods, ideas, or system contributions that make the work stand out.

Chebyshev-polynomial spectral filtering basis
Dimension-free regret bound
Online learning without statistical or generative assumptions
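The key algebraic fact behind using Chebyshev polynomials off the real line is that their three-term recurrence is purely polynomial, so it evaluates unchanged at complex arguments such as eigenvalues with nonzero imaginary part. A minimal sketch (the function name `chebyshev_T` is ours; this illustrates the recurrence, not the paper's full basis construction):

```python
import numpy as np

def chebyshev_T(n, z):
    """Evaluate the degree-n Chebyshev polynomial of the first kind at a
    (possibly complex) point z via the recurrence
        T_0 = 1,  T_1 = z,  T_{k+1} = 2 z T_k - T_{k-1}.
    The recurrence is algebraic, so it extends from [-1, 1] to the
    complex plane, where asymmetric systems have their eigenvalues."""
    t_prev = np.ones_like(z)
    if n == 0:
        return t_prev
    t_curr = z
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2 * z * t_curr - t_prev
    return t_curr
```

For example, T_2(z) = 2z² − 1, so at the purely imaginary point z = 0.1i the recurrence gives T_2(0.1i) = −1.02, a real value even though the argument is complex.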