AI Summary
This paper addresses the challenge of achieving dimension-independent sublinear regret in online control of asymmetric, marginally stable linear dynamical systems, a setting where prior regret bounds scaled polynomially with the hidden state dimension. The authors propose a novel basis construction that combines Chebyshev polynomials on the complex plane with spectral filtering, eliminating reliance on symmetry of the transition matrix or on statistical assumptions. Theoretically, under the mild condition that the imaginary parts of the system's eigenvalues are bounded by $1/\mathrm{polylog}(T)$, the algorithm attains a regret upper bound of $\tilde{O}(T^{9/10})$. This result improves upon prior bounds that scale with the hidden dimension and establishes a new approach to robust online learning in high-dimensional or structurally unknown dynamical systems.
Abstract
Previously, methods for learning marginally stable linear dynamical systems either required the transition matrix to be symmetric or incurred regret bounds that scale polynomially with the system's hidden dimension. In this work, we introduce a novel method that overcomes this trade-off, achieving dimension-free regret despite the presence of asymmetric matrices and marginal stability. Our method combines spectral filtering with linear predictors and employs Chebyshev polynomials in the complex plane to construct a novel spectral filtering basis. This construction guarantees sublinear regret in an online learning framework, without relying on any statistical or generative assumptions. Specifically, we prove that as long as the transition matrix has eigenvalues with complex component bounded by $1/\mathrm{poly}\log T$, then our method achieves regret $\tilde{O}(T^{9/10})$ when compared to the best linear dynamical predictor in hindsight.
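For intuition, the classical spectral filtering basis (which this paper extends with a Chebyshev-polynomial construction in the complex plane, not reproduced here) is built from the top eigenvectors of a fixed Hankel matrix whose entries depend only on the horizon $T$. The sketch below, a minimal illustration assuming the standard Hankel matrix $Z_T$ with entries $Z_{ij} = 2/\big((i+j)^3 - (i+j)\big)$, computes such a basis; the function name `spectral_filters` and the parameter choices are illustrative, not from the paper.

```python
import numpy as np

def spectral_filters(T: int, k: int):
    """Return the top-k eigenpairs of the T x T spectral-filtering
    Hankel matrix (a sketch of the classical construction, not the
    paper's Chebyshev-based basis)."""
    # 1-indexed coordinates: entry (i, j) depends only on i + j.
    idx = np.arange(1, T + 1)
    s = idx[:, None] + idx[None, :]
    Z = 2.0 / (s**3 - s)
    # Z is symmetric positive semidefinite; its eigenvalues decay
    # rapidly, so a small k captures most of the spectrum.
    eigvals, eigvecs = np.linalg.eigh(Z)
    order = np.argsort(eigvals)[::-1]  # sort descending
    return eigvals[order[:k]], eigvecs[:, order[:k]]

# Each column of phi is one filter applied to the past T inputs.
sigma, phi = spectral_filters(T=64, k=8)
```

A predictor then regresses the next observation on the filtered input features rather than on the raw state, which is what removes the explicit dependence on the hidden dimension in the symmetric case; the paper's contribution is a basis with analogous guarantees for asymmetric transition matrices.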