SpectraLDS: Provable Distillation for Linear Dynamical Systems

📅 2025-05-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the provably efficient identification of symmetric linear dynamical systems (LDS), proposing the first end-to-end convex optimization framework whose computational complexity is independent of both state dimension and effective memory length. The method represents a symmetric LDS as a convolution under a fixed spectral transformation and shows that this representation can be inverted exactly, recovering an explicit LDS from its spectral transform. The distilled LDS then runs as a recurrence, achieving constant-time and constant-space inference per token while preserving prediction accuracy. Key contributions include: (1) the first theoretically grounded LDS distillation method with formal sample- and computation-complexity guarantees; (2) an exact, invertible reconstruction of an LDS from its spectral representation; and (3) improved long-sequence inference efficiency, demonstrated in language modeling, while maintaining high accuracy, robustness to distribution shift, and scalability to long sequences.
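The symmetric LDS and its equivalent convolutional view described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the dimensions, random data, and stability bound are assumptions chosen for the example.

```python
# Minimal sketch of a symmetric linear dynamical system (LDS):
#   x_{t+1} = A x_t + B u_t,   y_t = C x_t,  with A symmetric.
# Its output equals a causal convolution of the inputs with the
# impulse response h_k = C A^k B, which is the view the paper builds on.
import numpy as np

rng = np.random.default_rng(0)
d, T = 8, 32  # state dimension and sequence length (illustrative)

# Symmetric A with eigenvalues in [0, 0.99) so the system is stable.
Q = np.linalg.qr(rng.standard_normal((d, d)))[0]
A = Q @ np.diag(rng.uniform(0, 0.99, d)) @ Q.T
B = rng.standard_normal((d, 1))
C = rng.standard_normal((1, d))
u = rng.standard_normal(T)

# Direct recurrent simulation.
x = np.zeros(d)
y_rec = np.empty(T)
for t in range(T):
    y_rec[t] = (C @ x).item()
    x = A @ x + B[:, 0] * u[t]

# Equivalent convolutional view: y_t = sum_{s<t} (C A^{t-1-s} B) u_s.
h = np.array([(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(T)])
y_conv = np.array([sum(h[t - 1 - s] * u[s] for s in range(t)) for t in range(T)])

assert np.allclose(y_rec, y_conv)  # both views produce the same outputs
```

The recurrence costs O(d) per token with O(d) memory; the naive convolution re-touches the whole history, which is the inefficiency the distillation avoids.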

📝 Abstract
We present the first provable method for identifying symmetric linear dynamical systems (LDS) with accuracy guarantees that are independent of the systems' state dimension or effective memory. Our approach builds upon recent work that represents symmetric LDSs as convolutions learnable via fixed spectral transformations. We show how to invert this representation, thereby recovering an LDS model from its spectral transform and yielding an end-to-end convex optimization procedure. This distillation preserves predictive accuracy while enabling constant-time and constant-space inference per token, independent of sequence length. We evaluate our method, SpectraLDS, as a component in sequence prediction architectures and demonstrate that accuracy is preserved while inference efficiency is improved on tasks such as language modeling.
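A hedged sketch of the fixed spectral transformation the abstract refers to: in the spectral filtering line of work it builds on, the filters are the top eigenvectors of a fixed Hankel matrix, and only a linear readout over the spectrally filtered inputs is learned, which is what makes the fitting problem convex. The specific Hankel entries, horizon, and filter count below are assumptions for illustration, not guaranteed to match the paper's choices.

```python
# Sketch of fixed spectral filters from the spectral filtering literature:
# top eigenvectors of a fixed Hankel matrix serve as a convolutional basis,
# so only the linear map over filtered inputs needs to be learned (convex).
import numpy as np

T, k = 64, 8  # horizon and number of filters (illustrative)
i = np.arange(1, T + 1)
S = i[:, None] + i[None, :]
Z = 2.0 / (S**3 - S)  # one commonly used Hankel form (assumed here)

eigvals, eigvecs = np.linalg.eigh(Z)  # ascending eigenvalues
filters = eigvecs[:, -k:]             # top-k spectral filters, shape (T, k)

# Convolving an input sequence with the fixed filters yields the features
# on which a linear (hence convex-to-fit) readout predicts the LDS output.
u = np.random.default_rng(0).standard_normal(T)
features = np.array([np.convolve(u, filters[:, j])[:T] for j in range(k)])
print(features.shape)  # (8, 64): k feature channels per time step
```

SpectraLDS's contribution, per the abstract, is inverting this representation: mapping the learned spectral coefficients back to an explicit LDS so inference can run as a fixed-size recurrence.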
Problem

Research questions and friction points this paper is trying to address.

Identify symmetric linear dynamical systems with dimension-independent accuracy
Recover LDS models from spectral transforms via convex optimization
Enable efficient constant-time inference while preserving predictive accuracy
Innovation

Methods, ideas, or system contributions that make the work stand out.

Convex optimization for LDS distillation
Spectral transforms for model recovery
Constant-time inference per token
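The constant-time inference point above can be illustrated with a toy distilled LDS run as a recurrence: per-token work depends only on the fixed state dimension, never on how many tokens have already been processed. The diagonal parameterization and dimensions are illustrative assumptions.

```python
# Why distilling back to an LDS gives constant time and space per token:
# a recurrence carries a fixed-size state instead of re-convolving over
# the growing history. Dimensions and data are illustrative.
import numpy as np

rng = np.random.default_rng(1)
d = 16
a = rng.uniform(0, 0.99, d)        # diagonal (symmetric) transition
B = rng.standard_normal((d, 1))
C = rng.standard_normal((1, d))

class DistilledLDS:
    """Runs y_t = C x_t, x_{t+1} = a * x_t + B u_t in O(d) per token."""
    def __init__(self, a, B, C):
        self.a, self.B, self.C = a, B, C
        self.x = np.zeros(a.shape[0])  # O(d) state, independent of length

    def step(self, u_t):
        y = (self.C @ self.x).item()
        self.x = self.a * self.x + self.B[:, 0] * u_t
        return y

lds = DistilledLDS(a, B, C)
ys = [lds.step(u) for u in rng.standard_normal(100)]  # 100 O(d) steps
```

Each `step` touches only the length-`d` state, so generating the 1000th token costs the same as generating the first.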
Devan Shah
Computer Science Department, Princeton University
Shlomo Fortgang
Computer Science Department, Princeton University
Sofiia Druchyna
Computer Science Department, Princeton University
Elad Hazan
Professor at Princeton University and Director of Google AI Princeton
Machine Learning · Mathematical Optimization