High Effort, Low Gain: Fundamental Limits of Active Learning for Linear Dynamical Systems

📅 2025-09-15
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses system identification of linear dynamical systems under a finite hypothesis class, with emphasis on how excitation properties of the input signal affect sample complexity. We introduce a novel, hypothesis-class-adapted persistent excitation condition that relaxes conventional global excitation requirements. Building upon this, we design an actively learning algorithm with rigorous theoretical guarantees and derive tight upper and lower bounds on sample complexity—both exhibiting identical dependence on key problem parameters, thereby characterizing the fundamental performance limit for this setting for the first time. Our approach integrates statistical learning theory and system identification, employing a modular analysis framework comprising lower-bound derivation, excitation-condition construction, policy optimization, and probabilistic consistency proof. Extensive simulations demonstrate the algorithm’s superior identification accuracy and sampling efficiency compared to existing methods.

Technology Category

Application Category

📝 Abstract
In this work, we consider the problem of identifying an unknown linear dynamical system given a finite hypothesis class. In particular, we analyze the effect of the excitation input on the sample complexity of identifying the true system with high probability. To this end, we present sample complexity lower bounds that capture the choice of the selected excitation input. The sample complexity lower bound gives rise to a system theoretic condition to determine the potential benefit of experiment design. Informed by the analysis of the sample complexity lower bound, we propose a persistent excitation (PE) condition tailored to the considered setting, which we then use to establish sample complexity upper bounds. Notably, the acs{PE} condition is weaker than in the case of an infinite hypothesis class and allows analyzing different excitation inputs modularly. Crucially, the lower and upper bounds share the same dependency on key problem parameters. Finally, we leverage these insights to propose an active learning algorithm that sequentially excites the system optimally with respect to the current estimate, and provide sample complexity guarantees for the presented algorithm. Concluding simulations showcase the effectiveness of the proposed algorithm.
Problem

Research questions and friction points this paper is trying to address.

Sample complexity limits for identifying linear dynamical systems
Effect of excitation input on system identification accuracy
Active learning algorithm with optimal sequential excitation design
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sample complexity bounds for system identification
Persistent excitation condition for finite hypothesis
Active learning algorithm with optimal excitation
🔎 Similar Papers
No similar papers found.