Learning Linearized Models from Nonlinear Systems under Initialization Constraints with Finite Data

📅 2025-05-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the problem of locally linearizing nonlinear systems when experiments can only be initialized within a constrained initial-state region, moving beyond the conventional assumptions of truly linear dynamics and a single long trajectory. The authors propose a finite-sample identification framework that combines multi-trajectory deterministic sampling with regularized least squares, and establish an explicit error bound quantifying the trade-off between nonlinear approximation error and measurement noise. Theoretically, they show that the linearized model can be learned consistently, with finite-sample guarantees. Numerical experiments demonstrate that classical single-trajectory excitation with i.i.d. random inputs can fail badly in nonlinear settings, whereas the proposed approach remains robust. Key contributions are: (1) a localized linearization modeling paradigm tailored to nonlinear systems; (2) a synergistic design of multi-trajectory deterministic sampling and regularized estimation; and (3) the first finite-sample theoretical analysis jointly accounting for both nonlinear model mismatch and statistical estimation error.

📝 Abstract
The identification of a linear system model from data has wide applications in control theory. The existing work that provides finite sample guarantees for linear system identification typically uses data from a single long system trajectory under i.i.d. random inputs, and assumes that the underlying dynamics is truly linear. In contrast, we consider the problem of identifying a linearized model when the true underlying dynamics is nonlinear, given that there is a certain constraint on the region where one can initialize the experiments. We provide a multiple trajectories-based deterministic data acquisition algorithm followed by a regularized least squares algorithm, and provide a finite sample error bound on the learned linearized dynamics. Our error bound shows that one can consistently learn the linearized dynamics, and demonstrates a trade-off between the error due to nonlinearity and the error due to noise. We validate our results through numerical experiments, where we also show the potential insufficiency of linear system identification using a single trajectory with i.i.d. random inputs, when nonlinearity does exist.
Problem

Research questions and friction points this paper is trying to address.

Identify linearized models from nonlinear systems with initialization constraints
Provide finite sample error bounds for learned linearized dynamics
Analyze trade-off between nonlinearity error and noise error
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multiple trajectories-based deterministic data acquisition
Regularized least squares algorithm for linearized dynamics
Finite sample error bound with nonlinearity-noise trade-off
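The pipeline described above can be illustrated with a minimal numerical sketch: collect many short trajectories initialized inside a constrained region, then fit the linearized dynamics with ridge-regularized least squares. All system matrices, the quadratic nonlinearity, and the parameter values below are illustrative assumptions, not taken from the paper; in particular, the paper uses a deterministic data-acquisition design, whereas this sketch simply samples initial states and inputs at random for brevity.

```python
import numpy as np

# Hypothetical nonlinear system x_{t+1} = f(x_t, u_t) + noise, whose
# linearization about the origin is x_{t+1} ~ A x_t + B u_t.
A_true = np.array([[0.9, 0.2], [0.0, 0.8]])
B_true = np.array([[0.0], [1.0]])

def f(x, u):
    # Linear part plus a small quadratic nonlinearity (illustrative choice).
    return A_true @ x + B_true @ u + 0.05 * np.array([x[0] * x[1], x[0] ** 2])

rng = np.random.default_rng(0)
n, m = 2, 1
T = 3           # short trajectory length
N = 200         # number of trajectories
radius = 0.1    # initialization constraint: start inside a small region

X, Z = [], []   # regression targets and features
for _ in range(N):
    x = radius * rng.uniform(-1, 1, size=n)   # constrained initial state
    for _ in range(T):
        u = 0.05 * rng.uniform(-1, 1, size=m)
        x_next = f(x, u) + 1e-3 * rng.standard_normal(n)
        Z.append(np.concatenate([x, u]))
        X.append(x_next)
        x = x_next

Z = np.array(Z)   # shape (N*T, n+m)
X = np.array(X)   # shape (N*T, n)
lam = 1e-4        # ridge regularization weight

# Regularized least squares: Theta = X^T Z (Z^T Z + lam I)^{-1}
Theta = X.T @ Z @ np.linalg.inv(Z.T @ Z + lam * np.eye(n + m))
A_hat, B_hat = Theta[:, :n], Theta[:, n:]
print("A error:", np.linalg.norm(A_hat - A_true))
print("B error:", np.linalg.norm(B_hat - B_true))
```

Keeping the initial states and inputs small keeps the trajectories inside the region where the linearization is accurate, so the residual error reflects the nonlinearity-versus-noise trade-off the paper's bound formalizes.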
Lei Xin
The Chinese University of Hong Kong
Machine Learning · System Identification · Optimization

B. She
School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, 30318, USA

Qi Dou
Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong

George Chiu
School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907, USA

Shreyas Sundaram
Marie Gordon Professor of Electrical and Computer Engineering, Purdue University
Multi-Agent Systems · Network Science · Control Systems · Resilience and Security · Game Theory