Sensing-Limited Control of Noiseless Linear Systems Under Nonlinear Observations

📅 2026-01-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates the fundamental information-theoretic limits of control and perception in noiseless linear systems under nonlinear observations. By introducing the notion of the average directed information rate, it characterizes the coupling between the information flow from system states to observations and the expansion rate of the unstable dynamics, extending classical data-rate constraints to nonlinear observation models. Under regularity conditions such as log-concavity, and using tools from information theory, the work establishes necessary and sufficient conditions for mean-square observability and stabilizability. It further derives a lower bound on the minimal information flow required for stabilization and shows that divergence of the differential entropy implies convergence of the estimation error. These results reveal the intrinsic limitations the perception layer imposes on control performance, bridging a key theoretical gap between information-theoretic bounds and estimation accuracy.

📝 Abstract
This paper investigates the fundamental information-theoretic limits for the control and sensing of noiseless linear dynamical systems subject to a broad class of nonlinear observations. We analyze the interactions between the control and sensing components by characterizing the minimum information flow required for stability. Specifically, we derive necessary conditions for mean-square observability and stabilizability, demonstrating that the average directed information rate from the state to the observations must exceed the intrinsic expansion rate of the unstable dynamics. Furthermore, to address the challenges posed by non-Gaussian distributions inherent to nonlinear observation channels, we establish sufficient conditions by imposing regularity assumptions, specifically log-concavity, on the system's probabilistic components. We show that under these conditions, the divergence of differential entropy implies the convergence of the estimation error, thereby closing the gap between information-theoretic bounds and estimation performance. By establishing these results, we unveil the fundamental performance limits imposed by the sensing layer, extending classical data-rate constraints to the more challenging regime of nonlinear observation models.
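The quantity this paper's bound generalizes is the classical data-rate threshold: for a noiseless linear system x[k+1] = A x[k], stabilization requires an information rate exceeding the intrinsic expansion rate, the sum of log2|λᵢ| over the unstable eigenvalues of A. As an illustrative sketch (the matrix below is a hypothetical example, not from the paper), this rate can be computed directly:

```python
import numpy as np

# Hypothetical noiseless linear system x[k+1] = A x[k].
# The intrinsic expansion rate (bits per step) is the sum of
# log2|lambda_i| over the unstable eigenvalues of A -- the classical
# data-rate lower bound that the paper extends to nonlinear observations.
A = np.array([[2.0, 1.0],
              [0.0, 0.5]])

eigvals = np.linalg.eigvals(A)
expansion_rate = sum(np.log2(abs(lam)) for lam in eigvals if abs(lam) > 1)
print(expansion_rate)  # single unstable eigenvalue at 2 -> 1.0 bit/step
```

In the paper's setting, the average directed information rate from state to observations plays the role of the channel rate in this comparison: mean-square observability and stabilizability require it to exceed the expansion rate computed above.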
Problem

Research questions and friction points this paper is trying to address.

nonlinear observations
information-theoretic limits
mean-square observability
stabilizability
sensing-limited control
Innovation

Methods, ideas, or system contributions that make the work stand out.

nonlinear observations
information-theoretic limits
directed information
log-concavity
mean-square stabilizability