🤖 AI Summary
Finite abstractions of dynamical systems suffer from the curse of dimensionality: for a fixed accuracy, abstraction size typically grows exponentially with the system dimension. Yet the fundamental trade-off between accuracy (distortion) and complexity (rate) has long lacked rigorous theoretical characterization.
Method: This paper pioneers the application of rate-distortion theory to finite abstraction of dynamical systems, formalizing abstractions as encoder-decoder pairs—where “rate” quantifies abstraction size (e.g., number of partitions or states) and “distortion” measures approximation error (e.g., trajectory deviation).
Contribution/Results: We establish a generalized entropy-based complexity framework, derive principles for optimal abstraction construction, and obtain a tight lower bound on minimum achievable distortion under a given rate constraint. Theoretical analysis uncovers intrinsic scalability limits of finite abstractions; experiments on chaotic systems confirm both the tightness of the bound and the efficacy of our constructive method.
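To see why scalability is the central concern, consider the simplest abstraction, a uniform grid over the unit cube. The following back-of-envelope sketch (an illustration, not the paper's construction; `grid_rate` and the sup-norm accuracy target `eps` are our own choices) shows how the rate needed for a fixed accuracy grows linearly with dimension, i.e., the number of abstract states grows exponentially:

```python
import math

def grid_rate(d, eps):
    """Bits used by a uniform grid abstraction of the unit cube [0, 1]^d
    whose cell centers stay within eps (sup-norm) of every state.
    N = ceil(1 / (2 * eps)) cells per axis, so rate = d * log2(N)."""
    n_per_axis = math.ceil(1.0 / (2.0 * eps))
    return d * math.log2(n_per_axis)

# Rate grows linearly in d, so the number of states 2^rate grows
# exponentially: the curse of dimensionality for grid abstractions.
for d in (1, 2, 4, 8):
    bits = grid_rate(d, eps=0.01)
    print(f"d = {d}: rate = {bits:.1f} bits, states = {2**bits:.3g}")
```

The paper's contribution is to replace such worst-case grid counting with an information-theoretic lower bound that depends on the dynamics themselves, through generalized entropy.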
📝 Abstract
Finite abstractions are discrete approximations of dynamical systems, such that the set of abstraction trajectories contains, in a formal sense, all system trajectories. There is a consensus that abstractions suffer from the curse of dimensionality: for the same "accuracy" (how closely the abstraction represents the system), the abstraction size scales poorly with the system dimension. Yet, after decades of research on abstractions, there are no formal results concerning their accuracy-size tradeoff. In this work, we derive a statistical, quantitative theory of abstractions' accuracy-size tradeoff and uncover fundamental limits on their scalability, through rate-distortion theory -- the branch of information theory studying lossy compression. Abstractions are viewed as encoder-decoder pairs, encoding trajectories of dynamical systems in a higher-dimensional ambient space. Rate represents abstraction size, while distortion describes abstraction accuracy, defined as the average spatial deviation between abstract trajectories and system trajectories. We obtain a fundamental lower bound on the minimum abstraction distortion, given the system dynamics and a threshold on abstraction size. The bound depends on the complexity of the dynamics, through generalized entropy. We demonstrate the bound's tightness on certain dynamical systems. Finally, we showcase how the developed theory can be employed to construct optimal abstractions, in terms of the size-accuracy tradeoff, through an example on a chaotic system.
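The encoder-decoder view of an abstraction can be made concrete with a minimal sketch (our own illustration, not the paper's construction): a uniform-partition abstraction of the chaotic logistic map, where the encoder maps each state to its cell index, the decoder returns the cell center, rate is the number of bits indexing the cells, and distortion is the average deviation along a trajectory.

```python
import math

def logistic(x, r=4.0):
    """Logistic map x -> r*x*(1-x); fully chaotic on [0, 1] for r = 4."""
    return r * x * (1.0 - x)

def abstraction_distortion(n_cells, n_steps=10_000, x0=0.123):
    """Uniform-grid abstraction of the logistic map on [0, 1].
    Encoder: state -> cell index; decoder: cell index -> cell center.
    Returns the trajectory-averaged deviation |x_t - x_hat_t|."""
    x, total = x0, 0.0
    for _ in range(n_steps):
        cell = min(int(x * n_cells), n_cells - 1)  # encode
        x_hat = (cell + 0.5) / n_cells             # decode
        total += abs(x - x_hat)
        x = logistic(x)
    return total / n_steps

for n_cells in (4, 16, 64, 256):
    rate = math.log2(n_cells)  # bits per state, i.e., abstraction size
    print(f"rate = {rate:4.1f} bits, distortion = {abstraction_distortion(n_cells):.5f}")
```

Sweeping the number of cells traces out an empirical rate-distortion curve for this abstraction family; the paper's lower bound characterizes how far below such a curve any abstraction of the same dynamics could possibly go.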