An Information Theory of Finite Abstractions and their Fundamental Scalability Limits

📅 2025-12-03
🤖 AI Summary
Finite abstractions of dynamical systems suffer from the curse of dimensionality: increased accuracy typically incurs exponential growth in abstraction size with respect to system dimension, yet the fundamental trade-off between accuracy (distortion) and complexity (rate) has long lacked rigorous theoretical characterization. Method: This paper pioneers the application of rate-distortion theory to finite abstraction of dynamical systems, formalizing abstractions as encoder-decoder pairs—where “rate” quantifies abstraction size (e.g., number of partitions or states) and “distortion” measures approximation error (e.g., trajectory deviation). Contribution/Results: We establish a generalized entropy-based complexity framework, derive principles for optimal abstraction construction, and obtain a tight lower bound on minimum achievable distortion under a given rate constraint. Theoretical analysis uncovers intrinsic scalability limits of finite abstractions; experiments on chaotic systems confirm both the tightness of the bound and the efficacy of our constructive method.

📝 Abstract
Finite abstractions are discrete approximations of dynamical systems, such that the set of abstraction trajectories contains, in a formal sense, all system trajectories. There is a consensus that abstractions suffer from the curse of dimensionality: for the same "accuracy" (how closely the abstraction represents the system), the abstraction size scales poorly with the system dimension. And yet, after decades of research on abstractions, there are no formal results concerning their accuracy-size tradeoff. In this work, we derive a statistical, quantitative theory of abstractions' accuracy-size tradeoff and uncover fundamental limits on their scalability, through rate-distortion theory -- the branch of information theory studying lossy compression. Abstractions are viewed as encoder-decoder pairs, encoding trajectories of dynamical systems in a higher-dimensional ambient space. Rate represents abstraction size, while distortion describes abstraction accuracy, defined as the spatial average deviation between abstract trajectories and system trajectories. We obtain a fundamental lower bound on the minimum abstraction distortion, given the system dynamics and a threshold on abstraction size. The bound depends on the complexity of the dynamics through generalized entropy. We demonstrate the bound's tightness on certain dynamical systems. Finally, we showcase how the developed theory can be employed to construct optimal abstractions, in terms of the size-accuracy tradeoff, through an example on a chaotic system.
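The encoder-decoder view of the abstract can be illustrated with a minimal sketch (not the paper's actual construction): a uniform grid quantizer plays the role of the encoder-decoder pair for the chaotic logistic map, with rate taken as bits per sample and distortion as the average deviation between the reconstructed and true trajectory. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch, not the paper's construction: a uniform grid
# quantizer as an encoder-decoder pair for trajectories of the chaotic
# logistic map x_{k+1} = 4 x_k (1 - x_k) on [0, 1].
n_cells = 64                  # abstraction size (number of partitions)
rate = np.log2(n_cells)       # "rate": bits needed per sample

def encoder(x):
    # map each state to the index of its grid cell
    return np.minimum((x * n_cells).astype(int), n_cells - 1)

def decoder(idx):
    # reconstruct the center of the grid cell
    return (idx + 0.5) / n_cells

# simulate a system trajectory
x, traj = 0.123, []
for _ in range(1000):
    traj.append(x)
    x = 4.0 * x * (1.0 - x)
traj = np.array(traj)

# distortion: spatial average deviation between the abstract
# (encoded-then-decoded) trajectory and the true one
abstract_traj = decoder(encoder(traj))
distortion = np.mean(np.abs(traj - abstract_traj))
print(rate, distortion)
```

By construction every reconstructed point lies within half a cell width of the true state, so the distortion is at most 1/(2·n_cells); the paper's contribution is a lower bound on this quantity in terms of the dynamics' generalized entropy.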
Problem

Research questions and friction points this paper is trying to address.

How does abstraction accuracy trade off against abstraction size, and can this tradeoff be characterized formally?
What fundamental scalability limits does rate-distortion theory impose on finite abstractions?
Can a lower bound on abstraction distortion be derived from the complexity of the system dynamics?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Rate-distortion theory quantifies abstraction accuracy-size tradeoff
Generalized entropy captures dynamics complexity for scalability limits
Encoder-decoder pairs model abstractions as lossy compression of trajectories
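The accuracy-size tradeoff these bullets refer to can be traced empirically with a hypothetical sweep (illustrative only, not the paper's method): quantize one logistic-map trajectory at increasing grid resolutions and record the resulting (rate, distortion) pairs.

```python
import numpy as np

# Hypothetical rate-distortion sweep for grid abstractions of the
# logistic map; every name and parameter here is an assumption.
def simulate(n=2000, x0=0.123):
    x, out = x0, []
    for _ in range(n):
        out.append(x)
        x = 4.0 * x * (1.0 - x)
    return np.array(out)

traj = simulate()
pairs = []
for bits in range(2, 10):          # rate in bits per sample
    cells = 2 ** bits
    idx = np.minimum((traj * cells).astype(int), cells - 1)
    recon = (idx + 0.5) / cells    # decode to cell centers
    pairs.append((bits, np.mean(np.abs(traj - recon))))

for rate, dist in pairs:
    print(f"rate={rate} bits  distortion={dist:.5f}")
```

For this one-dimensional example the distortion shrinks roughly as 2^(-rate); in higher dimensions the number of cells needed for a given distortion grows exponentially with dimension, which is the curse of dimensionality the paper quantifies.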
Giannis Delimpaltadakis
Postdoctoral Researcher, Eindhoven University of Technology
Control Theory · Formal Methods · Optimization · Hybrid Systems · Stochastic Systems
G. Gleizer
Delft Center for Systems and Control, Mechanical Engineering, Delft University of Technology