Metric-Entropy Limits on Nonlinear Dynamical System Learning

📅 2024-07-01
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the fundamental limits of approximating the sequence-to-sequence maps realized by nonlinear dynamical systems that are Lipschitz continuous and forget past inputs quickly, with exponentially or polynomially decaying memory. The resulting input–output function classes vastly exceed those covered by classical approximation theory. Method: To characterize this broad function class, the paper introduces a metric-entropy framework based on order, type, and generalized dimension, combining asymptotic analysis in function space with recurrent neural network (RNN) approximation theory. Contribution/Results: It establishes the first metric-entropy-optimal approximation bound for RNNs on such systems, showing that RNNs achieve information-theoretically optimal learning for both exponential and polynomial memory-decay systems, and thereby go beyond the expressivity results of classical deep-network approximation theory. This provides a new theoretical lens on the fundamental limits of learning nonlinear dynamical systems.

📝 Abstract
This paper is concerned with the fundamental limits of nonlinear dynamical system learning from input-output traces. Specifically, we show that recurrent neural networks (RNNs) are capable of learning nonlinear systems that satisfy a Lipschitz property and forget past inputs fast enough in a metric-entropy optimal manner. As the sets of sequence-to-sequence maps realized by the dynamical systems we consider are significantly more massive than function classes generally considered in deep neural network approximation theory, a refined metric-entropy characterization is needed, namely in terms of order, type, and generalized dimension. We compute these quantities for the classes of exponentially-decaying and polynomially-decaying Lipschitz fading-memory systems and show that RNNs can achieve them.
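As background for the abstract's terminology, the standard notions involved can be sketched as follows (these are textbook definitions, not quoted from the paper): the metric entropy of a compact function class $\mathcal{K}$ under a metric $d$ is the logarithm of its covering number.

```latex
% Covering number: minimal number of d-balls of radius \varepsilon covering \mathcal{K}
N(\varepsilon;\mathcal{K},d) = \min\Bigl\{ n : \exists\, f_1,\dots,f_n
  \text{ with } \mathcal{K}\subseteq \textstyle\bigcup_{i=1}^{n} B_d(f_i,\varepsilon) \Bigr\},
\qquad
H(\varepsilon;\mathcal{K},d) = \log_2 N(\varepsilon;\mathcal{K},d).

% Fading memory (informal): the output at time t depends on the input at time t-k
% through a weight w_k with w_k \le C\rho^{k} (exponential decay, 0<\rho<1)
% or w_k \le C k^{-a} (polynomial decay, a>0).
```

Order, type, and generalized dimension then describe the asymptotic growth of $H(\varepsilon)$ as $\varepsilon \to 0$, by analogy with the order and type of entire functions; this finer scale is what distinguishes the massive system classes considered here from the classes usual in deep-network approximation theory.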
Problem

Research questions and friction points this paper is trying to address.

What are the fundamental limits on approximating nonlinear dynamical systems from input–output traces?
Can RNNs approximate Lipschitz systems with fading memory in a metric-entropy-optimal manner?
How should metric entropy be characterized for such massive classes of sequence-to-sequence maps?
Innovation

Methods, ideas, or system contributions that make the work stand out.

RNNs approximate Lipschitz nonlinear fading-memory systems in a metric-entropy-optimal manner
Refined metric-entropy characterization in terms of order, type, and generalized dimension
Metric entropies computed for exponentially- and polynomially-decaying classes are shown to be achieved by RNNs
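To make the connection between RNN state recurrences and fading memory concrete, here is a minimal numerical sketch (not taken from the paper; the linear one-unit system and the kernel weight `rho**k` are illustrative assumptions): a single linear RNN state exactly realizes an exponentially fading-memory map, since unrolling the recurrence gives the convolution with weights `rho**k`.

```python
import numpy as np

def rnn_state(x, rho):
    """One-unit linear RNN: h_t = rho * h_{t-1} + x_t.
    Unrolling gives h_t = sum_k rho**k * x_{t-k}, i.e. the influence
    of the input k steps back decays exponentially like rho**k."""
    h = 0.0
    out = []
    for xt in x:
        h = rho * h + xt
        out.append(h)
    return np.array(out)

def direct_convolution(x, rho):
    """Direct evaluation of y_t = sum_{k=0}^{t} rho**k * x_{t-k}."""
    T = len(x)
    return np.array(
        [sum(rho**k * x[t - k] for k in range(t + 1)) for t in range(T)]
    )

rng = np.random.default_rng(0)
x = rng.standard_normal(50)
# The recurrent computation matches the explicit fading-memory sum.
assert np.allclose(rnn_state(x, 0.8), direct_convolution(x, 0.8))
```

The recurrence needs only O(1) state per step, while the explicit sum grows with t; this compression of decaying memory into a bounded state is the intuition behind RNNs matching the metric-entropy limits for fading-memory classes.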