Signature Reconstruction from Randomized Signatures

📅 2025-02-05
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work investigates the feasibility and fundamental limits of reconstructing path signatures from the nonlinear flows of controlled ordinary differential equations (CDEs) with random vector fields. First, it shows that with depth-two random neural vector fields, the number of signature features reconstructible from a $d$-dimensional hidden state grows exponentially ($\sim \exp(d)$). Second, it derives a general linear independence condition on arbitrary vector fields under which the signature features up to a fixed order can always be reconstructed exactly. Together, these results connect signature theory, controlled differential equations, and random feature methods, complementing classical Lie-algebraic results on vector fields with quantitative reconstruction guarantees in a machine learning context.

📝 Abstract
Controlled ordinary differential equations driven by continuous bounded variation curves can be considered a continuous-time analogue of recurrent neural networks for the construction of expressive features of the input curves. We ask to what extent well-known signature features of such curves can be reconstructed from controlled ordinary differential equations with (untrained) random vector fields. The answer turns out to be algebraically involved, but essentially the number of signature features which can be reconstructed from the non-linear flow of the controlled ordinary differential equation is exponential in its hidden dimension, when the vector fields are chosen to be neural with depth two. Moreover, we characterize a general linear independence condition on arbitrary vector fields under which the signature features up to some fixed order can always be reconstructed. Algebraically speaking, this complements in a quantitative manner several well-known results from the theory of Lie algebras of vector fields and puts them in a context of machine learning.
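As a concrete reference point for the features being reconstructed, the sketch below computes the exact level-one and level-two signature of a piecewise-linear path from its increments. The helper name and the example path are illustrative, not taken from the paper:

```python
import numpy as np

def truncated_signature(dx):
    """Exact level-1 and level-2 signature of a piecewise-linear path,
    given its increments dx of shape (K, d) (K segments in R^d)."""
    S1 = dx.sum(axis=0)                  # level 1: total increment
    start = np.cumsum(dx, axis=0) - dx   # path value before each segment
    # Level 2: iterated integral over s < t, split into cross terms
    # between segments plus the within-segment contribution.
    S2 = start.T @ dx + 0.5 * np.einsum("ki,kj->ij", dx, dx)
    return S1, S2

# L-shaped path (0,0) -> (1,0) -> (1,1)
S1, S2 = truncated_signature(np.array([[1.0, 0.0], [0.0, 1.0]]))
print(S1)  # [1. 1.]
print(S2)  # [[0.5 1. ]
           #  [0.  0.5]]
# Sanity check via the shuffle identity: S2 + S2^T = S1 S1^T.
assert np.allclose(S2 + S2.T, np.outer(S1, S1))
```

The shuffle identity in the last line is a standard algebraic constraint on signatures and gives a quick correctness check for the level-two computation.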
Problem

Research questions and friction points this paper is trying to address.

Reconstruct signature features from controlled ODE flows with random vector fields.
Show that the number of reconstructible signature features is exponential in the hidden dimension.
Characterize a linear independence condition under which signature reconstruction is exact.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Random vector fields
Controlled differential equations
Exponential feature reconstruction
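The reconstruction idea can be sketched numerically: drive a controlled ODE with untrained random vector fields, then try to read a signature feature back out of the terminal hidden state with a linear map. This is an illustrative toy, not the paper's construction; it uses depth-one tanh fields (the paper's expressivity result concerns depth two), and all dimensions, scales, and the Brownian-like driving paths are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
d, N, K, n_paths = 2, 80, 50, 400   # path dim, hidden dim, steps, samples

# One random vector field per input channel: V_i(z) = tanh(A_i z + b_i).
A = rng.normal(size=(d, N, N)) / np.sqrt(N)
b = rng.normal(size=(d, N))

# Brownian-like driving paths, stored as increments.
dX = rng.normal(scale=1.0 / np.sqrt(K), size=(n_paths, K, d))

# Euler scheme for the controlled ODE dz = sum_i V_i(z) dx^i, z_0 = 0,
# run for all sample paths at once.
Z = np.zeros((n_paths, N))
for k in range(K):
    for i in range(d):
        Z = Z + np.tanh(Z @ A[i].T + b[i]) * dX[:, k, [i]]

# Linear readout: regress a level-1 signature feature (a total
# increment of the path) on the terminal hidden state.
inc = dX.sum(axis=1)
F = np.hstack([Z, np.ones((n_paths, 1))])
coef, *_ = np.linalg.lstsq(F, inc[:, 0], rcond=None)
resid = inc[:, 0] - F @ coef
r2 = 1.0 - resid.var() / inc[:, 0].var()
print(f"in-sample R^2 for a level-1 signature feature: {r2:.3f}")
```

A high in-sample fit here only illustrates that low-order signature features are approximately linear in the random flow's terminal state; the paper's contribution is the exact algebraic characterization of when and how many such features are reconstructible.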
Mie Gluckstad
Mathematical Institute, University of Oxford
Nicola Muca Cirone
Department of Mathematics, Imperial College London
Josef Teichmann
ETH Zurich
Mathematical Finance · Machine Learning in Finance · Rough Analysis