Random Controlled Differential Equations

📅 2025-12-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of simultaneously achieving strong inductive biases, training efficiency, and scalability in time-series modeling. We propose an efficient continuous-time reservoir framework that integrates random features with controlled differential equations (CDEs), mapping input paths to implicit continuous representations and requiring only linear readout-layer training. We introduce two novel architectures, Random Fourier CDE and Random Rough DE, and theoretically establish that their infinite-width limits converge to the RBF-lifted signature kernel and the rough signature kernel, respectively, thereby unifying random reservoir computing, continuous-depth models, and path signature theory for the first time. Leveraging random Fourier features, log-ODE discretization, and log-signature representations, our method achieves state-of-the-art or competitive performance across multiple time-series benchmarks, significantly outperforming explicit signature computation methods while preserving strong path-aware inductive biases, high training efficiency, and excellent scalability.
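The RBF lift mentioned above can be illustrated with classical random Fourier features (the Rahimi-Recht construction). The sketch below is a minimal NumPy illustration of that general technique, not the paper's code; the function name and parameters are hypothetical.

```python
import numpy as np

def random_fourier_features(X, n_features=2048, gamma=1.0, seed=0):
    """Map rows of X so that dot products of the mapped rows
    approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the Gaussian spectral density of the RBF kernel.
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
```

Nothing here is trained; the random projection stays fixed, which matches the reservoir philosophy of learning only a linear readout on top.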

📝 Abstract
We introduce a training-efficient framework for time-series learning that combines random features with controlled differential equations (CDEs). In this approach, large randomly parameterized CDEs act as continuous-time reservoirs, mapping input paths to rich representations. Only a linear readout layer is trained, resulting in fast, scalable models with strong inductive bias. Building on this foundation, we propose two variants: (i) Random Fourier CDEs (RF-CDEs): these lift the input signal using random Fourier features prior to the dynamics, providing a kernel-free approximation of RBF-enhanced sequence models; (ii) Random Rough DEs (R-RDEs): these operate directly on rough-path inputs via a log-ODE discretization, using log-signatures to capture higher-order temporal interactions while remaining stable and efficient. We prove that in the infinite-width limit, these models induce the RBF-lifted signature kernel and the rough signature kernel, respectively, offering a unified perspective on random-feature reservoirs, continuous-time deep architectures, and path-signature theory. We evaluate both models across a range of time-series benchmarks, demonstrating competitive or state-of-the-art performance. These methods provide a practical alternative to explicit signature computations, retaining their inductive bias while benefiting from the efficiency of random features.
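The reservoir idea in the abstract, a randomly parameterized CDE driven by the input path, with only a linear readout trained, can be sketched with a simple Euler discretization. This is a hypothetical minimal sketch under assumed shapes and a tanh vector field, not the paper's RF-CDE or R-RDE implementation; all names are illustrative.

```python
import numpy as np

def random_cde_features(path, hidden=64, seed=0):
    """Drive a randomly parameterized CDE  dh = f(h) dx  along a path of
    shape (T, d) via Euler steps; all CDE parameters stay fixed (random)."""
    rng = np.random.default_rng(seed)
    T, d = path.shape
    # One random hidden-to-hidden map per input channel.
    A = rng.normal(0.0, 1.0 / np.sqrt(hidden), size=(d, hidden, hidden))
    h = np.zeros(hidden)
    h[0] = 1.0  # fixed, non-degenerate initial state
    for t in range(1, T):
        dx = path[t] - path[t - 1]
        F = np.tanh(np.einsum("cij,j->ci", A, h))  # vector field, (d, hidden)
        h = h + np.einsum("ci,c->i", F, dx)        # contract with increment
    return h

def fit_linear_readout(H, y, lam=1e-3):
    """Ridge regression on reservoir features: the only trained component."""
    n = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)
```

Training cost is a single linear solve over the feature matrix, which is what makes the approach fast and scalable relative to backpropagating through the dynamics.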
Problem

Research questions and friction points this paper is trying to address.

How can time-series models achieve strong inductive biases, training efficiency, and scalability simultaneously?
How can the cost of explicit signature computation be avoided while preserving path-aware structure, for both smooth and rough inputs?
Can random-feature reservoirs, continuous-time deep architectures, and path-signature theory be connected in a single framework with competitive performance?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Random features combined with controlled differential equations acting as continuous-time reservoirs
Only a linear readout layer is trained, yielding fast, scalable models
Two variants: RF-CDEs with random Fourier features and R-RDEs operating on rough-path inputs via a log-ODE discretization
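The log-signature objects that R-RDEs consume can be illustrated at depth 2, where a path is summarized by its total increment plus the antisymmetric Levy-area matrix. Below is a hand-rolled NumPy sketch of that depth-2 truncation (illustrative only; production signature code typically uses dedicated libraries such as iisignature or signatory, and the log-ODE method feeds such summaries over subintervals into an ODE solver).

```python
import numpy as np

def depth2_logsig(path):
    """Depth-2 log-signature of a discrete path of shape (T, d):
    the total increment and the antisymmetric Levy-area matrix
    area[i, j] ~ 0.5 * integral of (x_i dx_j - x_j dx_i)."""
    x = path - path[0]          # translate so the path starts at the origin
    dx = np.diff(path, axis=0)  # increments, shape (T-1, d)
    inc = x[-1]                 # level-1 term: total increment
    # Left-point Riemann sum, then antisymmetrize.
    M = x[:-1].T @ dx
    area = 0.5 * (M - M.T)
    return inc, area
```

For a straight-line path the area vanishes, while for a counterclockwise loop the (0, 1) entry recovers the signed enclosed area; these second-order terms are what let R-RDEs capture higher-order temporal interactions beyond increments.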
Francesco Piatti
Imperial College London
Thomas Cass
Professor of Mathematics, Imperial College London
Probability Theory and Stochastic Analysis
William F. Turner
Imperial College London