Information-theoretic limits and approximate message-passing for high-dimensional time series

📅 2025-01-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the challenge of modeling high-dimensional time series with non-sparse structure, where the number of features scales proportionally with the sample size and the data contain substantial redundancy. Unlike conventional approaches that rely on sparsity assumptions, the analysis operates in a genuinely non-sparse high-dimensional regime. Method: the authors combine a stochastic regression model, information-theoretic analysis, and the Vector Approximate Message Passing (VAMP) algorithm. Contribution/Results: they establish the information-theoretic limit in this non-sparse setting, deriving a tight single-letter characterization of the normalized mutual information between observations and latent signals. Empirically, despite a lack of prior theoretical guarantees, VAMP attains the minimum mean-square error (MMSE) limit across diverse configurations, demonstrating robust performance and near-optimal convergence to the theoretical bound. This yields an approach to high-dimensional time-series recovery that is grounded in rigorous information theory and consistent with practical implementation.

📝 Abstract
High-dimensional time series appear in many scientific setups, demanding a nuanced approach to model and analyze the underlying dependence structure. However, theoretical advancements so far often rely on stringent assumptions regarding the sparsity of the underlying signals. In this contribution, we expand the scope by investigating a high-dimensional time series model wherein the number of features grows proportionally to the number of sampling points, without assuming sparsity in the signal. Specifically, we consider the stochastic regression model and derive a single-letter formula for the normalized mutual information between observations and the signal. We also empirically study the vector approximate message passing (VAMP) algorithm and show that, despite a lack of theoretical guarantees, its performance for inference in our time series model is robust and often statistically optimal.
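The abstract describes applying VAMP to a stochastic regression model without sparsity assumptions. As a rough illustration only (not the paper's exact time-series setup), below is a minimal VAMP sketch for the generic linear model y = Ax + w with a standard Gaussian prior on x; the function name `vamp_gaussian` and all parameter choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def vamp_gaussian(A, y, sigma2=0.1, iters=10):
    """Minimal VAMP sketch for y = A x + w, w ~ N(0, sigma2 I),
    with an assumed standard Gaussian prior x ~ N(0, I)."""
    M, N = A.shape
    gw = 1.0 / sigma2            # noise precision
    r1, g1 = np.zeros(N), 1e-3   # extrinsic mean/precision into the denoiser
    x2 = np.zeros(N)
    for _ in range(iters):
        # Denoising step: posterior mean under prior N(0, 1),
        # given the pseudo-observation r1 = x + N(0, 1/g1)
        x1 = g1 * r1 / (g1 + 1.0)
        a1 = g1 / (g1 + 1.0)     # average divergence of the denoiser
        eta1 = g1 / a1
        g2 = eta1 - g1
        r2 = (eta1 * x1 - g1 * r1) / g2
        # LMMSE step: Gaussian likelihood combined with prior N(r2, I/g2)
        C = np.linalg.inv(gw * A.T @ A + g2 * np.eye(N))
        x2 = C @ (gw * A.T @ y + g2 * r2)
        a2 = g2 * np.trace(C) / N
        eta2 = g2 / a2
        g1 = eta2 - g2
        r1 = (eta2 * x2 - g2 * r2) / g1
    return x2
```

With a fully Gaussian prior the fixed point coincides with the ridge-type MMSE estimator; the interest of VAMP, as studied in the paper, lies in handling more general priors and structured (here, time-series) models while still tracking the MMSE limit.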
Problem

Research questions and friction points this paper is trying to address.

High-dimensional Time Series Analysis
Complex Information Extraction
Noise Reduction
Innovation

Methods, ideas, or system contributions that make the work stand out.

High-dimensional Time Series
VAMP Algorithm
Information-Theoretic Limit
D. Tieplova
The Abdus Salam International Center for Theoretical Physics, Trieste, Italy
Samriddha Lahiry
Department of Statistics and Data Science, National University of Singapore, Singapore
Jean Barbier
Associate Professor, International Center for Theoretical Physics
high-dimensional statistics, machine learning, information theory, spin glasses, random matrices