Central limit theorems for the outputs of fully convolutional neural networks with time series input

📅 2026-03-31
🤖 AI Summary
This study addresses the lack of asymptotic analysis for the output behavior of fully convolutional networks (FCNs) when applied to time series inputs. Focusing on time series generated by short-memory linear processes, this work establishes the first theoretical result demonstrating that the output of an FCN equipped with global average pooling (GAP) converges in distribution to a Gaussian as the sequence length tends to infinity, thereby providing a rigorous foundation for its asymptotic normality. Building upon this insight, the authors propose a learnable global weighted pooling layer whose weights exhibit slow variation over time, effectively enhancing the representational capacity beyond conventional GAP. This advancement offers both theoretical grounding and architectural improvement for deep learning models in time series analysis.
📝 Abstract
Deep learning is widely deployed for time series learning tasks such as classification and forecasting. Despite these empirical successes, little theory has been developed so far in the time series context. In this work, we prove that if the network inputs are generated from short-range dependent linear processes, the outputs of fully convolutional neural networks (FCNs) with global average pooling (GAP) are asymptotically Gaussian as the length of the observed time series tends to infinity. The proof leverages existing tools from the theoretical time series literature. Based on our theory, we propose a generalization of the GAP layer: a global weighted pooling step with slowly varying, learnable coefficients.
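As a concrete illustration (not the authors' implementation), GAP collapses the time axis of the final feature map by a uniform mean, while the proposed global weighted pooling replaces the uniform 1/n factors with learnable, slowly varying coefficients. A minimal numpy sketch, where the sum-to-one weight normalization is an assumption made here for illustration:

```python
import numpy as np

def global_average_pooling(feature_map):
    """GAP: uniform mean over the time axis (shape: channels x length)."""
    return feature_map.mean(axis=-1)

def global_weighted_pooling(feature_map, weights):
    """Weighted pooling: learnable per-timestep weights replace the
    uniform 1/n factors; `weights` has shape (length,) and is
    normalized to sum to one (an illustrative choice)."""
    w = weights / weights.sum()
    return feature_map @ w

# toy feature map: 2 channels, length-5 time axis
x = np.array([[1., 2., 3., 4., 5.],
              [5., 4., 3., 2., 1.]])

# uniform weights recover GAP exactly
print(global_average_pooling(x))                  # [3. 3.]
print(global_weighted_pooling(x, np.ones(5)))     # [3. 3.]

# slowly varying weights (a gentle linear ramp) tilt the average,
# giving the layer strictly more representational capacity than GAP
print(global_weighted_pooling(x, np.linspace(1.0, 1.2, 5)))
```

With constant weights the layer reduces to GAP, so the CLT setting is contained as a special case; slow variation of the weights is what keeps the pooled output close to the averaging regime the theory analyzes.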
🏷️ Keywords

central limit theorem
fully convolutional neural networks
time series
global average pooling
asymptotic Gaussianity
👥 Authors

Annika Betken
University of Twente, Faculty of Electrical Engineering, Mathematics, and Computer Science (EEMCS), Drienerlolaan 5, 7522 NB Enschede, Netherlands

Giorgio Micali
University of Twente, Faculty of Electrical Engineering, Mathematics, and Computer Science (EEMCS), Drienerlolaan 5, 7522 NB Enschede, Netherlands

Johannes Schmidt-Hieber
University of Twente