Reconstruction of frequency-localized functions from pointwise samples via least squares and deep learning

📅 2025-02-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the fundamental signal processing problem of reconstructing frequency-localized (e.g., bandlimited) functions from discrete samples. From an approximation-theoretic perspective, it unifies the analysis of two reconstruction paradigms: least-squares estimation using the Slepian orthogonal basis and deep learning via ReLU neural networks. Theoretical contributions include: (i) the first explicit theorem characterizing the bandwidth–sampling complexity trade-off for least-squares reconstruction in the Slepian basis; and (ii) the first practical existence guarantee for deep learning–based bandlimited function reconstruction, precisely quantifying the interplay among network width, training sample size, and sampling density. Numerical experiments on 1D and 2D tasks demonstrate that deep learning achieves performance comparable to—or even surpassing—that of least-squares methods. The work bridges the typical gap between theoretical feasibility and empirical performance, establishing a new data-driven paradigm for frequency-domain reconstruction.

📝 Abstract
Recovering frequency-localized functions from pointwise data is a fundamental task in signal processing. We examine this problem from an approximation-theoretic perspective, focusing on least squares and deep learning-based methods. First, we establish a novel recovery theorem for least squares approximations using the Slepian basis from uniform random samples in low dimensions, explicitly tracking the dependence of the bandwidth on the sampling complexity. Building on these results, we then present a recovery guarantee for approximating bandlimited functions via deep learning from pointwise data. This result, framed as a practical existence theorem, provides conditions on the network architecture, training procedure, and data acquisition sufficient for accurate approximation. To complement our theoretical findings, we perform numerical comparisons between least squares and deep learning for approximating one- and two-dimensional functions. We conclude with a discussion of the theoretical limitations and the practical gaps between theory and implementation.
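The abstract's first component, least-squares recovery from uniform random pointwise samples in a truncated orthogonal basis, can be sketched in a few lines. Slepian (prolate spheroidal wave) functions are not conveniently available in standard numerical libraries, so this sketch substitutes a Legendre polynomial basis; the target function, sample count, and basis size are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative smooth, frequency-localized target on [-1, 1]
f = lambda x: np.sin(2 * np.pi * x) * np.exp(-x**2)

# m uniform random samples, n basis functions (m >> n for a stable
# overdetermined least-squares problem)
m, n = 1000, 30
x = rng.uniform(-1.0, 1.0, size=m)
y = f(x)

# Design matrix: Legendre polynomials as a stand-in for the Slepian basis
A = np.polynomial.legendre.legvander(x, n - 1)

# Least-squares coefficients from the sampled data
c, *_ = np.linalg.lstsq(A, y, rcond=None)

# Evaluate the reconstruction error on a fine test grid
xt = np.linspace(-1.0, 1.0, 500)
err = np.max(np.abs(np.polynomial.legendre.legval(xt, c) - f(xt)))
print(f"max reconstruction error: {err:.2e}")
```

The oversampling ratio m/n matters here: with uniform random samples and a polynomial basis, too few samples leaves the least-squares system ill-conditioned near the endpoints, which is the kind of bandwidth-versus-sampling-complexity trade-off the paper's first theorem tracks explicitly.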
Problem

Research questions and friction points this paper is trying to address.

Reconstruct frequency-localized functions from samples
Compare least squares and deep learning methods
Analyze theoretical and practical approximation challenges
Innovation

Methods, ideas, or system contributions that make the work stand out.

Least-squares reconstruction in the Slepian orthogonal basis with explicit bandwidth–sampling complexity bounds
Practical existence theorem for deep learning reconstruction of bandlimited functions
Numerical comparison of least squares and deep learning on 1D and 2D tasks
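As a rough illustration of the deep learning side, the following trains a one-hidden-layer ReLU network on random pointwise samples by full-batch gradient descent. The target function, network width, learning rate, and iteration count are illustrative assumptions; the paper's architectures, training procedure, and guarantees are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative bandlimited target and random pointwise training data
f = lambda x: np.sin(np.pi * x)
m, width = 400, 64
x = rng.uniform(-1.0, 1.0, (m, 1))
y = f(x)

# One-hidden-layer ReLU network; random hinge locations spread over [-1, 1]
W1 = rng.normal(size=(1, width))
b1 = rng.uniform(-1.0, 1.0, width)
W2 = rng.normal(size=(width, 1)) / np.sqrt(width)
b2 = np.zeros(1)

lr = 1e-2
for _ in range(20000):
    h = np.maximum(x @ W1 + b1, 0.0)   # hidden ReLU activations
    r = (h @ W2 + b2) - y              # residual on the training samples
    g = 2.0 * r / m                    # gradient of the MSE w.r.t. predictions
    gh = (g @ W2.T) * (h > 0)          # backpropagate through the ReLU
    W2 -= lr * (h.T @ g); b2 -= lr * g.sum(0)
    W1 -= lr * (x.T @ gh); b1 -= lr * gh.sum(0)

mse = float(np.mean(((np.maximum(x @ W1 + b1, 0.0) @ W2 + b2) - y) ** 2))
print(f"training MSE: {mse:.3e}")
```

The three quantities the paper's existence theorem ties together, network width, training sample size, and sampling density, all appear here as free parameters (`width`, `m`, and the sampling domain), which is what makes the quantitative interplay between them nontrivial.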
A. M. Neuman
Faculty of Mathematics, University of Vienna, Vienna, Austria
Andres Felipe Lerma Pineda
Faculty of Mathematics, University of Vienna, Vienna, Austria
J. Bramburger
Department of Mathematics and Statistics, Concordia University, Montréal, QC, Canada
Simone Brugiapaglia
Associate Professor, Department of Mathematics and Statistics, Concordia University
Numerical Analysis · Mathematics of Data Science · Machine Learning · Computational Mathematics