🤖 AI Summary
This paper addresses the joint estimation of frequencies and amplitudes of a nonharmonic Fourier sum from noisy observations. We propose Gradient-MUSIC, a novel algorithm that reformulates the classical MUSIC spectrum as a differentiable nonconvex optimization objective and employs coarse-grid thresholding for initialization, ensuring exponential convergence of gradient descent to the desired local minima. Theoretically, Gradient-MUSIC achieves the minimax optimal rate under deterministic $\ell^p$ perturbations and provides the first rigorous finite-sample convergence guarantee for MUSIC-type methods. Under the separation condition $m\Delta \geq 8\pi$, the frequency and amplitude estimation errors are $O(\varepsilon/m)$ and $O(\varepsilon)$, respectively, for noise with $\ell^\infty$ norm at most $\varepsilon$, matching the information-theoretic lower bounds. Computationally, Gradient-MUSIC avoids exhaustive search on a thin grid, yielding significantly lower complexity than classical MUSIC, especially in low-noise regimes.
📄 Abstract
The goal of spectral estimation is to estimate the frequencies and amplitudes of a nonharmonic Fourier sum given noisy time samples. This paper introduces the Gradient-MUSIC algorithm, which is a novel nonconvex optimization reformulation of the classical MUSIC algorithm. Under the assumption that $m\Delta\geq 8\pi$, where $\pi/m$ is the Nyquist rate and $\Delta$ is the minimum separation of the frequencies normalized to be in $[0,2\pi)$, we provide a thorough geometric analysis of the objective functions generated by the algorithm. Gradient-MUSIC thresholds the objective function on a set that is as coarse as possible and locates a set of suitable initializations for gradient descent. Although the objective function is nonconvex, gradient descent converges exponentially fast to the desired local minima, which are the estimated frequencies of the signal. For deterministic $\ell^p$ perturbations and any $p\in [1,\infty]$, Gradient-MUSIC estimates the frequencies and amplitudes at the minimax optimal rate in terms of the noise level and $m$. For example, if the noise has $\ell^\infty$ norm at most $\epsilon$, then the frequencies and amplitudes are recovered up to error at most $C\epsilon/m$ and $C\epsilon$, respectively, which are optimal in $\epsilon$ and $m$. Aside from logarithmic factors, Gradient-MUSIC is optimal for white noise and matches the rate achieved by nonlinear least squares for various families of nonstationary independent Gaussian noise. Our results show that classical MUSIC is equally optimal, but it requires an expensive search on a thin grid, whereas Gradient-MUSIC is always computationally more efficient, especially for small noise. As a consequence of this paper, for sufficiently well separated frequencies, both Gradient-MUSIC and classical MUSIC are the first provably optimal and computationally tractable algorithms for deterministic $\ell^p$ perturbations.
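The pipeline described above (noise-subspace objective, coarse-grid thresholding, then gradient descent) can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's algorithm verbatim: the Hankel matrix size, grid density, step size, iteration count, and threshold `tau` below are assumptions chosen for readability, not the tuned constants from the analysis.

```python
import numpy as np

def gradient_music(y, s, m, tau=0.5, iters=200):
    """Sketch of Gradient-MUSIC for samples y[k] = sum_j a_j e^{i x_j k},
    k = 0, ..., 2m, with s frequencies. Returns estimated frequencies.
    All tuning constants here are illustrative assumptions."""
    # (m+1) x (m+1) Hankel matrix; after a rank-s truncated SVD, the
    # remaining left singular vectors span the empirical noise subspace.
    H = np.array([[y[j + k] for k in range(m + 1)] for j in range(m + 1)])
    U, _, _ = np.linalg.svd(H)
    W = U[:, s:]                      # noise-subspace basis
    L = m + 1
    ks = np.arange(L)

    def q(x):
        # MUSIC landscape: squared norm of the noise-subspace component
        # of the normalized steering vector; zero at the true frequencies.
        phi = np.exp(1j * ks * x) / np.sqrt(L)
        return np.linalg.norm(W.conj().T @ phi) ** 2

    def dq(x):
        # Derivative: q'(x) = 2 Re <phi'(x), P phi(x)>, P = W W^*.
        phi = np.exp(1j * ks * x) / np.sqrt(L)
        dphi = 1j * ks * phi
        Pphi = W @ (W.conj().T @ phi)
        return 2.0 * np.real(dphi.conj() @ Pphi)

    # Coarse-grid thresholding: keep grid points where the landscape dips.
    grid = np.linspace(0.0, 2 * np.pi, 8 * m, endpoint=False)
    good = [x for x in grid if q(x) < tau]

    # Gradient descent from each surviving point (step ~ 1/m^2 since the
    # curvature at the minima scales like m^2), then merge duplicates.
    step = 1.0 / m**2
    mins = []
    for x in good:
        for _ in range(iters):
            x = x - step * dq(x)
        mins.append(x % (2 * np.pi))
    mins.sort()
    freqs = []
    for x in mins:
        if not freqs or abs(x - freqs[-1]) > 1.0 / m:
            freqs.append(x)
    return freqs[:s]
```

The thresholding step replaces classical MUSIC's exhaustive thin-grid search: only a coarse grid (spacing on the order of $1/m$) is evaluated, and the exponential convergence of the descent does the rest of the localization.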