Optimality of Gradient-MUSIC for Spectral Estimation

📅 2025-04-09
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the joint estimation of frequencies and amplitudes of nonharmonic Fourier sums from noisy observations. We propose Gradient-MUSIC, a novel algorithm that reformulates the classical MUSIC spectrum as a differentiable nonconvex optimization objective and employs coarse-grid thresholding for initialization, ensuring exponentially fast convergence of gradient descent to the desired local minima. Theoretically, Gradient-MUSIC achieves the minimax optimal rate under deterministic $\ell^p$ perturbations and provides the first rigorous finite-sample convergence guarantee for MUSIC-type methods. Under the resolution condition $m\Delta \geq 8\pi$, the frequency and amplitude estimation errors are $O(\varepsilon/m)$ and $O(\varepsilon)$, respectively, for $\ell^\infty$-bounded noise, matching the information-theoretic lower bounds. Computationally, Gradient-MUSIC avoids exhaustive grid search, yielding significantly lower complexity than classical MUSIC, especially in low-noise regimes.

πŸ“ Abstract
The goal of spectral estimation is to estimate the frequencies and amplitudes of a nonharmonic Fourier sum given noisy time samples. This paper introduces the Gradient-MUSIC algorithm, which is a novel nonconvex optimization reformulation of the classical MUSIC algorithm. Under the assumption that $m\Delta \geq 8\pi$, where $\pi/m$ is the Nyquist rate and $\Delta$ is the minimum separation of the frequencies normalized to be in $[0,2\pi)$, we provide a thorough geometric analysis of the objective functions generated by the algorithm. Gradient-MUSIC thresholds the objective function on a set that is as coarse as possible and locates a set of suitable initializations for gradient descent. Although the objective function is nonconvex, gradient descent converges exponentially fast to the desired local minima, which are the estimated frequencies of the signal. For deterministic $\ell^p$ perturbations and any $p \in [1,\infty]$, Gradient-MUSIC estimates the frequencies and amplitudes at the minimax optimal rate in terms of the noise level and $m$. For example, if the noise has $\ell^\infty$ norm at most $\epsilon$, then the frequencies and amplitudes are recovered up to error at most $C\epsilon/m$ and $C\epsilon$, respectively, which are optimal in $\epsilon$ and $m$. Aside from logarithmic factors, Gradient-MUSIC is optimal for white noise and matches the rate achieved by nonlinear least squares for various families of nonstationary independent Gaussian noise. Our results show that classical MUSIC is equally optimal, but it requires an expensive search on a thin grid, whereas Gradient-MUSIC is always computationally more efficient, especially for small noise. As a consequence of this paper, for sufficiently well separated frequencies, both Gradient-MUSIC and classical MUSIC are the first provably optimal and computationally tractable algorithms for deterministic $\ell^p$ perturbations.
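For orientation, the landscape that the paper reworks can be sketched via the standard MUSIC noise-subspace objective; this is an assumed illustrative form, not necessarily the paper's exact differentiable reformulation:

```latex
% Standard MUSIC noise-subspace objective (assumed form, for illustration):
% W has orthonormal columns spanning the noise subspace of a Hankel matrix
% built from the samples, and v(\omega) is a normalized steering vector.
q(\omega) = \left\| W^{*} v(\omega) \right\|_{2}^{2},
\qquad
v(\omega) = \frac{1}{\sqrt{L+1}} \left( 1,\, e^{i\omega},\, \dots,\, e^{iL\omega} \right)^{\top}.
```

In the noiseless case $q$ vanishes exactly at the true frequencies, so the estimated frequencies are the local minima that gradient descent is steered toward after the coarse-grid thresholding step.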
Problem

Research questions and friction points this paper is trying to address.

Estimating frequencies and amplitudes from noisy nonharmonic Fourier sums
Analyzing Gradient-MUSIC's nonconvex optimization for spectral estimation
Proving optimality and efficiency of Gradient-MUSIC versus classical MUSIC
Innovation

Methods, ideas, or system contributions that make the work stand out.

Nonconvex optimization reformulation of MUSIC
Coarse thresholding for gradient descent initialization
Exponential convergence to local minima
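The pipeline described above (noise-subspace objective, coarse-grid thresholding, gradient-descent refinement) can be sketched as follows. This is a minimal illustrative implementation assuming a Hankel-matrix noise subspace and the standard MUSIC objective; the step size, grid size, threshold, and dedup radius are ad hoc choices, not the paper's prescriptions.

```python
import numpy as np

def gradient_music(y, s, grid_size=64, lr=1e-3, iters=300, thresh=0.5):
    """Sketch of a Gradient-MUSIC-style estimator (not the paper's exact
    algorithm): form a Hankel matrix from the m samples y, extract the
    noise subspace, threshold the MUSIC objective on a coarse grid, and
    refine each surviving candidate by plain gradient descent."""
    m = len(y)
    L = m // 2
    # Hankel matrix of the samples; for a noiseless sum of s complex
    # exponentials it has rank exactly s.
    H = np.array([[y[i + j] for j in range(m - L)] for i in range(L + 1)])
    U, _, _ = np.linalg.svd(H)
    W = U[:, s:]                      # orthonormal basis of the noise subspace
    n = np.arange(L + 1)

    def q(w):
        # MUSIC objective ||W* v(w)||^2 with a normalized steering vector;
        # it vanishes at the true frequencies in the noiseless case.
        v = np.exp(1j * n * w) / np.sqrt(L + 1)
        return np.linalg.norm(W.conj().T @ v) ** 2

    def dq(w):
        # d/dw q(w) = 2 Re( (dv/dw)* W W* v ) by the chain rule.
        v = np.exp(1j * n * w) / np.sqrt(L + 1)
        dv = 1j * n * v
        return 2.0 * np.real(dv.conj() @ (W @ (W.conj().T @ v)))

    # Coarse-grid thresholding: keep grid points where the objective is
    # small, then greedily keep one well-separated point per cluster.
    grid = np.linspace(0.0, 2.0 * np.pi, grid_size, endpoint=False)
    vals = np.array([q(w) for w in grid])
    order = [k for k in np.argsort(vals) if vals[k] < thresh]
    sep = 4.0 * np.pi / grid_size     # ad hoc dedup radius for this sketch
    inits = []
    for k in order:
        w = grid[k]
        if all(min(abs(w - u), 2 * np.pi - abs(w - u)) > sep for u in inits):
            inits.append(w)
        if len(inits) == s:
            break

    # Gradient descent from each initialization, wrapped to [0, 2*pi).
    freqs = []
    for w in inits:
        for _ in range(iters):
            w = (w - lr * dq(w)) % (2.0 * np.pi)
        freqs.append(w)
    return np.sort(np.array(freqs))
```

Once the frequencies are estimated, the amplitudes can be recovered by least squares on the resulting Vandermonde system, which is consistent with the $O(\varepsilon)$ amplitude rate stated in the summary.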