🤖 AI Summary
This work addresses the problem of multidimensional super-resolution frequency estimation for non-harmonic signals from noisy samples. The authors propose the signal-subspace-based MUSIC functional as a non-convex optimization objective and establish a geometric landscape theory for the perturbed MUSIC function, proving that it possesses a benign optimization structure. Leveraging this insight, they develop the first optimization framework with constructive global convergence guarantees for this problem. By combining coarse-threshold initialization with gradient descent, the method achieves uniform, non-asymptotic recovery guarantees under both lattice sampling on a cube and continuous sampling on a ball: it attains the minimax-optimal error bound under ℓ∞ noise and achieves super-resolution error scaling under Gaussian noise.
📝 Abstract
We develop a multidimensional version of Gradient-MUSIC for estimating the frequencies of a nonharmonic signal from noisy samples. The guiding principle is that frequency recovery should be based only on the signal subspace determined by the data. From this viewpoint, the MUSIC functional is an economical nonconvex objective encoding the relevant information, and the problem becomes one of understanding the geometry of its perturbed landscape.
Our main contribution is a general structural theory showing that, under explicit conditions on the measurement kernel and the perturbation of the signal subspace, the perturbed MUSIC function is an admissible optimization landscape: suitable initial points can be found efficiently by coarse thresholding, gradient descent converges to the relevant local minima, and these minima obey quantitative error bounds. Thus the theory is not merely existential; it provides a constructive global optimization framework for multidimensional super-resolution.
We verify the abstract conditions in detail for two canonical sampling geometries: discrete samples on a cube and continuous samples on a ball. In both cases we obtain uniform, nonasymptotic recovery guarantees under deterministic as well as stochastic noise. In particular, for lattice samples in a cube of side length $4m$, if the true frequencies are separated by at least $\beta_d/m$ and the noise has $\ell^\infty$ norm at most $\varepsilon$, then Gradient-MUSIC recovers the frequencies with error at most \[ C_d \frac{\varepsilon}{m}, \] where $C_d, \beta_d>0$ depend only on the dimension. This scaling is minimax optimal in $m$ and $\varepsilon$. Under stationary Gaussian noise, the error improves to \[ C_d\frac{\sigma\sqrt{\log(m)}}{m^{1+d/2}}. \] This is the noisy super-resolution scaling. (See the paper for the rest of the abstract.)
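The pipeline described above (estimate the signal subspace from the data, evaluate the MUSIC objective, find coarse initial points, and refine them by gradient descent) can be sketched in one dimension as follows. This is a simplified illustration under our own assumptions, not the paper's algorithm: the function names are hypothetical, the coarse initialization keeps the $s$ deepest grid minima rather than applying the paper's thresholding test, and the gradient is a central difference rather than an analytic derivative.

```python
import numpy as np

def music_objective(x, U_noise, m):
    # MUSIC value at frequency x: squared norm of the projection of the
    # normalized steering vector (e^{ikx})_{k=0..m} onto the estimated
    # noise subspace. It vanishes at the true frequencies when noiseless.
    phi = np.exp(1j * np.arange(m + 1) * x) / np.sqrt(m + 1)
    return np.linalg.norm(U_noise.conj().T @ phi) ** 2

def gradient_music_1d(y, s, m, grid_size=512, steps=500, lr=0.01):
    # Signal subspace: leading s left singular vectors of the Hankel
    # matrix built from the 2m+1 samples y_0, ..., y_{2m}.
    H = np.array([[y[j + k] for k in range(m + 1)] for j in range(m + 1)])
    U, _, _ = np.linalg.svd(H)
    U_noise = U[:, s:]  # orthogonal complement of the signal subspace

    # Coarse initialization: evaluate the objective on a grid and keep
    # the s deepest local minima (a stand-in for coarse thresholding).
    grid = np.linspace(0.0, 2 * np.pi, grid_size, endpoint=False)
    vals = np.array([music_objective(x, U_noise, m) for x in grid])
    is_min = (vals < np.roll(vals, 1)) & (vals < np.roll(vals, -1))
    inits = grid[is_min][np.argsort(vals[is_min])[:s]]

    # Refinement: plain gradient descent on the MUSIC objective, with a
    # central-difference gradient for simplicity.
    estimates = []
    for x in inits:
        for _ in range(steps):
            h = 1e-6
            g = (music_objective(x + h, U_noise, m)
                 - music_objective(x - h, U_noise, m)) / (2 * h)
            x = x - lr * g
        estimates.append(x % (2 * np.pi))
    return np.sort(estimates)

# Demo: three well-separated frequencies, noiseless samples on {0,...,2m}.
m, true_freqs = 16, np.array([1.0, 2.5, 4.0])
k = np.arange(2 * m + 1)
y = sum(np.exp(1j * f * k) for f in true_freqs)
est = gradient_music_1d(y, s=3, m=m)
```

In the noiseless demo the grid minima land inside the basins of the true frequencies, and gradient descent refines them well below grid resolution; the multidimensional theory in the paper makes this two-stage behavior rigorous under explicit separation and noise conditions.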