🤖 AI Summary
This work addresses two key limitations of the Beurling-LASSO (BLASSO) for continuous sparse regularization: the difficulty of verifying the technical local positive curvature (LPC) condition, and the relatively large localization error near the true support. The authors propose a kernel-switch mechanism that decouples the model kernel from the LPC requirement: any kernel already known to satisfy LPC can serve as a pivot kernel for deriving error bounds, provided suitable embedding conditions hold between the model and pivot reproducing kernel Hilbert spaces. The list of verifiably LPC kernels is extended with the "sinc-4" kernel, used in signal recovery and mixture problems. The paper further shows that BLASSO's localization error around the true support decreases with the noise level, improving on earlier results in which this error is fixed by parameters of the model kernel. The results are illustrated on translation-invariant mixture model estimation, where bandlimiting smoothing and sketching techniques reduce the computational burden while preserving recovery accuracy.
📝 Abstract
This paper advances the general theory of continuous sparse regularisation on measures with the Beurling-LASSO (BLASSO). This TV-regularised convex program on the space of measures makes it possible to recover a sparse measure from a noisy observation produced by an appropriate measurement operator. While previous works have uncovered the central role played by this operator and its associated kernel in deriving estimation error bounds, these bounds require a technical local positive curvature (LPC) assumption to be verified on a case-by-case basis. In practice, this condition has been proved for only a few LPC kernels. At the heart of our contribution lies the kernel switch, which uncouples the model kernel from the LPC assumption: it allows any known LPC kernel to be leveraged as a pivot kernel in proving error bounds, provided embedding conditions are verified between the model and pivot RKHSs. We extend the list of LPC kernels, proving that the "sinc-4" kernel, used for signal recovery and mixture problems, does satisfy the LPC assumption. Furthermore, we show that the BLASSO localisation error around the true support decreases with the noise level, leading to effective near regions. This improves on known results, where this error is fixed by parameters depending on the model kernel. We illustrate the interest of our results in the case of translation-invariant mixture model estimation, using bandlimiting smoothing and sketching techniques to reduce the computational burden of BLASSO.
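To make the setting concrete, here is a minimal NumPy sketch of the standard grid discretization of BLASSO: once candidate spike locations are fixed on a grid, the TV-regularised program over measures reduces to an ordinary LASSO on the amplitudes, solved below with FISTA. The kernel is taken to be the fourth power of the normalized sinc, an assumption about what "sinc-4" denotes; all grid sizes, spike positions, and the regularization parameter `lam` are illustrative choices, not values from the paper.

```python
import numpy as np

def sinc4(t):
    # Assumed "sinc-4" kernel: fourth power of the normalized sinc
    # (np.sinc(t) = sin(pi t) / (pi t)); purely illustrative.
    return np.sinc(t) ** 4

# Grid discretization: measure support restricted to `grid`,
# so BLASSO becomes a LASSO on the amplitude vector x.
grid = np.linspace(-5.0, 5.0, 201)        # candidate spike locations
samples = np.linspace(-5.0, 5.0, 120)     # measurement locations
A = sinc4(samples[:, None] - grid[None, :])  # linearized forward operator

# Synthetic sparse measure: two well-separated spikes, plus small noise.
rng = np.random.default_rng(0)
true_pos = np.array([-1.5, 2.0])
true_amp = np.array([1.0, 0.7])
y = sinc4(samples[:, None] - true_pos[None, :]) @ true_amp
y = y + 0.01 * rng.standard_normal(y.shape)

# FISTA for min_x 0.5*||A x - y||^2 + lam*||x||_1 (discrete TV norm).
lam = 0.05
L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
x = np.zeros(grid.size)
z, t = x.copy(), 1.0
for _ in range(3000):
    g = A.T @ (A @ z - y)                 # gradient of the data-fit term
    x_new = z - g / L
    x_new = np.sign(x_new) * np.maximum(np.abs(x_new) - lam / L, 0.0)
    t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
    z = x_new + ((t - 1.0) / t_new) * (x_new - x)
    x, t = x_new, t_new

# Estimated support: grid points carrying non-negligible mass.
support = grid[np.abs(x) > 0.1]
print(support)
```

At low noise the surviving grid points cluster near the true spike locations, which is exactly the localisation behaviour the error bounds quantify; the gridless BLASSO analysed in the paper removes the discretization and works directly over continuous positions.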