Controlling the Flow: Stability and Convergence for Stochastic Gradient Descent with Decaying Regularization

📅 2025-05-16
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the minimization of convex, $L$-smooth functions over unbounded domains in separable real Hilbert spaces, with emphasis on the stable computation of the minimum-norm solution. We propose a regularized stochastic gradient descent (reg-SGD) algorithm incorporating time-varying decaying Tikhonov regularization. For the first time without assuming boundedness of iterates or parameters, we establish its strong convergence to the minimum-norm solution and derive optimal convergence rates. We uncover the dynamic stabilization mechanism induced by decaying regularization on SGD trajectories and formulate a joint optimization criterion for step sizes and regularization parameters. The theoretical framework provides a novel iterative design paradigm for ill-posed inverse problems, balancing stability and accuracy. Numerical experiments on image reconstruction and ODE inverse problems demonstrate substantial improvements in convergence behavior, robustness to noise, and solution accuracy.

Technology Category

Application Category

📝 Abstract
The present article studies the minimization of convex, L-smooth functions defined on a separable real Hilbert space. We analyze regularized stochastic gradient descent (reg-SGD), a variant of stochastic gradient descent that uses a Tikhonov regularization with time-dependent, vanishing regularization parameter. We prove strong convergence of reg-SGD to the minimum-norm solution of the original problem without additional boundedness assumptions. Moreover, we quantify the rate of convergence and optimize the interplay between step-sizes and regularization decay. Our analysis reveals how vanishing Tikhonov regularization controls the flow of SGD and yields stable learning dynamics, offering new insights into the design of iterative algorithms for convex problems, including those that arise in ill-posed inverse problems. We validate our theoretical findings through numerical experiments on image reconstruction and ODE-based inverse problems.
Problem

Research questions and friction points this paper is trying to address.

Analyzing stability and convergence of regularized stochastic gradient descent
Minimizing convex functions with vanishing Tikhonov regularization
Optimizing step-sizes and regularization decay for SGD
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Tikhonov regularization with vanishing parameter
Optimizes step-sizes and regularization decay interplay
Validated on image and ODE inverse problems
🔎 Similar Papers
No similar papers found.