Variational Stochastic Gradient Descent for Deep Neural Networks

📅 2024-04-09
🏛️ arXiv.org
🤖 AI Summary
Existing adaptive optimizers (e.g., Adam) struggle to effectively model gradient uncertainty, limiting the convergence and generalization of deep neural networks. To address this, the paper proposes Variational Stochastic Gradient Descent (VSGD), a framework that integrates adaptive gradient optimization with probabilistic modeling. Grounded in stochastic variational inference (SVI), VSGD models gradient updates as a latent-variable probabilistic process and derives computationally efficient, adaptive update rules. VSGD also establishes a theoretical connection to classical adaptive optimizers such as Adam, while naturally enabling quantification of gradient uncertainty. Empirical evaluation on two image classification benchmarks and four network architectures shows that VSGD outperforms both SGD and Adam, achieving faster convergence and better generalization.

📝 Abstract
Current state-of-the-art optimizers are adaptive gradient-based optimization methods such as Adam. Recently, there has been an increasing interest in formulating gradient-based optimizers in a probabilistic framework for better modeling the uncertainty of the gradients. Here, we propose to combine both approaches, resulting in the Variational Stochastic Gradient Descent (VSGD) optimizer. We model gradient updates as a probabilistic model and utilize stochastic variational inference (SVI) to derive an efficient and effective update rule. Further, we show how our VSGD method relates to other adaptive gradient-based optimizers like Adam. Lastly, we carry out experiments on two image classification datasets and four deep neural network architectures, where we show that VSGD outperforms Adam and SGD.
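The core idea in the abstract, treating each observed minibatch gradient as a noisy measurement of a latent "true" gradient and updating a posterior belief over it, can be illustrated with a toy sketch. This is only an illustration of the general approach, not the paper's actual VSGD derivation or update rule; the class name, hyperparameters (`obs_precision`, `forget`), and the shrinkage factor are all invented for this example.

```python
import numpy as np

class ToyVariationalSGD:
    """Toy sketch (NOT the paper's algorithm): maintain a Gaussian posterior
    over the latent true gradient from noisy minibatch observations, then
    step with the posterior mean, shrinking steps where uncertainty is high."""

    def __init__(self, lr=0.1, obs_precision=4.0, forget=0.9):
        self.lr = lr
        self.obs_tau = obs_precision  # assumed minibatch-noise precision (placeholder)
        self.forget = forget          # discounts old evidence so the posterior can track
        self.mu = 0.0                 # posterior mean of the latent gradient
        self.tau = 1e-8               # posterior precision (starts near-uninformative)

    def step(self, param, noisy_grad):
        # Forget part of the past evidence, then do a conjugate Gaussian update:
        # precisions add, means combine weighted by precision.
        tau_prior = self.forget * self.tau
        self.tau = tau_prior + self.obs_tau
        self.mu = (tau_prior * self.mu + self.obs_tau * noisy_grad) / self.tau
        # Trust factor in [0, 1): step cautiously while the estimate is uncertain.
        shrink = self.tau / (self.tau + 1.0)
        return param - self.lr * shrink * self.mu
```

For instance, minimizing a noisy 1-D quadratic with this sketch converges to the minimum despite the gradient noise, because the posterior mean averages out the observation noise much like a momentum buffer does in Adam.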
Problem

Research questions and friction points this paper is trying to address.

How to combine probabilistic modeling of gradient uncertainty with adaptive gradient-based optimization
How to derive an efficient, practical update rule from the probabilistic model via stochastic variational inference (SVI)
Whether such a probabilistic optimizer can outperform Adam and SGD on standard image classification tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines adaptive and probabilistic gradient optimization
Uses stochastic variational inference for updates
Outperforms Adam and SGD in experiments
Haotian Chen
University of California, Los Angeles
Political Economy · Non-market Strategy · American Politics
Anna Kuzina
Senior Researcher, Qualcomm
Babak Esmaeili
Department of Mathematics and Computer Science, Eindhoven University of Technology, Eindhoven, the Netherlands
J. Tomczak
Department of Mathematics and Computer Science, Eindhoven University of Technology, Eindhoven, the Netherlands