WSBD: Freezing-Based Optimizer for Quantum Neural Networks

📅 2026-02-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
Training quantum neural networks is often hindered by high computational costs and the barren plateau problem. To address these challenges, this work proposes the Weighted Stochastic Block Descent (WSBD) optimizer, which combines gradient-derived importance scoring, a parameter-wise dynamic freezing mechanism, and stochastic block-wise optimization. This approach significantly reduces the number of forward passes while preserving model expressivity. Theoretical analysis provides formal convergence guarantees, and empirical results show that WSBD converges on average 63.9% faster than Adam on ground-state energy estimation tasks, with gains that grow as network size increases.

📝 Abstract
The training of Quantum Neural Networks (QNNs) is hindered by the high computational cost of gradient estimation and the barren plateau problem, where optimization landscapes become intractably flat. To address these challenges, we introduce Weighted Stochastic Block Descent (WSBD), a novel optimizer with a dynamic, parameter-wise freezing strategy. WSBD intelligently focuses computational resources by identifying and temporarily freezing less influential parameters based on a gradient-derived importance score. This approach significantly reduces the number of forward passes required per training step and helps navigate the optimization landscape more effectively. Unlike pruning or layer-wise freezing, WSBD maintains full expressive capacity while adapting throughout training. Our extensive evaluation shows that WSBD converges on average 63.9% faster than Adam for the popular ground-state-energy problem, an advantage that grows with QNN size. We provide a formal convergence proof for WSBD and show that parameter-wise freezing outperforms traditional layer-wise approaches in QNNs. Project page: https://github.com/Damrl-lab/WSBD-Stochastic-Freezing-Optimizer.
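The mechanism the abstract describes can be illustrated with a minimal NumPy sketch: score each parameter by an exponential moving average of its gradient magnitude, freeze the least important fraction, and update only a randomly sampled block of the remaining parameters. The function names, hyperparameters (`freeze_frac`, `refresh_every`, the EMA decay `beta`), and the quadratic toy objective below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def wsbd_train(params, grad_fn, steps=600, lr=0.1, beta=0.9,
               freeze_frac=0.5, refresh_every=10):
    """Sketch of a WSBD-style loop: EMA importance scores, parameter-wise
    freezing of the least important fraction, and a stochastic block update
    over the remaining (active) parameters."""
    importance = np.abs(grad_fn(params))  # initial scores from one full gradient
    for t in range(steps):
        if t % refresh_every == 0:
            # A periodic full gradient lets frozen parameters regain
            # importance and thaw later (the "dynamic" part of the strategy).
            importance = beta * importance + (1 - beta) * np.abs(grad_fn(params))
        n_frozen = int(len(params) * freeze_frac)
        active = np.argsort(importance)[n_frozen:]   # most-important indices
        block = rng.choice(active, size=max(1, len(active) // 2), replace=False)
        # Gradients are estimated only for the sampled block; on quantum
        # hardware this is where the forward-pass savings would come from.
        g = np.zeros_like(params)
        g[block] = grad_fn(params)[block]
        params = params - lr * g
        importance = beta * importance + (1 - beta) * np.abs(g)
    return params

# Toy quadratic objective standing in for a ground-state-energy landscape.
target = np.array([1.0, -2.0, 0.5, 3.0])
grad_fn = lambda p: 2.0 * (p - target)
trained = wsbd_train(np.zeros(4), grad_fn)
```

Because frozen parameters receive no per-step gradient, their scores decay between refreshes; the periodic full-gradient refresh is what keeps freezing temporary rather than permanent, in the spirit of the "dynamic" freezing the abstract contrasts with pruning.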
Problem

Research questions and friction points this paper is trying to address.

Quantum Neural Networks
barren plateau
gradient estimation
optimization landscape
computational cost
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quantum Neural Networks
Parameter-wise Freezing
Barren Plateau
Gradient Estimation
Stochastic Optimization