Trainability-Oriented Hybrid Quantum Regression via Geometric Preconditioning and Curriculum Optimization

📅 2026-01-17
📈 Citations: 1
Influential: 0
🤖 AI Summary
This work addresses the challenges of training quantum neural networks for regression tasks, which are often hindered by noisy gradients and ill-conditioned optimization landscapes. To overcome these issues, the authors propose a hybrid quantum-classical regression framework that employs a learnable classical geometric preconditioner to refine input representations and integrates a curriculum learning strategy that progressively increases both the depth of the quantum circuit and the precision of optimization. The approach combines a lightweight classical embedding, variational quantum circuits, and a hybrid optimizer blending SPSA and Adam. Evaluated on PDE-guided regression tasks and standard datasets, the method consistently outperforms pure QNN baselines, demonstrating more stable convergence, substantially reduced structured error, and improved trainability and robustness of quantum regression.

📝 Abstract
Quantum neural networks (QNNs) have attracted growing interest for scientific machine learning, yet in regression settings they often suffer from limited trainability under noisy gradients and ill-conditioned optimization. We propose a hybrid quantum-classical regression framework designed to mitigate these bottlenecks. Our model prepends a lightweight classical embedding that acts as a learnable geometric preconditioner, reshaping the input representation to better condition a downstream variational quantum circuit. Building on this architecture, we introduce a curriculum optimization protocol that progressively increases circuit depth and transitions from SPSA-based stochastic exploration to Adam-based gradient fine-tuning. We evaluate the approach on PDE-informed regression benchmarks and standard regression datasets under a fixed training budget in a simulator setting. Empirically, the proposed framework consistently improves over pure QNN baselines and yields more stable convergence in data-limited regimes. We further observe reduced structured errors that are visually correlated with oscillatory components on several scientific benchmarks, suggesting that geometric preconditioning combined with curriculum training is a practical approach for stabilizing quantum regression.
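The abstract's curriculum optimization protocol — an SPSA-based stochastic exploration phase followed by Adam-based gradient fine-tuning — can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the sinusoidal regressor `predict` stands in for the actual classical-embedding-plus-variational-circuit model, and all names, hyperparameters, and the loss are illustrative assumptions.

```python
import numpy as np

# Toy stand-in for the paper's hybrid model: a small sinusoidal regressor
# f(x; w) = w0 * sin(w1 * x) + w2. Illustrative only -- the paper uses a
# classical embedding feeding a variational quantum circuit.
rng = np.random.default_rng(0)
X = np.linspace(-1.0, 1.0, 64)
y = np.sin(3.0 * X)  # target with an oscillatory component

def predict(w, x):
    return w[0] * np.sin(w[1] * x) + w[2]

def loss(w):
    return np.mean((predict(w, X) - y) ** 2)

def spsa_grad(w, c=0.05):
    # SPSA: one simultaneous +/- perturbation pair estimates the whole
    # gradient, which suits settings where per-parameter gradients are noisy
    # or expensive (as on quantum hardware).
    delta = rng.choice([-1.0, 1.0], size=w.shape)
    return (loss(w + c * delta) - loss(w - c * delta)) / (2.0 * c) * delta

def num_grad(w, eps=1e-5):
    # Finite-difference gradient for the fine-tuning phase (a stand-in for
    # exact or parameter-shift gradients).
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (loss(w + e) - loss(w - e)) / (2.0 * eps)
    return g

w = np.array([0.5, 2.5, 0.1])
initial_loss = loss(w)

# Phase 1: SPSA-based stochastic exploration.
for _ in range(300):
    w -= 0.05 * spsa_grad(w)

# Phase 2: Adam-based gradient fine-tuning.
m, v = np.zeros_like(w), np.zeros_like(w)
b1, b2, lr, adam_eps = 0.9, 0.999, 0.02, 1e-8
for t in range(1, 301):
    g = num_grad(w)
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    w -= lr * (m / (1 - b1 ** t)) / (np.sqrt(v / (1 - b2 ** t)) + adam_eps)

final_loss = loss(w)
print(initial_loss, final_loss)
```

The two-phase schedule mirrors the paper's curriculum idea: a cheap, noise-tolerant optimizer first escapes poor regions, after which a higher-precision gradient method refines the fit.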
Problem

Research questions and friction points this paper is trying to address.

quantum neural networks
trainability
regression
noisy gradients
ill-conditioned optimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

geometric preconditioning
curriculum optimization
hybrid quantum-classical regression
quantum neural networks
trainability
Qingyu Meng
University of Utah
Parallel Computing
Yangshuai Wang
Department of Mathematics, National University of Singapore, 10 Lower Kent Ridge Road, Singapore.