Quantum-Classical Hybrid Quantized Neural Network

📅 2025-06-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the central challenges of training quantized neural networks (QNNs): nonlinear activations, deep inter-layer dependencies, and the computational intractability of large-scale quadratically constrained binary optimization (QCBO). It proposes the first quantum-classical hybrid training framework integrating forward interval propagation with spline interpolation. The method formulates QNN training as a hard-constrained QCBO problem, supports arbitrary activation and loss functions, and runs a quantum conditional gradient descent algorithm directly on a coherent Ising machine, eliminating the need for penalty-parameter tuning. Key contributions include: (i) forward interval propagation for efficient handling of nonlinearity and inter-layer dependencies; (ii) spline interpolation enabling high-fidelity, low-bit function approximation; and (iii) native constraint optimization that substantially improves solution quality and convergence speed. Evaluated on Fashion-MNIST, the resulting 1.1-bit QNN achieves 94.95% accuracy, demonstrating both the efficacy and practicality of ultra-low-bit QNN training.
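The spline-interpolation idea can be illustrated with a generic linear spline over a smooth activation: the function is sampled on a grid and each subinterval is replaced by a line segment, so the nonlinearity becomes a set of linear pieces that a binary optimizer can handle. This is a minimal sketch of the general technique, not the paper's exact construction; the choice of `tanh`, the interval `[-4, 4]`, and the segment count are illustrative.

```python
import math

def piecewise_linear(f, lo, hi, n_segments):
    """Approximate f on [lo, hi] by a linear spline with n_segments pieces."""
    xs = [lo + (hi - lo) * i / n_segments for i in range(n_segments + 1)]
    ys = [f(x) for x in xs]

    def approx(x):
        x = min(max(x, lo), hi)                       # clamp to the grid
        i = min(int((x - lo) / (hi - lo) * n_segments), n_segments - 1)
        t = (x - xs[i]) / (xs[i + 1] - xs[i])         # position within segment i
        return ys[i] + t * (ys[i + 1] - ys[i])        # linear interpolation

    return approx

# 16 linear pieces already track tanh closely on [-4, 4]
tanh_hat = piecewise_linear(math.tanh, -4.0, 4.0, 16)
err = max(abs(tanh_hat(x / 100) - math.tanh(x / 100)) for x in range(-400, 401))
```

For a twice-differentiable function the worst-case error of such a spline shrinks quadratically in the number of segments, which is why few pieces (and hence few binary variables) can suffice.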

📝 Abstract
In this work, we present a novel Quadratic Binary Optimization (QBO) model for quantized neural network training, enabling the use of arbitrary activation and loss functions through spline interpolation. We introduce Forward Interval Propagation (FIP), a method designed to tackle the challenges of non-linearity and the multi-layer composite structure of neural networks by discretizing activation functions into linear subintervals. This approach preserves the universal approximation properties of neural networks while allowing complex nonlinear functions to be optimized on quantum computers, thus broadening their applicability in artificial intelligence. We provide theoretical upper bounds on the approximation error and on the number of Ising spins required by deriving, from an optimization perspective, the sample complexity of the empirical risk minimization problem. A significant challenge in solving the associated Quadratically Constrained Binary Optimization (QCBO) model at scale is the presence of numerous constraints. When the penalty method is used to handle these constraints, tuning a large number of penalty coefficients becomes a critical hyperparameter optimization problem, increasing computational complexity and potentially degrading solution quality. To address this, we employ the Quantum Conditional Gradient Descent (QCGD) algorithm, which leverages quantum computing to solve the QCBO problem directly. We prove the convergence of QCGD under a quantum oracle with randomness and bounded variance in objective value, as well as under limited precision in the coefficient matrix, and we provide an upper bound on the Time-To-Solution of the QCBO solving process. Experimental results on a coherent Ising machine (CIM) demonstrate 94.95% accuracy on the Fashion-MNIST classification task with only 1.1-bit precision.
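The abstract's point about penalty coefficients can be seen on a toy problem: folding a constraint into the objective as a quadratic penalty only recovers the feasible optimum if the coefficient is large enough, so with many constraints each coefficient becomes a hyperparameter. The numbers below are made up for illustration and are not the paper's model; a brute-force search stands in for the quantum solver.

```python
from itertools import product

def qubo_min(objective, n):
    """Brute-force minimizer over binary vectors of length n."""
    return min(product([0, 1], repeat=n), key=objective)

# Toy problem: minimize -3*x0 - x1 - x2  subject to  x0 + x1 + x2 == 1
cost = lambda x: -3 * x[0] - x[1] - x[2]
feasible = lambda x: sum(x) == 1

def penalized(lam):
    """Penalty method: fold the equality constraint into the objective."""
    return lambda x: cost(x) + lam * (sum(x) - 1) ** 2

weak = qubo_min(penalized(0.5), 3)    # penalty too small: optimum is infeasible
strong = qubo_min(penalized(10.0), 3) # large enough penalty restores feasibility
```

With `lam=0.5` the minimizer sets two variables and violates the constraint; with `lam=10.0` it returns the feasible optimum `(1, 0, 0)`. A method such as QCGD that handles constraints natively avoids this tuning altogether.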
Problem

Research questions and friction points this paper is trying to address.

Develop QBO model for quantized neural network training
Address non-linearity in neural networks via FIP method
Solve QCBO constraints efficiently using quantum computing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quadratic Binary Optimization for quantized neural networks
Forward Interval Propagation for non-linearity challenges
Quantum Conditional Gradient Descent for QCBO problem
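In its generic form, forward interval propagation pushes input bounds through each layer with interval arithmetic: an affine layer picks the endpoint of each input interval according to the sign of the weight, and a monotone activation maps endpoints to endpoints. The sketch below shows that generic mechanism, not the paper's exact formulation; the weights and input box are illustrative.

```python
def interval_affine(lo, hi, W, b):
    """Propagate an input box through y = W x + b with interval arithmetic."""
    out_lo, out_hi = [], []
    for w_row, bias in zip(W, b):
        # a positive weight attains the output minimum at the input minimum,
        # a negative weight at the input maximum (and vice versa for the max)
        l = bias + sum(w * (lo[j] if w >= 0 else hi[j]) for j, w in enumerate(w_row))
        h = bias + sum(w * (hi[j] if w >= 0 else lo[j]) for j, w in enumerate(w_row))
        out_lo.append(l)
        out_hi.append(h)
    return out_lo, out_hi

def interval_relu(lo, hi):
    """Monotone activations map interval endpoints to endpoints."""
    return [max(0.0, l) for l in lo], [max(0.0, h) for h in hi]

# Two-input, two-unit layer with low-bit {-1, 0, 1} weights, inputs in [-1, 1]^2
W, b = [[1.0, -1.0], [0.0, 1.0]], [0.0, 0.5]
lo, hi = interval_affine([-1.0, -1.0], [1.0, 1.0], W, b)
lo, hi = interval_relu(lo, hi)
```

Because every intermediate quantity is bracketed by an interval, the nonlinearity and the layer-to-layer dependencies reduce to bounds that can be encoded as constraints on binary variables.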
Wenxin Li
Beijing QBoson Quantum Technology Co., Ltd., Beijing 100015, China
Chuan Wang
School of Artificial Intelligence, Beijing Normal University, Beijing 100875, China
Hongdong Zhu
Beijing QBoson Quantum Technology Co., Ltd., Beijing 100015, China
Qi Gao
Beijing QBoson Quantum Technology Co., Ltd., Beijing 100015, China
Yin Ma
Beijing QBoson Quantum Technology Co., Ltd., Beijing 100015, China
Hai Wei
Principal Scientist, Amazon
Video Compression, Video Quality, Computer Vision, Video Streaming, Medical Image Processing
Kai Wen
Stanford University
Quantum computation and communication, Optimization, Stochastic simulation