QIBONN: A Quantum-Inspired Bilevel Optimizer for Neural Networks on Tabular Classification

πŸ“… 2025-11-12
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Addressing the challenges of large search spaces, strong non-convexity, and high tuning costs in neural hyperparameter optimization (HPO) for tabular data, this paper proposes QIBONN, a quantum-inspired bilevel optimization framework. Methodologically, QIBONN employs a unified qubit-based encoding to jointly represent feature selection, network architecture, and regularization hyperparameters, and introduces a hybrid search mechanism that integrates deterministic rotation with global-attractor-guided stochastic mutation to balance exploration and exploitation. Robustness is evaluated under single-qubit bit-flip noise (0.1%–1%) emulated on an IBM-Q backend. Extensive experiments across 13 real-world tabular datasets demonstrate that, under identical tuning budgets, QIBONN matches or surpasses state-of-the-art tree-based models and existing classical and quantum-inspired HPO methods in predictive performance, offering an efficient and scalable approach to HPO for tabular learning.

πŸ“ Abstract
Hyperparameter optimization (HPO) for neural networks on tabular data is critical to a wide range of applications, yet it remains challenging due to large, non-convex search spaces and the cost of exhaustive tuning. We introduce the Quantum-Inspired Bilevel Optimizer for Neural Networks (QIBONN), a bilevel framework that encodes feature selection, architectural hyperparameters, and regularization in a unified qubit-based representation. By combining deterministic quantum-inspired rotations with stochastic qubit mutations guided by a global attractor, QIBONN balances exploration and exploitation under a fixed evaluation budget. We conduct systematic experiments under single-qubit bit-flip noise (0.1%--1%) emulated by an IBM-Q backend. Results on 13 real-world datasets indicate that QIBONN is competitive with established methods, including classical tree-based methods and both classical/quantum-inspired HPO algorithms under the same tuning budget.
Problem

Research questions and friction points this paper is trying to address.

Optimizing neural network hyperparameters for tabular classification tasks
Addressing large non-convex search spaces in hyperparameter optimization
Reducing computational costs of exhaustive tuning through quantum-inspired methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quantum-inspired bilevel optimizer for neural networks
Unified qubit representation for feature and architecture selection
Combines deterministic rotations with stochastic qubit mutations
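The rotate-and-mutate loop listed above can be sketched as follows. This is a minimal illustration of the generic quantum-inspired evolutionary pattern (angle-encoded qubits, deterministic rotation toward a global-best attractor, stochastic bit-flip mutation), not the authors' implementation; every function name and the rotation step `delta` are assumptions for illustration.

```python
import math
import random

def sample_bits(thetas):
    """Collapse each qubit angle to a classical bit: P(bit = 1) = sin^2(theta)."""
    return [1 if random.random() < math.sin(t) ** 2 else 0 for t in thetas]

def rotate_toward(thetas, attractor_bits, delta=0.05 * math.pi):
    """Deterministic rotation: nudge each angle toward the attractor's bit.

    Bit 1 pulls the angle toward pi/2 (P(1) -> 1); bit 0 pulls it toward 0.
    """
    rotated = []
    for t, b in zip(thetas, attractor_bits):
        target = math.pi / 2 if b == 1 else 0.0
        step = delta if target > t else (-delta if target < t else 0.0)
        rotated.append(t + step)
    return rotated

def mutate(thetas, p_flip=0.01):
    """Stochastic qubit mutation: with probability p_flip, reflect the angle
    about pi/4, which swaps P(0) and P(1) -- a bit-flip on the distribution."""
    return [math.pi / 2 - t if random.random() < p_flip else t for t in thetas]

def optimize(fitness, n_bits, iters=50, pop=8):
    """Toy outer loop: sample candidates, track a global-best attractor,
    rotate the shared angles toward it, then apply flip noise."""
    thetas = [math.pi / 4] * n_bits          # uniform start: P(1) = 0.5 per bit
    best_bits, best_fit = None, float("-inf")
    for _ in range(iters):
        for _ in range(pop):
            bits = sample_bits(thetas)
            f = fitness(bits)
            if f > best_fit:
                best_bits, best_fit = bits, f
        thetas = mutate(rotate_toward(thetas, best_bits))
    return best_bits, best_fit
```

On a toy "one-max" objective (`fitness = sum`) the attractor quickly pulls every angle toward π/2. In QIBONN's bilevel setting the sampled bit string would instead decode to a feature mask plus architecture and regularization hyperparameters, and the fitness would be the validation score of the inner-level trained network under the fixed evaluation budget.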
Pedro Chumpitaz-Flores
University of South Florida, Tampa, FL, USA
My Duong
University of South Florida, Tampa, FL, USA
Ying Mao
Fordham University, New York, NY, USA
Kaixun Hua
Assistant Professor, University of South Florida
Trustworthy AI · Clustering · Global Optimization