QMoE: A Quantum Mixture of Experts Framework for Scalable Quantum Neural Networks

📅 2025-07-07
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
To address the limited scalability and expressive power of quantum machine learning (QML) models in the NISQ era, this paper proposes QMoE, the first Quantum Mixture-of-Experts framework. QMoE employs multiple parameterized quantum circuits as “experts” and introduces a learnable quantum routing mechanism that adaptively selects and weights experts based on the input quantum state, enabling input-driven dynamic modeling. By synergistically integrating quantum superposition, entanglement, and classical optimization, QMoE supports end-to-end training within a hybrid classical-quantum paradigm. Experiments demonstrate that QMoE significantly outperforms standard quantum neural networks on quantum classification tasks, achieving superior expressivity, generalization capability, and linear scalability with respect to the number of experts. Moreover, the modular expert structure enhances model interpretability by exposing input-dependent expert activation patterns.
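The paper does not publish code, but the mixture-of-experts pattern it describes (several parameterized circuits whose outputs are combined by a learnable, input-dependent gate) can be sketched classically. Below is a toy NumPy simulation: each "expert" is a one-qubit variational circuit (angle encoding followed by trainable RY rotations, measured as a Pauli-Z expectation), and a softmax gate mixes expert outputs. All function names and the single-qubit setup are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ry(theta):
    # Single-qubit RY rotation matrix.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expert_output(x, params):
    # Toy one-qubit "expert": angle-encode the input, apply trainable
    # rotations, and return the Pauli-Z expectation (a value in [-1, 1]).
    state = np.array([1.0, 0.0])            # |0>
    state = ry(x) @ state                   # data encoding
    for theta in params:
        state = ry(theta) @ state           # variational layer
    pauli_z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state.conj() @ pauli_z @ state)

def softmax(v):
    e = np.exp(v - np.max(v))
    return e / e.sum()

def qmoe_forward(x, expert_params, gate_weights):
    # Learnable gate: softmax over per-expert scores derived from the input,
    # then a convex combination of the expert outputs.
    gates = softmax(gate_weights * x)
    outs = np.array([expert_output(x, p) for p in expert_params])
    return float(gates @ outs)

rng = np.random.default_rng(0)
experts = [rng.uniform(-np.pi, np.pi, size=2) for _ in range(3)]
gate_w = rng.normal(size=3)
y = qmoe_forward(0.7, experts, gate_w)
```

Because the gate weights are a convex combination and each expert output lies in [-1, 1], the mixed output does too; in the paper's hybrid setting, both the circuit parameters and the gate parameters would be trained end-to-end by a classical optimizer.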

๐Ÿ“ Abstract
Quantum machine learning (QML) has emerged as a promising direction in the noisy intermediate-scale quantum (NISQ) era, offering computational and memory advantages by harnessing superposition and entanglement. However, QML models often face challenges in scalability and expressiveness due to hardware constraints. In this paper, we propose quantum mixture of experts (QMoE), a novel quantum architecture that integrates the mixture of experts (MoE) paradigm into the QML setting. QMoE comprises multiple parameterized quantum circuits serving as expert models, along with a learnable quantum routing mechanism that selects and aggregates specialized quantum experts per input. Empirical results on quantum classification tasks demonstrate that QMoE consistently outperforms standard quantum neural networks, highlighting its effectiveness in learning complex data patterns. Our work paves the way for scalable and interpretable quantum learning frameworks.
Problem

Research questions and friction points this paper is trying to address.

Addresses scalability challenges in quantum machine learning models
Enhances expressiveness of quantum neural networks via expert integration
Improves quantum classification performance with specialized routing mechanisms
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quantum mixture of experts (QMoE) architecture
Learnable quantum routing mechanism
Multiple parameterized quantum circuits
Hoang-Quan Nguyen
Department of Electrical Engineering and Computer Science, University of Arkansas, AR
Xuan-Bac Nguyen
Department of Electrical Engineering and Computer Science, University of Arkansas, AR
Sankalp Pandey
Department of Electrical Engineering and Computer Science, University of Arkansas, AR
Samee U. Khan
Department of Electrical and Computer Engineering, Kansas State University, KS
Ilya Safro
Professor of Computer Science, University of Delaware
Quantum Computing, Graph Algorithms, Optimization, Artificial Intelligence
Khoa Luu
EECS Department, University of Arkansas
Smart Health, Biometrics, Autonomous Driving, Quantum Machine Learning, Precision Agriculture