Subsampling Factorization Machine Annealing

📅 2025-08-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high computational cost and limited solution-space exploration of Factorization Machine Annealing (FMA) in large-scale black-box optimization, this paper proposes Subsampling Factorization Machine Annealing (SFMA). SFMA replaces full-dataset training of the Factorization Machine with training on a subdataset sampled from the full dataset; this probabilistic training enhances exploration while retaining exploitation, a balance the authors call exploration-exploitation functionality. Numerical benchmarks show that SFMA outperforms FMA in both speed and accuracy. Performance improves further when two subsampling datasets of different sizes are used sequentially, the second substantially smaller than the first; this reduction also keeps computational cost correspondingly low for large-scale problems, indicating the potential scalability of SFMA as a resource-efficient approach to large-scale black-box optimization.

📝 Abstract
Quantum computing and machine learning are state-of-the-art technologies that have been investigated intensively in both academia and industry. Hybrids of these two ingredients are expected to be powerful tools for solving complex problems in many branches of science and engineering, such as combinatorial optimization problems, and for accelerating the creation of next-generation technologies. In this work, we develop an algorithm to solve a black-box optimization problem by improving Factorization Machine Annealing (FMA) such that the training of a machine learning model called a Factorization Machine is performed not on the full dataset but on a subdataset sampled from it: Subsampling Factorization Machine Annealing (SFMA). Owing to this probabilistic training process, the ability of FMA to explore the solution space is enhanced. As a result, SFMA exhibits a balanced performance of exploration and exploitation, which we call exploration-exploitation functionality. We conduct numerical benchmarking tests to compare the performance of SFMA with that of FMA. SFMA indeed exhibits the exploration-exploitation functionality and outperforms FMA in both speed and accuracy. In addition, the performance of SFMA can be further improved by sequentially using two subsampling datasets of different sizes, with the latter dataset substantially smaller than the former. Such a substantial reduction not only enhances the exploration performance of SFMA but also enables us to run it at correspondingly low computational cost even for a large-scale problem. These results indicate the effectiveness of SFMA in a certain class of black-box optimization problems of significant size, and its potential scalability to large-scale problems at correspondingly low computational cost.
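As a rough illustration of the procedure the abstract describes, the sketch below implements an SFMA-style loop in plain NumPy: a second-order Factorization Machine surrogate is trained by SGD on a randomly sampled subdataset (rather than the full dataset), the surrogate is minimized by simulated annealing (a classical stand-in for the quantum annealer), and the new candidate is evaluated by the black box and appended to the dataset. All function names and hyperparameters (subsampling ratio, latent dimension `k`, sweep counts) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fm_predict(x, w0, w, V):
    # 2nd-order Factorization Machine: w0 + w.x + sum_{i<j} <v_i, v_j> x_i x_j
    vx = V.T @ x                                           # shape (k,)
    inter = 0.5 * (vx @ vx - np.einsum('ik,i->', V**2, x**2))
    return w0 + w @ x + inter

def fm_sgd_pass(X, y, w0, w, V, lr=0.05):
    # One SGD pass over a (sub)dataset; gradients of the FM model above
    for xi, yi in zip(X, y):
        err = fm_predict(xi, w0, w, V) - yi
        vx = V.T @ xi
        w0 -= lr * err
        w  -= lr * err * xi
        V  -= lr * err * (np.outer(xi, vx) - V * xi[:, None]**2)
    return w0, w, V

def anneal(w0, w, V, n, sweeps=200, T0=2.0):
    # Simulated annealing on the FM surrogate (stand-in for quantum annealing)
    x = rng.integers(0, 2, n).astype(float)
    e = fm_predict(x, w0, w, V)
    for t in range(sweeps):
        T = T0 * (1 - t / sweeps) + 1e-3
        i = rng.integers(n)
        x[i] = 1 - x[i]                                    # propose a bit flip
        e2 = fm_predict(x, w0, w, V)
        if e2 < e or rng.random() < np.exp((e - e2) / T):
            e = e2                                         # accept
        else:
            x[i] = 1 - x[i]                                # reject: flip back
    return x

def sfma(black_box, n, n_init=20, iters=15, sub_ratio=0.4, k=4):
    # Initial random dataset of binary candidates and their black-box values
    X = rng.integers(0, 2, (n_init, n)).astype(float)
    y = np.array([black_box(xi) for xi in X])
    w0, w, V = 0.0, np.zeros(n), 0.01 * rng.standard_normal((n, k))
    for _ in range(iters):
        # SFMA's key step: train on a random subdataset, not the full dataset
        m = max(2, int(sub_ratio * len(X)))
        idx = rng.choice(len(X), size=m, replace=False)
        for _ in range(30):
            w0, w, V = fm_sgd_pass(X[idx], y[idx], w0, w, V)
        x_new = anneal(w0, w, V, n)                        # minimize surrogate
        X = np.vstack([X, x_new])                          # grow the dataset
        y = np.append(y, black_box(x_new))
    return X[np.argmin(y)], y.min()
```

A plain FMA loop would train on all of `X` each iteration; restricting training to a random subdataset is what injects the stochasticity that, per the abstract, improves exploration.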
Problem

Research questions and friction points this paper is trying to address.

Enhancing black-box optimization via probabilistic subsampling training
Improving exploration-exploitation balance in Factorization Machine Annealing
Reducing computational costs for large-scale optimization problems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Subsampling training for Factorization Machine Annealing
Enhanced exploration-exploitation functionality via probabilistic training
Sequential dual-subsampling for scalability and low cost
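The sequential dual-subsampling idea above — a larger subdataset in early iterations, then a substantially smaller one — amounts to a simple size schedule. A minimal standalone sketch (the switch point and ratios are illustrative assumptions, not values from the paper):

```python
def subsample_size(iteration, n_data, switch_at=10,
                   early_ratio=0.5, late_ratio=0.05):
    """Sequential dual subsampling: draw a larger subdataset early on,
    then a substantially smaller one after `switch_at` iterations.
    Ratios and switch point are illustrative, not from the paper."""
    ratio = early_ratio if iteration < switch_at else late_ratio
    return max(2, int(ratio * n_data))
```

The smaller late-stage subdataset both increases training stochasticity (more exploration) and cuts per-iteration training cost, which is what makes the approach attractive for large-scale problems.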
Yusuke Hama
Global R&D Center for Business by Quantum-AI Technology (G-QuAT), National Institute of Advanced Industrial Science and Technology (AIST), 1-1-1 Umezono, Tsukuba, Ibaraki 305-8568, Japan
Tadashi Kadowaki
Unknown affiliation