Probabilistic Approximate Optimization: A New Variational Monte Carlo Algorithm

📅 2025-07-10
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the optimization of large-scale 3D spin glasses and problems with heavy-tailed coupling distributions (e.g., the SK-Lévy model). The authors propose PAOA, a variational Monte Carlo algorithm that employs a parameterized binary stochastic neural network and performs derivative-free updates of its coupling parameters driven by independent samples. The key theoretical contribution is the first rigorous correspondence between these derivative-free parameter updates and the gradient of the full Markovian flow, which unifies simulated annealing within a variational framework and enables joint optimization over multiple temperature schedules. PAOA is co-designed for FPGA-based probabilistic computers, integrating on-chip annealing with parameterized sampling to improve hardware efficiency and convergence speed. Experiments demonstrate that PAOA outperforms QAOA on a 26-spin Sherrington-Kirkpatrick (SK) model and surpasses conventional simulated annealing on both 3D spin-glass and SK-Lévy benchmarks.

📝 Abstract
We introduce a generalized *Probabilistic Approximate Optimization Algorithm (PAOA)*, a classical variational Monte Carlo framework that extends and formalizes prior work by Weitz *et al.* [Combes_2023], enabling parameterized and fast sampling on present-day Ising machines and probabilistic computers. PAOA operates by iteratively modifying the couplings of a network of binary stochastic units, guided by cost evaluations from independent samples. We establish a direct correspondence between derivative-free updates and the gradient of the full $2^N \times 2^N$ Markov flow, showing that PAOA admits a principled variational formulation. Simulated annealing emerges as a limiting case under constrained parameterizations, and we implement this regime on an FPGA-based probabilistic computer with on-chip annealing to solve large 3D spin-glass problems. Benchmarking PAOA against QAOA on the canonical 26-spin Sherrington-Kirkpatrick model with matched parameters reveals superior performance for PAOA. We show that PAOA naturally extends simulated annealing by optimizing multiple temperature profiles, leading to improved performance over SA on heavy-tailed problems such as SK-Lévy.
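The loop described in the abstract — sample from a network of binary stochastic units, score the samples against the cost, and adjust the couplings without gradients — can be illustrated with a minimal sketch. This is not the paper's algorithm: the Gibbs-style p-bit sweeps and the simple perturb-and-compare update below are assumptions standing in for PAOA's actual sampler and derivative-free update rule, on a toy SK-like instance.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_spins(J_sample, beta, n_sweeps=50):
    """Draw one configuration via Gibbs-style p-bit sweeps (stand-in sampler)."""
    N = J_sample.shape[0]
    s = rng.choice([-1, 1], size=N)
    for _ in range(n_sweeps):
        for i in range(N):
            h = J_sample[i] @ s  # local field; diagonal of J is zero
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
            s[i] = 1 if rng.random() < p_up else -1
    return s

def energy(J_problem, s):
    """Ising cost of configuration s on the problem couplings."""
    return -0.5 * s @ J_problem @ s

# Toy SK-like problem instance (Gaussian couplings, symmetric, zero diagonal)
N = 12
J_problem = rng.normal(size=(N, N)) / np.sqrt(N)
J_problem = (J_problem + J_problem.T) / 2
np.fill_diagonal(J_problem, 0.0)

# Variational couplings initialized at the problem couplings
J_var = J_problem.copy()
beta = 1.0
best_cost = np.mean([energy(J_problem, sample_spins(J_var, beta))
                     for _ in range(8)])

for step in range(30):
    # Derivative-free update (illustrative perturb-and-compare heuristic):
    # perturb the couplings, estimate the mean sampled cost, keep if better.
    trial = J_var + 0.05 * rng.normal(size=(N, N))
    trial = (trial + trial.T) / 2
    np.fill_diagonal(trial, 0.0)
    cost = np.mean([energy(J_problem, sample_spins(trial, beta))
                    for _ in range(8)])
    if cost < best_cost:
        J_var, best_cost = trial, cost
```

Note that samples are always scored against the fixed problem couplings `J_problem`, while only the sampling network's couplings `J_var` are varied — this separation between the cost and the variational sampler is the structural point of the framework.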
Problem

Research questions and friction points this paper is trying to address.

Extends variational Monte Carlo for Ising machines
Optimizes couplings in binary stochastic networks
Improves performance over QAOA and simulated annealing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Variational Monte Carlo framework for optimization
Parameterized fast sampling on Ising machines
Optimizes multiple temperature profiles
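The last point — treating the annealing schedule itself as a variational object — can be sketched as follows. This is a simplified assumption-laden illustration: a single inverse-temperature profile parameterized by its two endpoints stands in for the paper's richer multi-profile parameterization, and the endpoints are tuned by the same kind of derivative-free perturb-and-compare step.

```python
import numpy as np

rng = np.random.default_rng(1)

def anneal(J, betas, rng):
    """One simulated-annealing run following the inverse-temperature profile `betas`."""
    N = J.shape[0]
    s = rng.choice([-1, 1], size=N)
    for beta in betas:
        for i in range(N):
            h = J[i] @ s
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
            s[i] = 1 if rng.random() < p_up else -1
    return -0.5 * s @ J @ s  # final Ising cost

# Toy SK-like instance
N = 10
J = rng.normal(size=(N, N)) / np.sqrt(N)
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)

# Schedule parameterized by its endpoints (beta_start, beta_end)
params = np.array([0.1, 2.0])

def avg_cost(p):
    """Mean final cost over restarts for a linear schedule between p[0] and p[1]."""
    betas = np.linspace(p[0], p[1], 100)
    return np.mean([anneal(J, betas, rng) for _ in range(10)])

best = avg_cost(params)
for _ in range(20):
    # Derivative-free tuning of the schedule endpoints (illustrative heuristic)
    trial = np.abs(params + 0.2 * rng.normal(size=2))
    c = avg_cost(trial)
    if c < best:
        params, best = trial, c
```

Optimizing the schedule rather than fixing it a priori is what lets this family of methods adapt to heavy-tailed coupling distributions, where a single hand-tuned cooling rate tends to perform poorly.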
Abdelrahman S. Abdelrahman
Department of Electrical and Computer Engineering, University of California, Santa Barbara
Shuvro Chowdhury
University of California, Santa Barbara
Probabilistic and Neuromorphic Computing · Quantum Computing · Machine Learning · Nanoelectronics
Flaviano Morone
Center for Quantum Phenomena, Department of Physics, New York University
Kerem Y. Camsari
Department of Electrical and Computer Engineering, University of California, Santa Barbara