Functional Neural Wavefunction Optimization

📅 2025-07-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the slow convergence and general difficulty of optimizing neural-network wavefunctions in variational quantum Monte Carlo (VQMC). The authors propose a unified optimization framework grounded in the geometric structure of the underlying function space. The core innovation is a Galerkin projection that orthogonally projects the infinite-dimensional functional gradient flow onto the tangent space of the parameterized ansatz, yielding a geometrically self-consistent iterative algorithm. This framework unifies existing methods, including stochastic reconfiguration and Rayleigh–Gauss–Newton, and provides geometric justification for hyperparameters such as the learning rate. Numerical experiments on prototypical condensed-matter models, including the Heisenberg and Hubbard models, show that the proposed algorithm improves both the accuracy of ground-state energy estimates and the stability of the optimization, supporting the framework's theoretical rigor and practical efficacy.
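To make the projection idea concrete, a stochastic-reconfiguration update can be read as a Galerkin projection: the energy gradient is projected onto the tangent space spanned by the log-derivatives of the wavefunction, via the Gram (overlap) matrix of those tangent vectors. The sketch below shows only this linear-algebra step in plain NumPy; the function name `sr_update`, the toy interface (a sampled log-derivative matrix `O` and local energies `E_loc`), and the diagonal damping term are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sr_update(O, E_loc, lr=0.05, damping=1e-3):
    """One stochastic-reconfiguration step, viewed as a Galerkin
    projection of the energy-gradient flow onto the tangent space
    spanned by O_k = d(log psi)/d(theta_k).

    O      : (n_samples, n_params) log-derivatives at sampled configurations
    E_loc  : (n_samples,) local energies at the same configurations
    """
    n = len(E_loc)
    O_bar = O - O.mean(axis=0)          # center the tangent vectors
    E_bar = E_loc - E_loc.mean()
    # Gram (overlap) matrix of the tangent space, estimated from samples
    S = O_bar.T @ O_bar / n
    # Energy gradient projected onto the same tangent-space basis
    g = O_bar.T @ E_bar / n
    # Solve S d_theta = g; damping regularizes the ill-conditioned Gram matrix
    d_theta = np.linalg.solve(S + damping * np.eye(S.shape[0]), g)
    return -lr * d_theta                # parameter update (gradient descent sign)
```

Replacing `S` with a different metric on the tangent space recovers other members of the family the paper unifies (e.g. a Gauss–Newton-type matrix for Rayleigh–Gauss–Newton), which is the sense in which these methods differ only in how the functional flow is projected.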

📝 Abstract
We propose a framework for the design and analysis of optimization algorithms in variational quantum Monte Carlo, drawing on geometric insights into the corresponding function space. The framework translates infinite-dimensional optimization dynamics into tractable parameter-space algorithms through a Galerkin projection onto the tangent space of the variational ansatz. This perspective unifies existing methods such as stochastic reconfiguration and Rayleigh-Gauss-Newton, provides connections to classic function-space algorithms, and motivates the derivation of novel algorithms with geometrically principled hyperparameter choices. We validate our framework with numerical experiments demonstrating its practical relevance through the accurate estimation of ground-state energies for several prototypical models in condensed matter physics modeled with neural network wavefunctions.
Problem

Research questions and friction points this paper is trying to address.

Optimizing neural wavefunctions for quantum Monte Carlo
Translating infinite-dimensional dynamics to parameter-space algorithms
Estimating ground-state energies in condensed matter models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Galerkin projection for optimization dynamics
Unifies stochastic reconfiguration and Rayleigh-Gauss-Newton
Geometrically principled hyperparameter choices