AI Summary
Critical systems exhibit scale invariance at criticality (SIC): the correlation length diverges, causing severe critical slowing down in conventional Monte Carlo methods and drastically reducing sampling efficiency. To address this, we propose the Renormalization-informed Generative Critical Sampler (RiGCS), a framework that leverages SIC as a modeling prior rather than a sampling obstacle. Inspired by the renormalization group, RiGCS builds a hierarchical generative architecture that progressively synthesizes fine-grained lattice configurations from coarse-grained ones. It combines multilevel Monte Carlo with heat-bath updates and employs conditional generative models to capture the residual long-range and higher-order interactions that site-wise-independent samplers miss. On the 128×128 two-dimensional Ising model at criticality, RiGCS achieves an effective sample size a few orders of magnitude larger than state-of-the-art generative baselines, substantially mitigating critical slowing down and enabling high-precision phase-transition analysis.
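The coarse-to-fine ancestral sampling described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: RiGCS's learned conditional generative models are replaced here by plain heat-bath sweeps, and the `heatbath_sweeps`/`ancestral_sample` helper names, lattice sizes, and inverse temperature are illustrative assumptions.

```python
import numpy as np

def heatbath_sweeps(spins, beta, rng, n=5):
    """A few checkerboard heat-bath sweeps of the 2D Ising model
    (J = 1, periodic boundaries). Stand-in for the per-level
    conditional sampler; RiGCS would use a learned conditional
    model here to capture residual interactions."""
    L = spins.shape[0]
    parity = np.add.outer(np.arange(L), np.arange(L)) % 2
    for _ in range(n):
        for color in (0, 1):
            # Local field: sum of the four nearest neighbors.
            field = sum(np.roll(spins, s, ax) for s in (1, -1) for ax in (0, 1))
            # Exact conditional: P(s = +1 | neighbors) = 1 / (1 + exp(-2*beta*field))
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * field))
            mask = parity == color
            draw = rng.random((L, L))
            spins[mask] = np.where(draw[mask] < p_up[mask], 1, -1)
    return spins

def ancestral_sample(L_coarse, L_fine, beta, rng):
    """Coarse-to-fine ancestral sampling: draw a small lattice,
    then repeatedly upsample 2x and refine, so each level is
    conditioned on the previous, coarser one."""
    spins = rng.choice([-1, 1], size=(L_coarse, L_coarse))
    spins = heatbath_sweeps(spins, beta, rng, n=50)  # equilibrate base level
    L = L_coarse
    while L < L_fine:
        # Each coarse spin seeds a 2x2 block of the finer lattice.
        spins = np.repeat(np.repeat(spins, 2, axis=0), 2, axis=1)
        spins = heatbath_sweeps(spins, beta, rng)    # level-wise refinement
        L *= 2
    return spins

rng = np.random.default_rng(1)
config = ancestral_sample(4, 32, beta=0.44, rng=rng)  # beta near criticality
```

The point of the hierarchy is that each level only needs to correct *residual* correlations on top of the coarse sample; in RiGCS that correction is learned rather than hand-coded.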
Abstract
Investigating critical phenomena and phase transitions is of central interest in physics and chemistry, yet Monte Carlo (MC) simulation -- a crucial tool for numerically analyzing the macroscopic properties of such systems -- is often hindered by the divergence of the correlation length, known as scale invariance at criticality (SIC) in renormalization group theory. SIC makes the system behave the same at every length scale, from which many existing sampling methods suffer: long-range correlations cause critical slowing down in Markov chain Monte Carlo (MCMC) and require intractably large receptive fields for generative samplers. In this paper, we propose the Renormalization-informed Generative Critical Sampler (RiGCS) -- a novel sampler specialized for near-critical systems, in which SIC is leveraged as an advantage rather than a nuisance. Specifically, RiGCS builds on multilevel Monte Carlo (MLMC) with the heat-bath (HB) algorithm, which performs ancestral sampling from low-resolution to high-resolution lattice configurations using site-wise-independent conditional HB updates. Although MLMC-HB is highly efficient under exact SIC, it suffers from a low acceptance rate when SIC is even slightly violated. Notably, SIC violation always occurs in finite-size systems and may induce long-range and higher-order interactions in the renormalized distributions, which independent HB samplers do not account for. RiGCS enhances MLMC-HB by replacing part of the conditional HB sampler with generative models that capture these residual interactions and thereby improve sampling efficiency. Our experiments show that the effective sample size of RiGCS is a few orders of magnitude higher than that of state-of-the-art generative baselines in sampling configurations of 128×128 two-dimensional Ising systems.
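As a concrete reference for the site-wise conditional HB update the abstract mentions, here is a standard checkerboard heat-bath sweep for the 2D Ising model (J = 1, periodic boundaries). This is textbook MCMC, not the RiGCS code; the lattice size, inverse temperature, and sweep count below are arbitrary choices for illustration.

```python
import numpy as np

def heatbath_sweep(spins, beta, rng):
    """One checkerboard heat-bath sweep of the 2D Ising model.
    Each site is resampled from its exact conditional distribution
    given its four neighbors, so all sites of one checkerboard color
    can be updated independently in parallel -- the site-wise
    conditional HB update referred to in the abstract."""
    L = spins.shape[0]
    parity = np.add.outer(np.arange(L), np.arange(L)) % 2
    for color in (0, 1):
        # Local field: sum of the four nearest neighbors (periodic).
        field = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0)
                 + np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
        # Exact conditional: P(s = +1 | neighbors) = 1 / (1 + exp(-2*beta*field))
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * field))
        mask = parity == color
        draw = rng.random((L, L))
        spins[mask] = np.where(draw[mask] < p_up[mask], 1, -1)
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(16, 16))
for _ in range(100):
    spins = heatbath_sweep(spins, beta=0.6, rng=rng)  # beta above critical ~0.4407
m = abs(spins.mean())  # ordered phase: magnetization magnitude grows
```

Because the conditional factorizes over sites given the neighbors, this update has no rejection step; the low acceptance rate discussed in the abstract arises only at the level transitions of MLMC-HB, where the independence assumption breaks under SIC violation.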