Sampling through Algorithmic Diffusion in non-convex Perceptron problems

📅 2025-02-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the feasibility of efficiently sampling from the solution space of non-convex perceptron models. It proposes a diffusion-based sampling framework that combines Algorithmic Stochastic Localization (ASL) with Approximate Message Passing (AMP), and uses the replica method to characterize the process in the infinite-size limit through a few order parameters, yielding explicit criteria for the feasibility of sampling. For the spherical perceptron with negative stability, approximate uniform sampling is shown to be achievable across the entire replica-symmetric region of the phase diagram; for the binary perceptron, by contrast, diffusion-based uniform sampling invariably fails because the typical set of solutions exhibits the overlap gap property. Finally, the paper takes first steps toward defining alternative measures that can be sampled efficiently.

📝 Abstract
We analyze the problem of sampling from the solution space of simple yet non-convex neural network models by employing a denoising diffusion process known as Algorithmic Stochastic Localization, where the score function is provided by Approximate Message Passing. We introduce a formalism based on the replica method to characterize the process in the infinite-size limit in terms of a few order parameters, and, in particular, we provide criteria for the feasibility of sampling. We show that, in the case of the spherical perceptron problem with negative stability, approximate uniform sampling is achievable across the entire replica symmetric region of the phase diagram. In contrast, for the binary perceptron, uniform sampling via diffusion invariably fails due to the overlap gap property exhibited by the typical set of solutions. We discuss the first steps in defining alternative measures that can be efficiently sampled.
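The diffusion process described in the abstract can be illustrated with a minimal sketch. The code below simulates Algorithmic Stochastic Localization for a toy case in which the target measure is a standard Gaussian, so the posterior-mean denoiser has the closed form `y / (1 + t)`; in the paper this role is played by AMP evaluated on the perceptron measure. All names, parameter values, and the Gaussian stand-in are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sample_via_stochastic_localization(dim, t_max=20.0, dt=0.01, rng=None):
    """Euler-Maruyama simulation of the stochastic localization SDE
        dy_t = m(y_t, t) dt + dB_t,
    where m(y, t) = E[x | y_t = y] is the posterior-mean denoiser.
    For a standard-Gaussian target, y_t = t*x + B_t gives the closed
    form m(y, t) = y / (1 + t); as t grows, m(y_t, t) concentrates on
    a single sample x drawn from the target measure."""
    rng = np.random.default_rng(rng)
    y = np.zeros(dim)
    t = 0.0
    while t < t_max:
        m = y / (1.0 + t)                      # posterior-mean denoiser
        y += m * dt + np.sqrt(dt) * rng.standard_normal(dim)
        t += dt
    return y / (1.0 + t_max)                   # final denoised estimate

x = sample_via_stochastic_localization(dim=500, rng=0)
```

In the paper's setting, the closed-form denoiser is replaced at each time `t` by an AMP fixed point for the tilted perceptron measure, which is exactly where the replica-based feasibility criteria for sampling enter.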
Problem

Research questions and friction points this paper is trying to address.

Sample non-convex neural network solution spaces
Employ Algorithmic Stochastic Localization for sampling
Characterize feasibility using replica method formalism
Innovation

Methods, ideas, or system contributions that make the work stand out.

Algorithmic Stochastic Localization
Approximate Message Passing
Replica method formalism
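For concreteness, the two solution spaces the paper contrasts can be stated as simple membership tests: a weight vector `w` solves an instance with `P` patterns `xi` and stability `kappa` if every rescaled pattern overlap clears `kappa`; the spherical model lets `w` range over the sphere of radius `sqrt(N)`, while the binary model restricts `w` to `{-1, +1}^N`. This is a textbook-style sketch of the standard definitions, not code from the paper.

```python
import numpy as np

def stabilities(w, xi):
    """Per-pattern stabilities Delta_mu = xi_mu . w / sqrt(N),
    for a P x N pattern matrix xi and a weight vector w with ||w||^2 = N."""
    n = w.shape[0]
    return xi @ w / np.sqrt(n)

def is_solution(w, xi, kappa):
    """w is a solution iff every stability is >= kappa.  Negative kappa
    (the spherical case studied in the paper) enlarges the solution space;
    the binary case additionally requires w in {-1, +1}^N."""
    return bool(np.all(stabilities(w, xi) >= kappa))

rng = np.random.default_rng(1)
N, P = 200, 100
xi = rng.standard_normal((P, N))         # i.i.d. Gaussian patterns
w = rng.choice([-1.0, 1.0], size=N)      # a binary configuration, ||w||^2 = N
```

With Gaussian patterns the stabilities of a random `w` are approximately standard normal, so a strongly negative `kappa` is satisfied with high probability while a large positive one is not.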
Davide Straziota
Department of Computing Sciences, Bocconi University, Milan, 20136, Italy
Elizaveta Demyanenko
Department of Computing Sciences, Bocconi University, Milan, 20136, Italy
Carlo Baldassi
Bocconi University; ELLIS scholar
Optimization · Statistical Mechanics · Computational Biology · Complex Systems · Machine Learning
Carlo Lucibello
Assistant Professor, Bocconi University
statistical physics · disordered systems · information theory · machine learning · deep learning