🤖 AI Summary
This work studies the adaptive complexity of parallel sampling from log-concave distributions—specifically, the minimal number of sequential rounds required to achieve a prescribed accuracy, assuming polynomially many queries can be executed in parallel per round. We establish, for the first time, tight lower bounds on the number of iterations under total variation distance for both unconstrained and box-constrained settings: in the unconstrained case, exponentially small error necessitates superlinear rounds; under box constraints, even superpolynomially small error cannot be achieved by any nearly linear-round algorithm. Methodologically, we introduce a novel hardness potential framework based on chain-structured random partitions and classical smoothing techniques, which uniformly captures distributional properties including log-smoothness/Lipschitzness and strong/non-strong concavity. This is the first systematic characterization of fundamental computational bottlenecks for log-concave sampling in the parallel model.
📝 Abstract
In large-data applications, such as the inference process of diffusion models, it is desirable to design sampling algorithms with a high degree of parallelization. In this work, we study the adaptive complexity of sampling, which is the minimal number of sequential rounds required to produce a sample, given that polynomially many queries can be executed in parallel in each round. For unconstrained sampling, we examine distributions that are log-smooth or log-Lipschitz and log strongly or non-strongly concave. We show that no algorithm using an almost linear number of iterations can return a sample with a specific exponentially small accuracy under total variation distance. For box-constrained sampling, we show that no algorithm using an almost linear number of iterations can return a sample with superpolynomially small accuracy under total variation distance for log-concave distributions. Our proof relies on a novel analysis that characterizes the output distribution for hardness potentials built from a chain-like structure with random partitions, combined with classical smoothing techniques.
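The two lower bounds can be rendered schematically in LaTeX as follows. Here $d$ denotes the dimension, $T$ the number of adaptive rounds, $\pi$ the target distribution, and $X_{\mathrm{out}}$ the algorithm's output; the exact exponents and thresholds are not stated in this abstract, so the asymptotic forms below are an illustrative reading rather than the paper's precise statements.

```latex
% Schematic reading of the two lower bounds (exponents are illustrative,
% not the paper's exact statements).

% Unconstrained sampling (log-smooth/Lipschitz, strongly or non-strongly
% log-concave targets): an almost linear round budget cannot reach a
% specific exponentially small total-variation accuracy:
\[
  T \le d^{\,1+o(1)}
  \;\Longrightarrow\;
  \mathrm{TV}\bigl(\operatorname{law}(X_{\mathrm{out}}),\, \pi\bigr)
  \ge e^{-\operatorname{poly}(d)} .
\]

% Box-constrained sampling (log-concave targets): the same round budget
% cannot reach any superpolynomially small accuracy, i.e. the error stays
% at least inverse-polynomial in d:
\[
  T \le d^{\,1+o(1)}
  \;\Longrightarrow\;
  \mathrm{TV}\bigl(\operatorname{law}(X_{\mathrm{out}}),\, \pi\bigr)
  \ge d^{-O(1)} .
\]
```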