🤖 AI Summary
To address the challenge of precisely satisfying syntactic and semantic constraints during generation with large language models (LLMs), this paper proposes a controllable text generation framework based on sequential Monte Carlo (SMC). The method formalizes constraints as probabilistic conditioning and integrates domain knowledge at inference time—without fine-tuning—via constraint-aware particle resampling and dynamic reallocation of computation across partial generations. Building on the probabilistic programming framework of Lew et al. (2023), the work applies SMC systematically to LLM-based controlled generation and shows that its gains are driven by improved approximation of the posterior distribution. Evaluated on four tasks—Python code generation for data science, text-to-SQL parsing, goal inference, and molecule synthesis—the approach enables small open-source models to outperform both closed-source fine-tuned models and models over 8x larger in parameter count, with little additional inference overhead.
📝 Abstract
A wide range of LM applications require generating text that conforms to syntactic or semantic constraints. Imposing such constraints can be naturally framed as probabilistic conditioning, but exact generation from the resulting distribution -- which can differ substantially from the LM's base distribution -- is generally intractable. In this work, we develop an architecture for controlled LM generation based on sequential Monte Carlo (SMC). Our SMC framework allows us to flexibly incorporate domain- and problem-specific constraints at inference time, and efficiently reallocate computational resources in light of new information during the course of generation. By comparing to a number of alternatives and ablations on four challenging domains -- Python code generation for data science, text-to-SQL, goal inference, and molecule synthesis -- we demonstrate that, with little overhead, our approach allows small open-source language models to outperform models over 8x larger, as well as closed-source, fine-tuned ones. In support of the probabilistic perspective, we show that these performance improvements are driven by better approximation to the posterior distribution. Our system builds on the framework of Lew et al. (2023) and integrates with its language model probabilistic programming language, giving users a simple, programmable way to apply SMC to a broad variety of controlled generation problems.
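The core mechanism described in the abstract—proposing constraint-satisfying continuations, correcting for the discarded probability mass with importance weights, and resampling particles to reallocate computation toward promising partial generations—can be illustrated with a minimal SMC sketch. This is not the paper's implementation (which integrates with the probabilistic programming language of Lew et al., 2023); the toy uniform "LM", the balanced-parentheses constraint, and all function names here are illustrative assumptions.

```python
import math
import random

random.seed(0)

# Toy stand-in for an LM: uniform next-token distribution over a tiny
# vocabulary. A real system would query an actual language model here.
VOCAB = ["a", "b", "(", ")", "<eos>"]

def lm_next_token_probs(prefix):
    # Hypothetical base-model interface (illustrative only).
    return {t: 1.0 / len(VOCAB) for t in VOCAB}

def constraint_ok(prefix):
    # Example syntactic constraint: never close more parentheses than
    # were opened. Checkable on every prefix, as SMC requires.
    depth = 0
    for t in prefix:
        if t == "(":
            depth += 1
        elif t == ")":
            depth -= 1
        if depth < 0:
            return False
    return True

def smc_generate(num_particles=8, max_steps=6):
    # Each particle is a (token sequence, log importance weight) pair.
    particles = [([], 0.0) for _ in range(num_particles)]
    for _ in range(max_steps):
        proposed = []
        for seq, logw in particles:
            if seq and seq[-1] == "<eos>":
                proposed.append((seq, logw))  # finished particle
                continue
            probs = lm_next_token_probs(seq)
            # Propose only tokens that keep the constraint satisfied;
            # the kept probability mass enters the importance weight so
            # the target (conditioned) distribution is approximated.
            allowed = {t: p for t, p in probs.items()
                       if constraint_ok(seq + [t])}
            mass = sum(allowed.values())
            if mass == 0.0:
                proposed.append((seq, float("-inf")))  # dead end
                continue
            toks = list(allowed)
            tok = random.choices(
                toks, weights=[allowed[t] / mass for t in toks])[0]
            proposed.append((seq + [tok], logw + math.log(mass)))
        # Resampling step: duplicate high-weight particles and drop
        # low-weight ones, reallocating compute mid-generation.
        weights = [math.exp(lw) for _, lw in proposed]
        total = sum(weights)
        if total > 0.0:
            chosen = random.choices(proposed, weights=weights,
                                    k=num_particles)
            # Reset weights to the (log) mean after resampling.
            avg_logw = math.log(total / num_particles)
            particles = [(seq, avg_logw) for seq, _ in chosen]
        else:
            particles = proposed
    return particles
```

Every surviving particle satisfies the constraint by construction; the log-weights track how much base-model probability mass the constraint removed, which is what distinguishes this posterior-approximating scheme from naive token masking.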