🤖 AI Summary
This work investigates the convergence problem for extensions of first-order logic with real-valued semantics (specifically, logics with Lipschitz connectives and differentiable aggregate operators such as averaging) over random structures. Classical 0–1 laws apply only to discrete truth values; to address this limitation, the authors combine tools from logical semantics, probabilistic graphical models, Lipschitz analysis, and asymptotic combinatorial probability. Their method establishes the first convergence law for real-valued logics supporting differentiable aggregation and arbitrary Lipschitz composition: under a broad class of sparse and dense random graph models, the value of any such sentence converges as the domain size grows, and the limit can be any value in [0,1]. This unifies and generalizes classical discrete convergence phenomena, going beyond the binary truth-value constraint and enabling fine-grained, continuous semantic reasoning over random structures.
📝 Abstract
For many standard models of random structures, first-order sentences exhibit a convergence phenomenon on random inputs. The best-known example is random graphs with constant edge probability, where the probability of each first-order sentence converges to 0 or 1. In other cases, such as certain "sparse random graph" models, the probabilities of sentences converge, although not necessarily to 0 or 1. In this work we deal with extensions of first-order logic with aggregate operators, variants of averaging. These logics consist of real-valued terms, and we allow arbitrary Lipschitz functions to be used as "connectives". We show that some of the well-known convergence laws extend to this setting.
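To get a feel for the kind of semantics described above, here is a minimal illustrative sketch, not the paper's formalism: quantifiers are replaced by averages over the domain, a 1-Lipschitz function serves as a connective, and the resulting real-valued "sentence" stabilizes as the random graph grows. The graph model G(n, p), the connective min(u, 1 - u), and all function names below are assumptions chosen purely for illustration.

```python
import numpy as np

def random_graph(n, p, rng):
    """Adjacency matrix of an Erdos-Renyi graph G(n, p), no self-loops."""
    upper = np.triu(rng.random((n, n)) < p, k=1)
    return (upper | upper.T).astype(float)

def sentence_value(adj):
    """Real-valued 'sentence': avg_x f(avg_y E(x, y)) with 1-Lipschitz f."""
    n = adj.shape[0]
    deg_frac = adj.sum(axis=1) / (n - 1)   # inner quantifier as an average
    f = lambda u: np.minimum(u, 1.0 - u)   # 1-Lipschitz connective
    return f(deg_frac).mean()              # outer quantifier as an average

rng = np.random.default_rng(0)
for n in (50, 200, 800):
    v = sentence_value(random_graph(n, 0.5, rng))
    print(n, round(v, 3))  # values approach min(p, 1-p) = 0.5 from below
```

Note that the limit here is 0.5, neither 0 nor 1, illustrating how real-valued sentences can converge to any value in [0, 1] rather than obeying a classical 0–1 law.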