🤖 AI Summary
Providing global soundness guarantees for abstract interpreters has long been a core challenge in abstract interpretation. This paper pioneers the use of large language models (LLMs) to automatically synthesize high-precision, formally correct abstract interpreters for neural network verification. The authors propose a unified framework that quantifies semantic unsoundness via a mathematically rigorous cost function and integrates LLM-based generation, syntactic constraint checking, formal semantic verification, and cost-guided iterative refinement. The method automatically synthesizes sound abstract transformers across diverse abstract domains and nonlinear operators, including composite ReLU and Sigmoid activations, optimizing precision while enforcing both syntactic and semantic correctness. Experiments show that the synthesized interpreters not only match but surpass the quality of handcrafted designs, and further uncover novel, high-precision, provably sound abstractions previously unreported in the literature.
📝 Abstract
Constructing abstract interpreters that provide global soundness guarantees remains a major obstacle in abstract interpretation. We investigate whether modern LLMs can reduce this burden by leveraging them to synthesize sound, non-trivial abstract interpreters across multiple abstract domains in the setting of neural network verification. We formulate synthesis as a constrained optimization problem and introduce a novel, mathematically grounded cost function that measures unsoundness under strict syntactic and semantic constraints. Based on this formulation, we develop a unified framework that combines LLM-based generation with syntactic and semantic validation and a quantitative, cost-guided feedback mechanism. Empirical results demonstrate that our framework not only matches the quality of handcrafted transformers but, more importantly, discovers sound, high-precision transformers for complex nonlinear operators that are absent from the existing literature.
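To make the core ideas concrete, here is a minimal, illustrative sketch (not the paper's actual framework or cost function) of what a candidate abstract transformer and a sampling-based unsoundness cost might look like on the simplest setting: the interval domain applied to ReLU. All names (`relu_interval`, `unsoundness_cost`) and the sampling scheme are assumptions for illustration; the paper's cost function is formally grounded rather than sampling-based.

```python
import random

def relu_interval(l, u):
    """Candidate abstract transformer for ReLU on the interval domain:
    maps an input interval [l, u] to an interval covering ReLU's outputs.
    This particular transformer is the standard sound (and exact) one."""
    return (max(0.0, l), max(0.0, u))

def unsoundness_cost(transformer, samples=1000, lo=-10.0, hi=10.0):
    """Illustrative stand-in for a cost quantifying semantic inconsistency:
    the fraction of sampled concrete executions that escape the abstract
    output. A score of 0 means no violation was found by sampling; it is
    not a proof of soundness, which the paper obtains by formal semantic
    verification instead."""
    violations = 0
    for _ in range(samples):
        l = random.uniform(lo, hi)
        u = random.uniform(l, hi)           # random input interval [l, u]
        out_l, out_u = transformer(l, u)    # abstract output interval
        x = random.uniform(l, u)            # concrete input in [l, u]
        y = max(0.0, x)                     # concrete ReLU semantics
        if not (out_l <= y <= out_u):
            violations += 1
    return violations / samples

print(unsoundness_cost(relu_interval))  # sound transformer → 0.0
```

In a cost-guided refinement loop of the kind the paper describes, a nonzero cost on an LLM-proposed transformer would be fed back as a quantitative signal for regeneration, while precision could be penalized analogously (e.g. by the width of the output interval relative to the tightest sound one).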